[1] Fundamentals of Data Engineering:
https://www.oreilly.com/library/view/fundamentals-of-data/97...
[2] Fundamentals of Data Engineering Review:
https://maninekkalapudi.medium.com/fundamentals-of-data-engi...
Apparently, the Fundamentals of Data Engineering book does use the cargo-cult metaphor in its content [1].
Hopefully the authors can update the book soon to reflect the latest information and expand it with another entire chapter on data management, as they did for data architecture.
Fundamentals of Data Engineering (https://www.oreilly.com/library/view/fundamentals-of-data/97...) is adamant about this. If you're a new data engineering team, just buy tools that download the data you need, because fetching data from APIs is (usually) pretty simple to do with automated tooling. Then your team can focus on the parts we haven't nailed, like making sane models.
1) parse the ASN.1 files, 2) generate the equivalent declarations in a programming language (such as C or C++), 3) generate the encoding and decoding functions based on those declarations
All of these exercises are apparently part of the data engineering process, or lifecycle [1].
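To make the three steps concrete, here is a hypothetical sketch of what steps 2 and 3 might emit for a toy module `Point ::= SEQUENCE { x INTEGER, y INTEGER }`. The type name, function names, and the simplified fixed-width TLV wire format are illustrative assumptions, not the output of any real ASN.1 compiler (real DER uses variable-length integer encodings):

```c
/* Hypothetical compiler output for:  Point ::= SEQUENCE { x INTEGER, y INTEGER } */
#include <stddef.h>
#include <stdint.h>

/* Step 2: the equivalent C declaration generated from the ASN.1 type. */
typedef struct {
    int32_t x;
    int32_t y;
} Point;

/* Step 3a: generated encoder -- writes each field as [tag][len][4-byte big-endian value].
 * Returns the number of bytes written (12 for this type). */
static size_t point_encode(const Point *p, uint8_t *buf) {
    const int32_t fields[2] = { p->x, p->y };
    size_t off = 0;
    for (int i = 0; i < 2; i++) {
        buf[off++] = 0x02;            /* INTEGER tag */
        buf[off++] = 4;               /* fixed 4-byte length, a simplification */
        uint32_t v = (uint32_t)fields[i];
        buf[off++] = (uint8_t)(v >> 24);
        buf[off++] = (uint8_t)(v >> 16);
        buf[off++] = (uint8_t)(v >> 8);
        buf[off++] = (uint8_t)v;
    }
    return off;
}

/* Step 3b: generated decoder -- the exact inverse of point_encode.
 * Returns 0 on success, -1 on malformed input. */
static int point_decode(const uint8_t *buf, size_t len, Point *out) {
    if (len < 12) return -1;
    int32_t fields[2];
    size_t off = 0;
    for (int i = 0; i < 2; i++) {
        if (buf[off++] != 0x02 || buf[off++] != 4) return -1;
        uint32_t v = ((uint32_t)buf[off] << 24) | ((uint32_t)buf[off + 1] << 16)
                   | ((uint32_t)buf[off + 2] << 8) | (uint32_t)buf[off + 3];
        off += 4;
        fields[i] = (int32_t)v;
    }
    out->x = fields[0];
    out->y = fields[1];
    return 0;
}
```

In a real toolchain (asn1c, for instance) the declarations and codec functions are emitted per type from the parsed schema, so the hand-written part of the pipeline shrinks to the schema itself plus the glue that moves the bytes.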
Back in the early 21st century, Python was just another interpreted general-purpose programming language: not for the web (PHP), not for command-line tools (Tcl), not for systems (C/C++), not for data wrangling (Perl), not for numerical computing (Matlab/Fortran), not for statistics (R).
D will probably follow a trajectory similar to Python's, but it really needs a special kind of killer application to bring it to the fore.
I'm envisioning that real-time data streaming, processing, and engineering could be D's killer utility and the defining moment when D becomes the language for data.