The Ford GT is an intelligent supercar: a powerful vehicle packed with sensors and corresponding algorithms that process heaps of data every second to keep it running. According to Ford, the car carries over 50 individual sensors that collect data from the vehicle and its surroundings to adjust and refine every single element of its behaviour. Combined, this amounts to a staggering 100GB of data every hour for the Ford GT's onboard computers to process.
The algorithms comprise a staggering 10 million lines of code, which process 300MB of information each second across more than a dozen onboard computing systems. These gauge everything from terrain patterns, humidity, wheel speeds, driving style, proximity and obstacle approach speeds, and feed all of that information into processing to produce an outcome in the form of implementable actions. All of this puts the future of automobiles into perspective: exactly how much data the cars of the future will need to process, and how competent and powerful future in-car computing systems will need to be.
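To make that sense-process-act loop concrete, here is a minimal sketch of how sensor readings might be mapped to actions. All sensor names, thresholds and actions below are invented for illustration; Ford's actual algorithms are proprietary and far more complex.

```python
# Hypothetical sketch of a sense-process-act cycle, loosely modelled on the
# pipeline described above. Sensor names, thresholds and actions are invented
# for illustration only.

def decide_actions(readings):
    """Map a snapshot of sensor readings to a list of implementable actions."""
    actions = []
    # React to a fast-approaching obstacle before anything else.
    if readings["obstacle_approach_mps"] > 20.0:
        actions.append("pre-charge brakes")
    # Wet conditions plus wheel slip suggest softening the throttle map.
    slip = max(readings["wheel_speeds_mps"]) - min(readings["wheel_speeds_mps"])
    if readings["humidity_pct"] > 80.0 and slip > 1.5:
        actions.append("soften throttle map")
    # Rough terrain: stiffen the adaptive suspension.
    if readings["terrain_roughness"] > 0.7:
        actions.append("stiffen suspension")
    return actions

snapshot = {
    "obstacle_approach_mps": 25.0,
    "wheel_speeds_mps": [31.2, 31.4, 29.5, 29.6],
    "humidity_pct": 85.0,
    "terrain_roughness": 0.3,
}
print(decide_actions(snapshot))  # → ['pre-charge brakes', 'soften throttle map']
```

In a real car this decision step would run continuously, dozens of times per second, over data fused from all of the sensors at once rather than a single snapshot.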
However, the scale of this data is only slated to increase. Ten million lines of code running an entire system is not unheard of; that is roughly the scale required to run Google's Android OS. At present, counting all of Google's online businesses, the company's entire set of operations requires a staggering two billion lines of code. These numbers are bound to rise as connected and autonomous services in cars expand, which calls for higher data bandwidths, expansive cloud computing options, and powerful, compact supercomputers to process all of that data. Machines, it seems, won't have it easy at all.