Processing machine learning data without context is already having catastrophic, even fatal, consequences in daily life.
Because Oracle and TESLA are very powerful, can influence government parties, and have lawyers at their disposal, I am not disclosing Oracle's official admissions about this problem, nor Oracle's 27-year history of cover-up (1996 - 2023). I am disclosing only information that is publicly available. I have disclosed, and will disclose, more to lawyers and government parties.
People who lack knowledge about the internal workings of Oracle will have a hard time seeing how out-of-context data causes catastrophic damage in Oracle-related failures. With the envelope example, in the PDF file below on this page, anyone who is familiar with the simple task of writing an address on an envelope can see what happens to the envelope when the data is 100% accurate but the requirement for context is not satisfied.
If you follow the envelope example, and you understand the analogy from Dr. Donald Kennedy, former president of Stanford University, on my website (bottom part of the document below), then imagine what happens in the inner workings of modern artificial-intelligence-based software, where massive amounts of data are collected to make sense of things. Ask yourself: how many pixels of data is a TESLA collecting per second to visualize a STOP sign? What happens when any part of that data is stripped of its context-related constraint?
Maybe now you know why self-driving cars in San Francisco suddenly got onto sidewalks, or why TESLA cars kept hitting parked emergency vehicles with their lights flashing. Elon Musk is on YouTube making fun of this problem and saying that it is easy to fix. Oracle has not been able to fix this problem since 1996, and Oracle fired the mathematician who had a 100% success rate with it.
1 - A TESLA running Autopilot or Full Self-Driving collects a massive amount of data from multiple sensors and diverse sources. It then evaluates that data and decides which way to continue its drive. As in Dr. Kennedy's analogy on my website, if it mislabels the baseball game as being about the 10,000 hotdogs that got sold, or the 40,000 napkins that accompanied the hotdogs, then the TESLA will hit a parked sheriff's car or a parked fire truck, all the while thinking it got things right.
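The mislabeling in Dr. Kennedy's analogy can be sketched in a few lines of code. This is a purely illustrative toy (not Oracle's or TESLA's actual software): a naive labeler that picks the most frequent item in a scene as the "main attraction", versus one that keeps the contextual constraint of what the event actually is.

```python
# Toy illustration of context-blind labeling (hypothetical, not real vendor code).
from collections import Counter

def label_without_context(observations):
    """Label the scene by raw item frequency alone -- context is discarded."""
    return Counter(observations).most_common(1)[0][0]

def label_with_context(observations, context):
    """Keep the contextual constraint: the known event type outranks raw counts."""
    return context.get("event_type", label_without_context(observations))

# Dr. Kennedy's analogy: one baseball game, 10,000 hotdogs, 40,000 napkins.
scene = ["baseball game"] + ["hotdog"] * 10_000 + ["napkin"] * 40_000

print(label_without_context(scene))                                # napkin
print(label_with_context(scene, {"event_type": "baseball game"}))  # baseball game
```

Without the context constraint, the napkins win on sheer volume and the scene is labeled "napkin"; the data is 100% accurate, yet the conclusion is wrong, which is the failure mode the analogy describes.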
This is not an unconfirmed theory. Oracle conducted a 2.5-year investigation, proving conclusively that processing massive amounts of machine learning data in a short time, from diverse sources, without paying attention to context will result in disaster.
2 - The same is true when a driver is being monitored by the sensors and cameras of fleet-owning companies. If a driver brakes hard because a kid was running after his soccer ball, he should not be penalized, but many sensors and in-dash cameras do not capture that context.
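The fleet-monitoring point can be sketched the same way. This is a hypothetical scoring function with made-up field names (no real telematics vendor's API): a context-blind score penalizes every hard-braking event, while a context-aware score excuses events whose recorded cause is an emergency such as a child chasing a ball.

```python
# Hypothetical fleet safety scoring (illustrative field names, not a real API).
EXCUSABLE_CAUSES = {"pedestrian_in_road", "child_chasing_ball", "animal_crossing"}

def penalty_without_context(events):
    """Context-blind scoring: every hard brake counts against the driver."""
    return sum(1 for e in events if e["type"] == "hard_brake")

def penalty_with_context(events):
    """Context-aware scoring: hard brakes with an excusable cause are skipped."""
    return sum(
        1
        for e in events
        if e["type"] == "hard_brake" and e.get("cause") not in EXCUSABLE_CAUSES
    )

trip = [
    {"type": "hard_brake", "cause": "child_chasing_ball"},  # emergency stop
    {"type": "hard_brake"},                                 # no context recorded
]

print(penalty_without_context(trip))  # 2 -- driver penalized for both events
print(penalty_with_context(trip))     # 1 -- the emergency brake is excused
```

The difference between the two scores is exactly the context the sensors failed to capture: when the cause field is missing, even the context-aware scorer has no choice but to penalize the driver.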
As an industry-leading lab with some of the most important implementations for governments and businesses worldwide, Oracle was warned about this problem in 1996, ignored it until 2007, and does not have a solution today, because the trace files and diagnostics in every Oracle module were processing data out of context. Oracle can be implicated, sometimes criminally, in many failures of mission-critical systems globally. Oracle's database engine has been processing machine learning data out of context since 1996.
Vicken succeeded in detecting this problem in 400 of the toughest cases at Oracle Database Support over 13 years, with a 100% success rate, solving problems at lightning speed and demonstrating the role of proper context in diagnostics; then Oracle fired him for it. Larry Ellison stepped down in 2014 to focus on this problem and fix it. So far we are not seeing a solution. We are seeing meritless regulatory and litigation action by Oracle.
Self-driving car challenges and my Oracle story (MS Word)