Enough Trace



Question: How many Database Server trace files does it take for Oracle Support to change a light bulb?

Answer: We need 2 more Database Server trace files to tell you.

Vicken Khachadourian

Processing Machine Learning data without context is already having catastrophic and fatal consequences in daily life. Oracle's 2.5-year investigation shows that when data gets de-coupled from its context, all kinds of failures are guaranteed, including TESLA and other self-driving fatalities.

Because Oracle and TESLA are very powerful and can influence government parties through lawyers under their control, I am not disclosing here Oracle's official admissions about this problem, nor Oracle's 27-year history of cover-up (1996 - 2023). I am only disclosing information that is publicly available. I have disclosed, and will continue to disclose, the complete results of the investigation to lawyers and government parties.



This page presents two PDF documents; scroll down:

1 - An example showing what happens to the simple action of writing an address on an envelope when data and context get de-coupled (see the sketch after this list). Ordinary people do not have the prerequisites to understand what happens in the internals of TESLA's driving software or the internals of Oracle's code. This example bypasses those obstacles.


2 - In 1995, two Carnegie Mellon University computer scientists drove across the country, and the car drove itself 99% of the time. I explain why the self-driving car industry is still stuck at 99%. After investing $10 billion each, both GM and Apple Computer gave up too.
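
To make the envelope example concrete before you open the PDF, here is a minimal sketch, my own illustration with made-up names and values, of what happens to the same address once with its context (the field labels) attached, and once de-coupled from it:

    # Illustrative sketch only; the names and values are hypothetical.
    # With context: every value carries the label that says what it is.
    labeled_address = {
        "name":   "J. Smith",
        "street": "10 Main St",
        "city":   "Springfield",
        "zip":    "62701",
    }

    # De-coupled from context: the same values as a bare list whose
    # ordering convention the consumer can only guess at.
    bare_values = ["62701", "J. Smith", "Springfield", "10 Main St"]

    def print_envelope(addr: dict) -> str:
        # Context present: each field lands on the right line.
        return f'{addr["name"]}\n{addr["street"]}\n{addr["city"]}, {addr["zip"]}'

    def print_envelope_guessing(values: list) -> str:
        # Context absent: the printer assumes an order. Any upstream
        # shuffle silently yields a well-formed but undeliverable envelope.
        return "\n".join(values)

    print(print_envelope(labeled_address))       # correct envelope
    print(print_envelope_guessing(bare_values))  # wrong, yet "looks valid"

The failure is silent: both outputs are well-formed text, so nothing downstream flags the wrong one.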



If you follow the envelope example, and you understand the analogy by Dr. Donald Kennedy, the former President of Stanford University, on my website (bottom part of the document below), then imagine what happens in the inner workings of modern Artificial Intelligence based software, where massive amounts of data get collected to make sense of things. Ask yourself: how many pixels of data is a TESLA collecting per second to visualize a STOP sign? What happens when any part of that data is stripped of its context-related constraint?
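
As a rough back-of-the-envelope answer to the pixel question (the resolution, frame rate, and camera count below are my assumptions, not TESLA's published figures):

    # Hypothetical numbers: 1280 x 960 cameras at 36 frames per second,
    # 8 exterior cameras. Actual TESLA hardware may differ.
    width, height = 1280, 960
    fps = 36
    cameras = 8

    pixels_per_second = width * height * fps * cameras
    print(f"{pixels_per_second:,} pixels per second")  # 353,894,400

Hundreds of millions of pixel values per second; if even a small slice of that stream loses its context-related constraint (which camera, which frame, which region of the frame), the decision layer is reasoning over data it can no longer interpret.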

Maybe now you know why self-driving cars in San Francisco suddenly got on sidewalks, or why TESLA cars kept hitting parked emergency vehicles with their lights flashing. Elon Musk is on YouTube making fun of this problem and saying that it is easy to fix. Oracle has not been able to fix this problem since 1996, and Oracle fired the mathematician who had a 100% success rate with it.

1 - A TESLA in normal mode, on Autopilot, or on Full Self Driving is collecting a massive amount of data from diverse sources. During the October 2024 quarterly earnings call, Elon said that it is up to 9 cameras if you include the in-cabin camera, and he labeled this as "Collecting" context. The car is also processing massive amounts of data between its functions internally. After collecting all that data, every second it makes a decision on which way to continue its drive. As in Dr. Kennedy's analogy on my website: if it mislabels the baseball game, so that the main attraction becomes the 10,000 hotdogs that got sold, or the 40,000 napkins that accompanied the hotdogs, then the TESLA will hit a parked Sheriff's car, a parked fire truck, a tree, or a parked car, all the while thinking it has got things right.

This is not an unconfirmed theory. Oracle conducted a 2.5-year investigation, proving conclusively that processing massive amounts of machine learning data in a short time, from diverse sources, without paying attention to context, will result in disaster.
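
Here is a minimal sketch of the mislabeling pattern in Dr. Kennedy's analogy as I read it (the counts are invented for illustration): a context-free rule that asks "what did I observe most?" promotes the by-products over the event itself.

    # Observations from one baseball game, stripped of the context
    # "these are all artifacts of a single game". Counts are invented.
    observations = {
        "napkins sold":   40_000,
        "hotdogs sold":   10_000,
        "baseball games":      1,
    }

    # Context-free labeling: whatever occurs most often wins.
    main_attraction = max(observations, key=observations.get)
    print(main_attraction)  # "napkins sold" -- the game itself loses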

2 - The same is true when a driver is monitored by sensors and cameras operated by fleet-owning companies. If a driver brakes hard because a kid was running after his soccer ball, he should not be penalized, but many sensors and in-dash cameras do not capture that context.
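
A sketch of the same gap in fleet telemetry (the record layout and field names are hypothetical): the hard-braking event is captured, the reason for it is not, so a context-free scoring rule penalizes a driver who did exactly the right thing.

    # Hypothetical telemetry record from an in-dash sensor. Note there is
    # no field for *why* the driver braked -- that context is never captured.
    event = {
        "driver_id": "D-1042",
        "event":     "hard_braking",
        "decel_g":   0.62,
    }

    def score_driver(event: dict) -> int:
        # Context-free rule: every hard-braking event is a demerit,
        # whether the driver was reckless or avoiding a child in the road.
        return -10 if event["event"] == "hard_braking" else 0

    print(score_driver(event))  # -10 even for the safe, correct action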



As an industry-leading lab with some of the most important implementations for governments and businesses worldwide, Oracle was warned about this problem in 1996, ignored it until 2007, and does not have a solution today, because the trace files and diagnostics in every Oracle module were processing data out of context. Oracle can be implicated, sometimes criminally, in many of the failures of mission-critical systems globally. Oracle's database engine has been processing Machine Learning data out of context since 1996.


Vicken succeeded in detecting this problem in 400 of the toughest cases at Oracle Database Support over 13 years, with a 100% success rate, solving problems at lightning speed and demonstrating the role of proper context for diagnostics; then Oracle fired him for it. Larry Ellison stepped down as CEO in 2014 to focus on this problem and fix it. So far we are not seeing a solution. We are seeing meritless regulatory and litigation action by Oracle.

Self-driving car challenges and my Oracle story (MS Word)