Question: How many Database Server trace files does
it take for Oracle Support to change a light bulb?
Answer: We need 2 more Database Server trace files to tell you.
Processing Machine Learning Data without context is already having catastrophic and fatal consequences in daily life.
1 - When a TESLA driver engages autopilot, leaves the driver's seat, and ends up in a crash, TESLA is processing Machine
Learning data without context. TESLA's rule is that the driver must be in the driver's seat while autopilot is on, but
obviously the TESLA sensors are being fooled by the rule-violating driver. If the sensor data were evaluated within
context, the failure would be prevented.
2 - The same is true when a driver is monitored by sensors and cameras installed by fleet-owning companies. If a driver
brakes hard because a kid ran into the street after his soccer ball, the driver should not be penalized, but
many sensors and in-dash cameras do not capture that context.
3 - When the wrong person gets arrested because of faulty face recognition software, Machine Learning data is being
processed out of context.
4 - My simple 2021 model car, if driven slowly in a parking lot, will immediately and automatically apply
the brakes if it detects a person standing in front of the car. I assume
it does so to make sure I do not run over an innocent person. Well enough. But what if the person in front of the car
is a carjacker pointing a gun at me? Shouldn't I be able to run him over to protect my life?
The car's decision to stop is another very simple example of Machine
Learning or AI-driven processing of data without context. Over decades, we have determined that a person standing
in front of a car is most likely innocent and should not be hit.
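To make the point concrete, here is a minimal sketch of the hard-braking example above. Everything in it is a hypothetical assumption for illustration: the event fields, thresholds, and rule names are mine, not any vendor's actual logic. The idea is simply that the same raw sensor reading yields opposite conclusions depending on whether context is available.

```python
# Hypothetical sketch: the fields, threshold, and labels below are
# illustrative assumptions, not any real fleet-monitoring system's logic.
from dataclasses import dataclass

@dataclass
class BrakingEvent:
    deceleration_g: float          # measured deceleration, in g
    person_or_object_in_path: bool # context a richer system might supply

def evaluate_braking(event: BrakingEvent) -> str:
    """Classify a braking event, with and without context."""
    if event.deceleration_g < 0.5:
        return "normal"
    # Context-free rule: any hard brake counts against the driver.
    if not event.person_or_object_in_path:
        return "penalize-driver"
    # With context: hard braking to avoid a person or object is justified.
    return "justified-evasive-braking"

# Same 0.7 g reading, opposite outcomes depending on context:
print(evaluate_braking(BrakingEvent(0.7, True)))   # justified-evasive-braking
print(evaluate_braking(BrakingEvent(0.7, False)))  # penalize-driver
```

The raw signal (0.7 g of deceleration) is identical in both calls; only the contextual flag changes the verdict, which is the whole argument of the examples above.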
As an industry-leading lab with the most important implementations for worldwide government and business, Oracle
was warned about this problem in 1996, ignored it until 2007, and has no solution
today, because Oracle's trace files and diagnostics, which
are in every module, were found to be processing data out of context. Oracle can be implicated, sometimes criminally,
in many of the failures of many mission-critical systems, globally. Oracle's Database Engine has been processing Machine
Learning data out of context since 1996.
Vicken succeeded in detecting this problem on 400 of the toughest cases at Oracle
Database Support over 13 years, with 100% success, solving problems at lightning speed and demonstrating the role of proper
context for diags, but Oracle fired him for it. Larry Ellison stepped down in 2014 to focus on this problem and fix
it; so far we are not seeing a solution. We're seeing meritless regulatory and litigation action by Oracle.