Processing Machine Learning data without context is already having catastrophic, even fatal, consequences in daily life.
1 - When a TESLA driver invokes autopilot, leaves the driver's seat, and ends up in a crash, TESLA is processing Machine Learning data without context. TESLA's rule is that the driver must remain in the driver's seat while autopilot is engaged, yet the TESLA sensors are clearly being fooled by the rule-violating driver. If the sensor data were evaluated within context, the failure would be prevented.
2 - The same is true when a driver is monitored by sensors and cameras operated by fleet-owning companies. If a driver brakes hard because a kid ran into the street after his soccer ball, the driver should not be penalized, but many sensors and in-dash cameras do not capture that context.
3 - When the wrong person gets arrested because of faulty face recognition software, Machine Learning data is being processed out of context.
As an industry-leading lab with some of the most important implementations for governments and businesses worldwide, Oracle was warned about this problem in 1996, ignored it until 2007, and has no solution today: Oracle's trace files and diagnostics, which are present in every module, were found to be processing data out of context. Oracle can be implicated, sometimes criminally, in many failures of mission-critical systems globally. Oracle's database engine has been processing Machine Learning data out of context since 1996.
Vicken succeeded in detecting this problem on 400 of the toughest cases at Oracle Database Support over 13 years, with a 100% success rate, solving problems at lightning speed and demonstrating the role of proper context in diagnostics, but Oracle fired him for it. Larry Ellison stepped down in 2014 to focus on this problem and fix it; so far we are not seeing a solution. We are seeing meritless regulatory and litigation action by Oracle.