What are the differences between emergent behavior models like Watson and the statistics used in the proposed judicial tool?

The proposed measurement tool is just that: a measurement. In Lean Six Sigma terms, a statistics program would fulfill only the "M" in the DMAIC process of define, measure, analyze, improve, and control. Without proper measurements, though, it is difficult to build valuable analytic tools, as the sticky-note anecdote made apparent. I am wondering whether emergent behavior programs like Watson would be able to generate more accurate predictive statements. While a healthy skepticism of the capabilities of computers is warranted, there is an increasingly long list of tasks at which they exceed human capabilities. Tools such as these can help tremendously when we are called upon to make informed decisions, whether for questions on Jeopardy, in hospitals, or in the courtroom.
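
To make the measurement-versus-prediction distinction concrete, here is a minimal sketch, assuming a toy dataset and hypothetical case features (nothing here reflects Watson's actual methods): descriptive statistics supply the "M" in DMAIC, while a fitted model issues predictive statements.

```python
# Minimal sketch: measurement (descriptive statistics) vs. prediction
# (a fitted model). Features and data are hypothetical placeholders,
# not any real judicial dataset or Watson's actual approach.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Toy history: columns are [prior_record, evidence_strength], both 0-1.
X = rng.random((200, 2))
y = (0.2 * X[:, 0] + 0.7 * X[:, 1] + rng.normal(0, 0.1, 200) > 0.5).astype(int)

# "Measure" (the M in DMAIC): a descriptive statistic over past outcomes.
print(f"historical conviction rate: {y.mean():.2%}")

# "Predict": a model trained on those measurements scores a new case.
model = LogisticRegression().fit(X, y)
new_case = [[0.3, 0.8]]  # hypothetical new case
print(f"predicted conviction probability: {model.predict_proba(new_case)[0, 1]:.2%}")
```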

  • Feb 15 2014: Not sure how to answer the question, but one issue is the loss of analytical ability in those who become dependent on Watson.

    Another issue is the inability to analyze or reason about human issues (morality, equality, etc.). When these are influential factors, Watson could only look to similar past decisions, which would likely have different conditions and contexts. Until Watson can somehow statistically emulate juries, discussion groups, or whole population segments, the weight of human-related factors cannot be fairly judged. A smart lawyer will always be able to raise "reasonable doubt" about whether Watson or its offspring weighed the merits of a case fairly.
    As for improving predictions, my guess would be to somehow use similar historical decisions to predict the outcome; a sketch of that idea follows below. But as with the gap between sophisticated odds-making or market-strategy algorithms and the reality of a good snowstorm influencing a football game, or world events influencing the market, the ACTUAL reality will generally be very different from a simulated, predicted reality in complex situations involving humans and nature.
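
    A minimal sketch of that "similar historical decisions" approach, assuming hypothetical case features (this is a generic nearest-neighbor method, not anything Watson actually does): the model votes using the most similar past cases, and any influential factor left out of the features, like the snowstorm, simply never enters the prediction.

    ```python
    # Sketch: predict a new case's outcome from the most similar past
    # decisions (k-nearest neighbors). Features are hypothetical; this is
    # not a real judicial dataset or Watson's actual method.
    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier

    rng = np.random.default_rng(1)

    # Toy history: columns are [severity, prior_record], both 0-1.
    past_cases = rng.random((100, 2))
    outcomes = (past_cases.sum(axis=1) > 1.0).astype(int)

    model = KNeighborsClassifier(n_neighbors=5).fit(past_cases, outcomes)

    # The model sees only the two encoded features. An unmodeled factor
    # (the "snowstorm") is invisible to it by construction.
    new_case = np.array([[0.6, 0.4]])
    print("predicted outcome:", model.predict(new_case)[0])
    print("neighbor vote share:", model.predict_proba(new_case)[0])
    ```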
  • Feb 15 2014: The problem I see with imperfect systems such as Watson is that they lack the ability to assess the subjective nuance of the cumulative nature of human expression. In other words, at present the strength of computation is the rapid comparison of large bodies of data based on biased algorithms.

    Does the rapid processing increase the quality of justice, or does it simply provide an economically efficient substrate for a society that is immersed in the other? The other being the one who isn't you. The one who impedes you. Until it is you.

    Efficiency isn't efficacy.
  • Feb 17 2014: So Watson stays above the fray and can't admissibly cry foul when a crime scene is inconsistent with the crime, or when the prosecution builds its case on weak circumstantial evidence. It could only give us probabilities: that out of x number of persons, some number probably did xyz. Clever. What if Watson could be used only by the defense and not the prosecution? Would the prosecution lose more cases? The same goes for medicine: using a legal-ethics measurement of probabilities, would it measure only a hospital's liability, or patient vulnerability? A worked sketch of that probability-only kind of output follows below.
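
    A minimal worked sketch of the "out of x persons, x probably did xyz" kind of output, using Bayes' rule with entirely hypothetical numbers: even a strong evidence match comes back as a probability, never a verdict.

    ```python
    # Sketch: base-rate calculation via Bayes' rule. Every number here is
    # hypothetical, chosen only to show the shape of the output.
    population = 100_000          # persons who could have done xyz
    guilty = 1                    # actual perpetrators
    p_match_if_guilty = 0.99      # chance the evidence matches a guilty person
    p_match_if_innocent = 0.001   # false-match rate among the innocent

    expected_matches = (guilty * p_match_if_guilty
                        + (population - guilty) * p_match_if_innocent)
    p_guilty_given_match = guilty * p_match_if_guilty / expected_matches

    # ~0.01: a probability statement, not a determination of guilt.
    print(f"P(guilty | evidence match) = {p_guilty_given_match:.3f}")
    ```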