In an op-ed for the New York Times, ProPublica's Julia Angwin writes that algorithms are not only "ubiquitous in our lives" but "are also being employed to inform fundamental decisions about our lives" — an ominous trend, she suggests. Angwin writes: "Companies use them to sort through stacks of résumés from job seekers. Credit agencies use them to determine our credit scores. And the criminal justice system is increasingly using algorithms to predict a defendant's future criminality. Those computer-generated criminal 'risk scores' were at the center of a recent Wisconsin Supreme Court decision that set the first significant limits on the use of risk algorithms in sentencing."

The court ruled that while judges could use the risk scores, they could not be a "determinative" factor in whether a defendant was jailed or placed on probation. "And, most important, the court stipulated that a presentence report submitted to the judge must include a warning about the limits of the algorithm's accuracy." Angwin argues that the warning requirement "is an important milestone in the debate over how our data-driven society should hold decision-making software accountable. But advocates for big data due process argue that much more must be done to assure the appropriateness and accuracy of algorithm results."