APHSA Symposium 2019

She described multiple examples of how Michigan is using its Enterprise Data Warehouse – which includes health and non-health data linked through a master client index – to help providers and beneficiaries access integrated information that can improve health outcomes (a simplified linkage sketch appears after the project list below). Projects include:

• CareConnect360, a statewide care management web portal providing a comprehensive view of individuals’ and populations’ participation in programs, demographic information, health conditions, and services received.

• MI Bridges, an integrated service delivery portal that enables individuals to apply for benefits, explore and connect with resources based on the information they enter, and manage their own cases.

• Predictive analytics to improve targeting of outreach and services.

(Please see slides in Appendix C-6.)
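To make the master client index idea concrete, the sketch below shows one minimal way records held by separate program systems could be linked into an integrated client view. The table names, fields, identifiers, and linkage logic are illustrative assumptions, not the actual design of Michigan’s Enterprise Data Warehouse.

```python
# Minimal sketch: linking health and non-health records through a master client index.
# All identifiers, fields, and sample values below are hypothetical.

# Master client index: maps (source system, local ID) -> master client ID.
master_index = {
    ("medicaid", "M-1001"): "MCI-001",
    ("snap", "S-2002"): "MCI-001",   # same person known to two programs
    ("medicaid", "M-1002"): "MCI-002",
}

# Records as they exist in separate operational systems.
medicaid_records = [
    {"local_id": "M-1001", "condition": "asthma"},
    {"local_id": "M-1002", "condition": "diabetes"},
]
snap_records = [
    {"local_id": "S-2002", "benefit_status": "active"},
]

def integrated_view():
    """Build one combined record per master client ID from all source systems."""
    view = {}
    for source, records in (("medicaid", medicaid_records), ("snap", snap_records)):
        for rec in records:
            mci = master_index.get((source, rec["local_id"]))
            if mci is None:
                continue  # unmatched record; a real system would queue it for linkage review
            view.setdefault(mci, {}).update(
                {k: v for k, v in rec.items() if k != "local_id"}
            )
    return view

if __name__ == "__main__":
    for mci, record in integrated_view().items():
        print(mci, record)
```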

Don Johnson described how artificial intelligence (AI) can support more proactive, predictive solutions that improve customer-centric care. AI dramatically increases the amount of information that can be collected and analyzed from multiple sources (e.g., legacy systems, distributed systems, cloud-based systems), enabling employees to focus more attention on tailoring services to client needs. “Machine learning” (which uses structured/labeled data) and “deep learning” (which can work with vast amounts of unstructured data) are subsets of AI that use new data to continually improve how a system classifies information and performs tasks. In the future, the same AI techniques that Amazon now uses to predict what customers will want to purchase could be used in human services to reduce or eliminate application forms, intervene before a situation becomes critical, or identify preventive actions that address social determinants of health. AI can make predictions that help case workers know how to engage clients and what questions to ask; it is a tool to support case worker decision making.
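
As an illustration of the “machine learning on structured, labeled data” idea, the sketch below trains a simple classifier to flag cases that may warrant proactive outreach. The features, labels, and library choice (scikit-learn) are assumptions made for illustration only and do not reflect any state’s actual model.

```python
# Hypothetical sketch: supervised learning on structured, labeled case data.
# Feature names and values are invented; a real model would be trained on
# integrated data and validated before informing any case worker decision.
from sklearn.linear_model import LogisticRegression

# Structured features per case:
# [missed_appointments, emergency_visits_last_year, open_programs]
X = [
    [0, 0, 1],
    [3, 2, 1],
    [1, 0, 2],
    [4, 3, 3],
    [0, 1, 1],
    [5, 2, 2],
]
# Labels: 1 = case later required urgent intervention, 0 = it did not.
y = [0, 1, 0, 1, 0, 1]

model = LogisticRegression().fit(X, y)

# Score a new case; the probability is a prompt for case worker follow-up,
# not an automated decision.
new_case = [[2, 1, 2]]
print("Outreach priority score:", model.predict_proba(new_case)[0][1])
```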

Some key principles for building capacity to use AI include:

• Do NOT build AI into legacy systems.

• Externalize your data (i.e., store it in an external repository outside your operational systems) – in real time.

• Make your AI system-agnostic so that it can be used with multiple systems, including new systems that replace legacy systems (see the sketch after this list).

• Use integrated data to train AI. (Don’t use AI simply to automate existing rules and process steps, since AI can inform improvements to them.)

(Please see slides in Appendix C-6.)
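
The following sketch is one illustrative reading of the externalize-your-data and system-agnostic principles: predictions are served by a standalone component that reads features from an external data repository and exposes a generic interface, so it is not wired into any particular legacy or replacement case-management system. All class and field names are hypothetical.

```python
# Hypothetical sketch: a system-agnostic scoring component.
# It depends only on an externalized feature store and a generic client ID,
# never on the internals of any specific operational system.
from typing import Protocol


class FeatureStore(Protocol):
    """Any external repository that can return features for a client."""
    def features_for(self, client_id: str) -> list[float]: ...


class InMemoryFeatureStore:
    """Stand-in for an external, continuously updated data repository."""
    def __init__(self, data: dict[str, list[float]]):
        self._data = data

    def features_for(self, client_id: str) -> list[float]:
        return self._data[client_id]


class OutreachScorer:
    """Scores clients without knowing which case-management system asked."""
    def __init__(self, store: FeatureStore, weights: list[float]):
        self.store = store
        self.weights = weights

    def score(self, client_id: str) -> float:
        features = self.store.features_for(client_id)
        return sum(w * x for w, x in zip(self.weights, features))


if __name__ == "__main__":
    # Any system (legacy or new) can call the same interface with a client ID.
    store = InMemoryFeatureStore({"MCI-001": [2.0, 1.0, 3.0]})
    scorer = OutreachScorer(store, weights=[0.5, 0.8, 0.1])
    print("Score for MCI-001:", scorer.score("MCI-001"))
```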

Presentation: Getting “Big Data” to the Good Guys. Chris Kingsley, Senior Associate for Data Initiatives at the Annie E. Casey Foundation, discussed trends in how state and local governments are using integrated data to better understand population needs and predict which individuals and communities are at greatest risk (e.g., children at risk of abuse or juveniles at risk of engaging in crime). Key ideas and examples (please see slides in Appendix C-7) included:

• Several high-profile predictive analytics initiatives recently failed because of flawed algorithms, botched implementation, and lack of transparency with community members. The following articles highlight the factors that led to abandoned predictive analytics projects:

• Data mining program designed to predict child abuse proves unreliable 5

• Palantir Has Secretly Been Using New Orleans to Test Its Predictive Policing Technology 6

• Well-intended, well-designed analytics that highlight overrepresentation of minorities in a risk group or system can be viewed as inherently racist and lead to community backlash. For example, when data from schools and local law enforcement are used to predict which youth are most likely to engage in crime, communities may fear that the predictions will magnify disparities and essentially operate like racial profiling.

• The MetroLab Network has published four principles and a useful checklist 7 for ethically applying predictive tools within human services. The principles are: (1) Engage (internally within the organization and with the community); (2) Pre-validate the model; (3) Review (by evaluating and adjusting the model); and (4) Open Up (proactively sharing information about the model).

Kingsley facilitated an interactive discussion with symposium participants about practical strategies for ensuring appropriate and ethical use of data and analytics.

5 Chicago Tribune (December 2017), “Data Mining Program Designed to Predict Child Abuse Proves Unreliable”: https://www.chicagotribune.com/investigations/ct-dcfs-eckerd-met-20171206-story.html
6 The Verge (February 2018), “Palantir Has Secretly Been Using New Orleans to Test Its Predictive Policing Technology”: https://www.theverge.com/2018/2/27/17054740/palantir-predictive-policing-tool-new-orleans-nopd
7 MetroLab Network’s Ethical Guidelines for Applying Predictive Tools within Human Services: https://metrolabnetwork.org/wp-content/uploads/2017/09/Ethical-Guidelines-for-Applying-Predictive-Tools-within-Human-Services_Sept-2017.pdf
