Women’s College Hospital is on the journey to becoming a Learning Health System (LHS). In this model, we foster a culture of curiosity and engagement in which everyone at the organization continuously asks thoughtful questions about our current processes, delivery, and outcomes to determine if and where we are adding value, and then assesses how we can work together to effect improvements.
What is a Learning Health System?
Did you know that, on average, it takes about 17 years for research findings to be implemented into clinical practice? A learning health system uses feedback and data to improve patient care in real time – not years from now.
An LHS embodies the integration of research, clinical care, operations, and the community, with everyone engaging in research and quality improvement. This fosters continual improvement and learning across all programs, centres, and institutes, with the aim of driving positive change for the best possible patient care.
A learning health system rests on four building blocks:
- It is patient-centered, focusing on individuals, families, and communities;
- It is data and evidence driven;
- It is fully supported and enabled by the organizational structure; and
- It takes place within a culture of ongoing rapid learning and continual improvement.
To achieve a learning health system, an entire organization – people, patients, and infrastructure – all need to be aligned.
LHS Seminar Series
Another great way to learn about an LHS is by attending one of our monthly virtual seminars! This is an opportunity for speakers from across the globe to share their experience with an LHS, the value it adds, and how it has influenced the way they conduct their work.
Upcoming LHS Seminars
| Dates | Speaker(s) | Topic | Institution |
| --- | --- | --- | --- |
| 10-Dec-24 | Krisztina Vasarhelyi & Margaret Lin | Getting started with an LHS | Vancouver Coastal Health Authority (Vancouver, BC) |
| 14-Jan-25 | Samantha Anthony & Sarah Pol | Integrating PROMs into EPIC using Voxe | The Hospital for Sick Children (Toronto, ON) |
| 11-Feb-25 | Sunil Kripalani | Realizing the LHS vision + LHS Pragmatic RCTs | Vanderbilt University Medical Center (Nashville, TN) |
| 4-Mar-25 | Romana Hasnain-Wynia & Laura Podewils | Realizing the LHS vision with a strong focus on advancing health equity | Denver Health Medical Center (Denver, CO) |
| 8-Apr-25 | Sangeeta Hingorani | Seattle Foodbank and Food Insecurity | Seattle Children’s Hospital (Seattle, WA) |
| 6-May-25 | LHS Working Groups (2) | 1) LHS Datasets 2) PROMs 3) LHS Collaboratory | WCH |
| 10-Jun-25 | | WCH Learnings (Internal Initiatives) | WCH |
The role of the LHS Steering Committee is to support the execution of the vision, goals, and mandate of WCH in building a Learning Health System (LHS). Meetings occur every 4-6 weeks from September to June.
Vision
To foster a culture of curiosity and engagement where everyone in WCH is continuously asking questions about our current processes, delivery, and outcomes, to determine if and where we’re adding value and then to assess how we can work together to effect improvements.
Goals
- Develop a consensus and common vision of LHS across WCH through engagement from various partners and community members;
- Build the Learning Health System Collaboratory, which will serve to plan, implement, evaluate and sustain the LHS model; and
- Establish the building blocks for an LHS, including hospital infrastructure and change management processes.
Membership
- Chairs
- Dr. Rulan S. Parekh MD, MSc (Vice-President, Academics; Staff Nephrologist; Professor, Department of Medicine, Institute of Health Policy, Management and Evaluation, University of Toronto)
- Dr. David Urbach MD, MSc (Head, Department of Medicine; Medical Director, Perioperative Services)
- Soumia Meiyappan (Manager, Learning Health System Collaboratory, Academics)
- Members
- Dr. Onil Bhattacharyya MD, PhD (Family Physician; Frigon Blau Chair in Family Medicine Research; Director of the Institute for Health Systems Solutions and Virtual Care)
- Dr. Noah Ivers MD, PhD (Family Physician and Clinician Scientist; Canada Research Chair, Implementation of Evidence-based Practice)
- Dr. Geetha Mukerji MD, MSc (Corporate Medical Information Officer; Staff Endocrinologist)
- Theresa Kay (Director, Professional Practice)
- Celia Laur (Implementation Scientist and Health Services Researcher)
- Chandra Farrer (Quality Manager, Quality, Safety and Patient Experience)
- Elaine Goulbourne (Clinical Director, Primary Care and the Peter Gilgan Centre for Women’s Cancers)
- Sandra Grgas (Clinical Director, Specialized Medicine & Mental Health)
- Sandra Robinson (Clinical Director, Perioperative Services)
- Brenda Chin (Manager, Organizational & Talent Development)
- Anne Forsyth (Director, Clinical Applications & Decision Support)
Note: We will be engaging two Experience Advisors, community members with lived experience, on this Steering Committee and are in the process of identifying these partners. Please see our page on Patient and Community Engagement for more details.
A timeline of the Learning Health System development at WCH over the next 5 years (2023-2028).
We invited two Experience Advisors (EAs) to join our LHS Steering Committee. Their role on the Committee combines providing feedback and input on ideas and projects with partnering with staff as equal Committee members.
Our approach to building an LHS at WCH is to start small, and then build out. In the next few months, we will develop “pilot” projects, which are ways of testing out different ideas and will certainly benefit from EA input, especially as we come up with these ideas and think about how they can be rolled out. For example, one pilot idea could be to look at patient questionnaires (known as patient-reported outcome measures, or “PROMs”) that could be used across different departments at the hospital.
The impact our EAs have had on our work towards building an LHS at WCH is already evident – one key area is the notion of survey fatigue that we need to consider as we investigate PROMs. As patients move across WCH, they may receive the same, if not similar, surveys, which can be overwhelming. Our EAs have shared that there are several surveys out there and that they often seem perfunctory – highlighting the apparent disconnect between the questionnaire and the care patients have received. They are helping us think of ways to ensure we do not create any additional burden for patients by organically integrating these questionnaires into patient flow. They have also highlighted the importance of communicating to patients what we are doing with this information and how it is connected to their care and informs care delivery. This exercise has also brought to light the other surveys that exist across the WCH landscape and how we can work with others within WCH to approach this work at an organization level.
We are confident that connecting with and involving our EAs in an ongoing manner will continue to make an impact on how we develop our LHS at WCH.
Below you will find a list of helpful resources that describe national and international examples of LHSs in development and stories of successful LHS implementations.
- The Learning Health Care Project
- The Learning Health System Collaboratory at the University of Michigan
- Institute for Better Health, Trillium Health Partners (Ontario)
- LHS Toolkit from Newcastle University
The Learning Health System (LHS) toolkit was developed by The International Network for Activating Learning Health Systems (INACT-LHS).
Please refer to this resource as a guide for developing and implementing an LHS initiative at WCH.
Coming soon
Dr. Blanca Bolea, MD, MSc, PhD
Staff Psychiatrist and Director of SCOPE-MH, Women’s College Hospital
“Patients help us see what we need to do. Demand is infinite, resources are limited. Where do we put our resources? New models and new ways of thinking about healthcare are needed. We focus on technology, but sometimes we need to think differently and be open to new ways of working.”
Background
- There is a global shortage of trained health workers: high vacancy rates in nursing, family medicine, and specialties; overtime hours and burnout; equity imbalances amplified during the COVID pandemic. The WHO recommended task shifting: the rational re-distribution of tasks among the workforce.
- Task shifting in mental health: social workers trained as therapists coordinate the clinic (supervised by psychiatrists), providing brief interventions, nurse navigation, and connections with other clinics. In 2022, the program supported 84 family practitioners.
- Task sharing: effective collaboration with range of practice models. Promote mental health, coordinate and provide effective services. Includes stakeholders: patients, families, providers, etc. in common goal.
Highlights
- SCOPE-MH: Provides direct care services and indirect consultation/resource finding. Ethos of service: low barriers, simple referral pathway, patient-led and needs-based interventions, mostly virtual interventions, equitable care for under-served populations. Patients and physicians are closely connected to social workers who screen, triage, and address psychosocial needs. If patients need a psychiatric consultation, the psychiatry team triages and sees patients. Consistent communication with the primary care physician (PCP).
- SCOPE-MH Hub model co-designed with patients, doctors, and specialists. Social workers speak to PCPs and provide orientation, handouts, case examples. Patient resource guide.
- Evaluation essential to maintain service, increase funding, and provide better care.
- Findings: Program well-received by PCPs and patients. Adoption rates were moderate and 50% of PCPs used the services. Physicians felt that SCOPE-MH saved time, that patients improved, and that they learned about new MH resources. Patients felt their needs were understood, their questions were answered, and that they were on the right path to improving their mental health.
- SCOPE-MH expansion: 7 sites in Toronto, 2 rural sites. CAMH following same model and evaluation.
- Community of Practice (CoP): group of people who share common concerns and interests and come together to fulfill goals. Patients provide input (Patient Advisory Committee) on evaluation, surveys, service policies, screening processes.
Lessons Learned
- Tradeoff between serving more patients vs. serving patients with higher needs. Low-barrier access is important, but the clinic sometimes sees complex patients that require many resources. Concentrating on doctors who speak a second language and whose patients have greater barriers to accessing care.
- We focus on technology, but sometimes we need to think differently and be open to new ways of working. If we did the program again, would start with a system (EMR) that everybody uses (standardized) to make evaluation easier. Different healthcare professionals using different systems led to many problems.
- Scale and spread the program across the province through outreach. Toolkit on website describes how to start a program from scratch.
- Problem with collaborative care: measures don’t work well if they are not at the level of patient care. Made a PREM that works well and is in use at other sites, but we need more. It is difficult to find a patient-oriented measure when patient outcomes are so variable. The qualitative perspective is valuable.
- Working on evaluating this initiative to formalize in future and expand to other professionals.
Dr. William E. Smoyer
Vice President and Director, Center for Clinical and Translational Research, Nationwide Children’s Hospital
“What would it look like if we did research on every patient every day? What if people were interested in using and analyzing data to improve care?” “Change is hard because people overestimate the value of what they have and underestimate the value of what they may gain by giving that up.”
Background
- Rising cost of healthcare is unsustainable. We need to do something different. Need systems to improve care while reducing costs. US National Academy of Medicine has called for a “Learning Healthcare System.” Integration of clinical care and research is a moral obligation to patients – need to use data to improve care.
- Significant barriers: Organizational readiness; different information standards; poor technology integration; poor workflow integration (profoundly different processes and cultures in care and research).
- Paradigm shift: imagine an LHS future state. All stakeholders serve a bigger goal. Interests aligned for better care at lower costs, with improved quality and value. Need innovation to overcome barriers.
- Provide evidence-based medicine and fully integrate it with evidence-generating medicine.
- LHS aggregates large datasets across multiple hospitals/systems. What can we learn from all that data?
- Vision: share our learnings to build an LHS from the bottom up.
Highlights
- “Learn from Every Patient Program” (LFEP): Share findings with other clinical programs. Can move to other clinical programs within a Learning Health Organization, and across systems.
- Start with all patients receiving evidence-based standardized care -> collect clinical and research data from patients during clinical visits -> migrate data to Data Mart (images, prescriptions, proteomics data etc.) -> systematically analyze data (translational research) -> publish and disseminate new knowledge. Cycle often stops here. But in LFEP vision, new knowledge drives incremental improvements in cost-effectiveness and standard of care. Systematic application of improvements to care of all patients (implementation science).
- All stakeholders change the way they do their work, clinical practice, collaboration.
- Ethics, legal services involved and nervous about this new program. What is the data for? How will it be used? Created ethics database protocol to be gatekeeper of data for anyone who wanted to access the data.
- Created routine clinical care-related data fields. Approached clinical care team and told them to deliver initial standard of care (evidence-based standards + expert clinical judgment). Cannot measure and prove improvements unless you show changes, and cannot measure change without a defined objective starting point. Everyone in trial would get the same care, documented in the same way each time.
- Built research data fields and data elements to populate answers to questions in EHR. Fill out all the data when patients come in. Clinical and research data fields entered in the same EHR. All documentation done in context of billable patient encounter. Data then migrated to Data Mart.
- Experiment: Developed, implemented and evaluated a model of EHR-supported care in 131 children with cerebral palsy. In every visit, integrated clinical care, QI, and research.
- Outcome measure: healthcare utilization rates and charges 12 months later.
- LFEP group (standardized care, routine clinical data collected in EHR). The EHR has always been incomplete and error-prone, so someone with content expertise was hired to check the data.
- Results: Utilization Rates – LFEP group had reductions in inpatient admissions, total inpatient days, ER visits, and urgent care visits compared to pre-LFEP AND non-LFEP group. Healthcare Charges – LFEP group had reductions in inpatient, ER, and urgent care charges.
Lessons Learned
- Needed to get doctors on board – had already asked them to give up autonomy. Asked them to craft 3-5 questions about their patients that if answered, would improve the care of their patients. Full-time clinical staff participate in today’s care, but rarely get the opportunity to generate tomorrow’s care.
- Broad-based buy in from all stakeholders is essential for success.
- Came up with a workable solution for IRB and the Legal team.
- Integration requires significant culture change; people had to change how they did their work.
- Benefits: Systematic improvement in clinical care, reduce healthcare costs, market advantage for robust delivery of evidence-based care, opportunity to phenotype biologic samples, PROMs/PREMs, career advancement of academic faculty. Clinicians received data about their patients’ care.
Dr. Peter Margolis
Co-Director, James M. Anderson Center for Health System Excellence, Cincinnati Children’s Hospital
“What if your doctor could tell you what you could expect, based on the last thousand patients like you that she saw?”
Background
- Pediatric IBD pilot: limited data about which treatments work for Crohn’s disease. A clinical trial can give an idea, but will not guarantee personalized treatment.
- Patients, clinicians, and researchers worked together to create sophisticated model-based dosing approach embedded in EHR and part of a trial. Metaphor: we often see the trains (doctors, nurses, hospital) and not the tracks (how we work). Re-imagine how we give care.
- Network organization: behind the scenes forces that shape our work. Three components: 1. Actors (patients, hospitals, people). 2. Common resource pool of data, knowledge. 3. Processes and protocols that make collaboration possible.
- What could healthcare achieve under a model of openness and collaboration? Immense potential in collaborative networks. Leads to better patient outcomes and system effectiveness.
- How to create a Learning Health Network? 1. Create shared purpose and community through focus on outcomes and urgency for results. 2. Create shared processes, technical, and data infrastructure. 3. Create shared resources of information, knowledge. 4. Build a rapid learning system to continuously evaluate progress and optimize network performance.
Highlights
- ImproveCareNow: LHS for IBD. Started in 2010. ~82% of children are now in remission. 105 care centres, 50,000 patients in registry. 20 networks and organizations internationally.
- Stepwise approach to triple use registry: collect data once for clinical care, QI, and research. Start with Case Report Form. Physicians test most important variables and outcomes to produce useful QI reports. Build EHR forms so data collection takes place at the point of care.
- Near Real-Time Situational Awareness: On any given day, can get a report on the % of kids in remission. Can sort and filter data to see other centres’ work in the ImproveCareNow network (a minimal sketch of this kind of report follows after this list).
- Awareness -> Participation -> Contribution -> Ownership. Patients can contribute and become owners where they create tools and lead a team.
- Shared resources in network. Model IBD Care updated yearly by clinical committee, for updated best practices. Website: different resources created by patients for patients, with community.
- Network for research. Patients teamed up with doctors to create a trial focused on personalizing dietary treatments for Crohn’s disease. Patients randomized to diets and tracked their symptoms. Outcomes measured using IBD registry. Used an app with a patient-facing end and back-end for data storage and analyses (Eureka). Results of trial provided individualized answers for each patient. About 35% of patients could avoid additional medications with dietary treatment. Aggregated data for population effect.
- Networking sprints – people come together to solve specific problems.
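To make the registry reporting idea concrete, below is a minimal sketch in Python/pandas of the kind of near real-time report described above: the percentage of children currently in remission, overall and by care centre. The file and column names are hypothetical placeholders, not the actual ImproveCareNow registry schema.

```python
# Minimal sketch (hypothetical column names) of the kind of near real-time
# report described above: % of children currently in remission, by centre.
import pandas as pd

registry = pd.read_csv("improvecarenow_registry_extract.csv",  # hypothetical extract
                       parse_dates=["visit_date"])

# Keep each patient's most recent visit, then summarize remission by centre.
latest = (registry.sort_values("visit_date")
                  .groupby("patient_id", as_index=False)
                  .last())

by_centre = (latest.groupby("care_centre")["in_remission"]
                   .mean()
                   .mul(100)
                   .round(1)
                   .sort_values(ascending=False))

print(by_centre)  # percent of patients in remission at each care centre
print(f"Network-wide: {latest['in_remission'].mean() * 100:.1f}% in remission")
```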
Lessons Learned
- How to engage everyone in LHS model? Work on things that matter to patients most – can they live the life they want (exercise, travel etc.)? Start with understanding what’s important to people with lived experience.
- Developed a relationship with Epic to create data-capture tools in the EHR and distribute them to the network. They are willing to do this when hospitals come together and agree on what to collect.
Dr. Lillian Sung, MD, PhD
Professor, Senior Scientist, Chief Clinical Data Scientist, The Hospital for Sick Children
Background
- SickKids (SK) is the largest pediatric tertiary care hospital in Canada, but it experiences challenges with data access. Solution: the SickKids Enterprise-wide Data in Azure Repository (SEDAR), which provides a deidentified data source and enables a new machine learning (ML) program. A centralized source of curated data for multiple purposes: used to populate dashboards, create reports, and build data sets for research users.
- In 2018, SK transitioned to Epic. To enable reporting, the data are transformed into a relational database (Clarity). It is very large (18,000 tables), and most data science programs use it because it is a direct translation of the real-time data.
- One data element distributed through many tables. The same data element may be in different tables depending on the workflow. Makes it challenging to find any particular element. Knowledge is often lost with each new query; no way to easily save knowledge.
- Decisions should be based on best-available data and delivered quickly (hours vs. months).
Highlights
- A copy of the Clarity database is put in MS Azure and updated every night; every morning, a complete replicate of the data is available in Azure. A copy of Clarity is used so that queries do not interfere with hospital operations/workflows.
- Identify important tables and transform data so it’s easy to understand and query. Make de-identified data set in OMOP Common Data Model (only for research).
- Now have a source of data for ML. Can find outcomes, but also use it to predict features and train model.
- In 2022, a report emerged of unexplained hepatitis in children worldwide. An urgent request was made at SK to understand patients who had hepatitis. The requestor was told it would take months, so they turned to SEDAR and the request was completed in 1 day. Found all children who met the criteria since 2018, described prevalence, tests done, and outcomes. Hepatitis had not changed at SK over time.
- Using SEDAR to meet reporting needs. Submit data to CIHI (admissions, discharges, visits, surgeries, etc.), with SEDAR as the source of truth. QI: create dashboards (antibiotic utilization), as well as lab testing utilization to improve prescription practices.
- Rheumatology flowsheets to track children with arthritis (QoL and PROMs data). The team wanted to obtain flowsheet data; by the time they approached the SEDAR team, the request had been waiting 1 year. They wanted help to create an initial database and to repeat this. The initial query was performed in a few hours, and now all flowsheets can be easily updated because of the curated schema structure.
- Multi-center research. Normally extract data and send to 1 person to collate and clean data. Institutions use different terms. Need to map terms across centres. What if all centers had data in same format, same structure, and same vocabulary? Could do analyses independently and then synthesize results. This is the idea behind OMOP.
- OMOP: Observational Medical Outcomes Partnership Common Data Model. Standardizes the structure and vocabulary of data to facilitate global collaboration. Concept: every medical entity is labeled with a standard concept ID to ensure interoperability. Can write one query and apply it at any institution to get the answer (a minimal query sketch follows below).
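To illustrate why a common data model matters, below is a minimal sketch of a portable OMOP query. The table and column names follow the OMOP Common Data Model, but the database connection and the concept ID shown are illustrative assumptions rather than details from the talk.

```python
# Minimal sketch of a portable OMOP CDM query. Table and column names follow
# the OMOP Common Data Model; the connection string and concept ID are
# illustrative assumptions only.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql://localhost/omop_cdm")  # hypothetical database

# Because every condition is mapped to a standard concept ID, the same query
# can run unchanged at any institution whose data are in the OMOP CDM.
query = """
SELECT p.person_id,
       c.concept_name,
       co.condition_start_date
FROM condition_occurrence co
JOIN concept c ON c.concept_id = co.condition_concept_id
JOIN person  p ON p.person_id  = co.person_id
WHERE co.condition_concept_id = 201606  -- illustrative standard concept ID
"""

cohort = pd.read_sql(query, engine)
print(f"{cohort['person_id'].nunique()} patients meet the criteria")
```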
Dr. Adam Yan, MD, PhD
Assistant Professor, University of Toronto; Pediatric Oncologist, The Hospital for Sick Children
Background
- PREDICT: clinically oriented ML group at SK. Goal is to bring ML to bedside to improve patient care and provider experience.
- ML life cycle. Take data in SEDAR/Epic and develop model using historic data. Test and integrate model in clinical workflow. Physicians receive predictions and can use information to adjust care -> ongoing monitoring based on feedback.
- Evaluate healthcare team context and clinical context.
- Pick an outcome. Is the problem clinically important? Common? Measurable in EHR?
- Build model. Reasonable expectation that EHR data would be able to predict outcome.
- Implement model. Know that the model prediction would change clinical care: what would be the difference in care or outcomes if care were adjusted based on those predictions?
- ML implementation relies on the clinical team to accept and adopt the framework. For every problem, identify a clinical champion who is willing to lead. That person will survey peers and confirm that the clinical team is receptive to receiving the information. Don’t want to provide predictions that people would not trust or use in their care.
Highlights
- Predict vomiting in pediatric cancer patients to intervene earlier and change anti-emetics. It met all clinical context criteria. Then identified clinical champions (doctors, pharmacists).
- Proceeded with ML model. Who will receive predictions, and would someone be available at that time to receive prediction and intervene? Focus on clinically interpretable metrics that clinicians can understand = false positive and false negative rates.
- Build features for the predictors in the model using all patients’ historic data in Epic; a large volume of data goes into each predictor. Then split the data, train the model, test it on a separate set of data, and validate the model.
- Look at existing thresholds and baselines. Perform sub-group analysis, e.g. across different age groups and men vs. women, to make sure the model is performing fairly (see the evaluation sketch after this list).
- When enough predictions are made, re-evaluate performance and work with end users to decide whether to make model live. Integration into clinical workflow. Work with end users to create clinical pathways and provide context for predictions.
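Below is a minimal sketch of the evaluation steps described above, assuming a de-identified feature table has already been extracted from the EHR: hold out a test set, report clinically interpretable false positive and false negative rates, and repeat the calculation for each sub-group. The file name, column names, and classifier choice are hypothetical, not the model PREDICT actually deployed.

```python
# Minimal sketch of the evaluation steps above, assuming a de-identified,
# all-numeric feature table has already been extracted from the EHR.
# File and column names ("age_group", "vomiting_within_24h") are hypothetical.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split

df = pd.read_csv("chemo_visit_features.csv")
X = df.drop(columns=["vomiting_within_24h", "age_group"])
y = df["vomiting_within_24h"]

# Hold out a test set so the model is judged on visits it has never seen.
X_train, X_test, y_train, y_test, grp_train, grp_test = train_test_split(
    X, y, df["age_group"], test_size=0.2, stratify=y, random_state=42
)

model = GradientBoostingClassifier().fit(X_train, y_train)

def fpr_fnr(y_true, y_pred):
    """Clinically interpretable error rates: false positives and false negatives."""
    tn, fp, fn, tp = confusion_matrix(y_true, y_pred, labels=[0, 1]).ravel()
    return fp / (fp + tn), fn / (fn + tp)

print("Overall FPR/FNR:", fpr_fnr(y_test, model.predict(X_test)))

# Sub-group analysis: error rates should be comparable across groups
# (very small sub-groups may need to be pooled before this check).
for group, idx in grp_test.groupby(grp_test).groups.items():
    print(group, fpr_fnr(y_test.loc[idx], model.predict(X_test.loc[idx])))
```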
Dr. Lin Lawrence Guo, PhD
Machine Learning Specialist, The Hospital for Sick Children
Background
- Since ChatGPT, there has been an explosion of AI applications. Foundation Models (FMs) are large AI models trained on vast datasets at scale using self-supervised learning. E.g.: the GPT language model is trained to predict the next word in a sentence in order to learn the structure of language.
- In healthcare, FM for Structured EMR (FEMR) can summarize entire patient medical history to any point in time. Once model is trained, can learn structure from EHR and use it to summarize the entire patient history into a set of features.
- FEMR evaluation. Models tend to decay over time due to changes in patient population or clinical care practice. Traditional learning model performance decays faster than FEMR. FEMR also performs better overall. Models trained in one population tend not to perform well in another population.
- Used an external foundation model from Stanford to generate patient features for an SK cohort, and used these features to train prediction models. Also trained a FEMR on the SK EHR to capture local data heterogeneity and generate features. The external FEMR performed competitively with traditional ML, and FEMRs also outperform traditional ML when there are few training examples (see the sketch after this list).
- Sharing foundation models across institutions and knowledge on how to adapt them.
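Below is a minimal sketch of the temporal-decay check described above: train a simple probe once on foundation-model-generated patient features from an early period, then track discrimination on later years. The embedding columns and file name are hypothetical placeholders; the actual FEMR pipeline and SK data are not reproduced here.

```python
# Minimal sketch of the temporal-decay check described above: fit a simple
# probe once on foundation-model patient features from an early period, then
# track discrimination on later years. File and column names are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

df = pd.read_parquet("femr_patient_features.parquet")  # hypothetical extract
feature_cols = [c for c in df.columns if c.startswith("emb_")]

train = df[df["prediction_year"] <= 2019]
probe = LogisticRegression(max_iter=1000).fit(train[feature_cols], train["outcome"])

# A shrinking AUROC in later years signals decay driven by shifts in the
# patient population or in clinical practice.
for year, cohort in df[df["prediction_year"] > 2019].groupby("prediction_year"):
    auroc = roc_auc_score(cohort["outcome"],
                          probe.predict_proba(cohort[feature_cols])[:, 1])
    print(f"{year}: AUROC = {auroc:.3f}")
```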
Sue MacRae, RN, MEd, RP
Transforming Healthcare through Asynchronous Trauma-Informed Care (TIC) Education
Background
- Trauma survivors are high utilizers of the healthcare system but also often report that their needs go unmet.
- Healthcare professionals also prone to trauma and exposed vicariously through work with patients.
- TIC education strategies are increasing, but there is a gap in accessible, affordable, scalable training for staff. Seeking to address this gap with a course that can be scaled and spread broadly.
Highlights
- Phase 1 (Needs Assessment) to understand potential barriers to implementing virtual asynchronous TIC curriculum in anonymous survey of healthcare providers; Semi-structured interviews with patients and providers; Feedback allowed for curriculum to be outlined:
- 81% indicated TIC relevant for their role/area and 93% would engage in virtual TIC course.
- Staff and patients identified top topics and indicated need for confidence around identifying trauma and tools to identify symptoms. Providers want specific practical tools in application of knowledge after identifying trauma.
- Phase 2: Developing and testing TIC curriculum. Will build multi-media curriculum using continuous learning approach with feedback from patients, healthcare staff, and healthcare experts. Vision: everyone in the hospital knows about TIC. Once curriculum built, pilot launch to evaluate.
- Phase 3: Evaluate implementation of course. Build and test additional modules specific to needs of different groups. Iterative testing and improvements.
Lessons Learned
- Healthcare providers interested in virtual, self-paced course on TIC.
- Needs assessment confirmed gap and potential solution. Involving patients in every phase.
- Educational initiatives in TIC must focus on translating knowledge to practice. Once launched, TIC course can change practice. Patient-forward, research-driven, and collaborative approach.
Chandra Farrer, PT, MSc and Jacqueline Follis, MSN, RN
Revolutionizing the Best Practice Guideline (BPG) Program at WCH
Background
- In 2015, WCH received Best Practice Spotlight Organization (BPSO) designation from the Registered Nurses’ Association of Ontario (RNAO). Focus: moving research evidence to implementation through BPGs.
- Developed BPG QI Capacity Building Program for nurses, health disciplines and staff to increase QI capacity building and support health system needs. Monthly didactic QI training to apply to a project, with coaching throughout. Leveraged existing structures to reallocate resources. Collaborated with leadership to identify projects and staff embedded in clinical operations and ensure balanced workloads. Engaged with decision support, IM/IT, and APQIP to access data.
Highlights
- Increased from 9 BPG champions in 2021 to 106 presently. ~17% of nurses/health disciplines have participated.
- 30 projects using 8 BPG guidelines across 26 programs; support improving patient outcomes, safety, and interprofessional practice. Formation of cross-clinic teams for knowledge exchange across WCH.
- Program fosters culture of rapid learning and enhances competencies. Grassroots approach fostered collaboration between frontline staff and leadership.
- Next steps: strengthen patient engagement, provide guidance on BPG collection, and provide input on projects. Work towards spread and sustainability.
Lessons Learned
- Build QI operational infrastructure at program level to ensure continuity and lead future work.
- When projects embedded in program priorities, more likely to succeed.
- Coaching, mentorship, and developing local leadership are critical.
- In-person component built excitement into the program and maintained interest.
- Healthcare organizational culture is integral to success/failure of initiatives.
Arun Prasad, MBBS, DA, FRCA, FRCPC
Pushing Boundaries in Ambulatory Anesthesia
Background
- How can we improve ambulatory surgery? Some patients are not selected for same-day surgery. Which patients can be included so they can recover at home? Can we modify pathways?
- Many patients on surgery wait lists. Ambulatory surgery is a bonus for the health system. 2 procedures common at WCH: inguinal hernia repair; hysteroscopic surgery.
Highlights
- Inguinal hernia repair: a simple procedure, usually not urgent, but if it is not operated on in a timely manner, the hernia can become strangulated and turn into an emergency. Hysteroscopy: also a simple and short procedure, usually done under general anesthetic, which takes more time and requires airway intervention, with a higher risk of respiratory complications and symptoms.
- Looked at surgical techniques – sometimes done under local anesthesia and sedation. Tried Fast Track Model at WCH. Pre-op: identify suitable patients + anesthetic assessment to optimize patients. Local anesthesia by surgeon. Faster recovery, pre-emptive anti-emetics and pre-emptive analgesia to improve recovery. Intervention at all 3 surgical phases modified.
- Results: # of local anesthetic inguinal surgeries almost doubled. Cut down on general anesthesia to almost 0 cases in 2023. Similar trends in hysteroscopy: more being done under local anesthesia.
Lessons Learned
- Brought Fast Track model into practice. Expect further data collection and more robust feedback. Explore other procedures: e.g.: knee surgery. Collaborate with care pathways between departments.
Dr. Thomas Foley
Principal Investigator, Learning Health Project, Newcastle University (UK)
Background
- An LHS learns from every patient. But this is easier said than done: it requires effort and investment to achieve.
- LHSs operate at many different scales. Some look at cohorts of patients, treatment modalities, diseases, etc.
- Newcastle developed a framework for designing and evaluating LHSs to inform on how to build one. Each organization will need to co-design its own LHS. Framework can help find gaps, and where investments are needed.
- Start by celebrating how much work is already underway that fits into the LHS model.
- Toolkit developed by the Learning Health Project Team: https://lhstoolkit.learninghealthcareproject.co.uk/
Highlights
Key questions from Framework:
- Why do you want an LHS? E.g., Improve outcomes, value, generate knowledge, apply knowledge from outside the system, maximize technology, boost clinical performance.
- Which technical building blocks? Every LHS has a cycle (Practice, Data, Knowledge) and learning community at the centre: people come together, reflect, and co-design how they learn and improve; positive error culture.
• Getting data from practice is difficult. Data saves lives; knowledge applied at frontlines.
• Knowledge to practice: changes take a long time but there are techniques to speed up this process.
• Identify platforms to scale which can be implemented and shared among organizations. Digital solutions are complex and don’t work all the time.
- What Strategic Approaches to Change? Need to be aligned with strategy of organization. Important to have structure and governance in place to deliver. Need workforce capable to do this and leadership to promote.
- The LHS Toolkit: a collaboration across the UK, Ireland, Europe, Australia, and the US. Built a website repository of tools to guide people toward an LHS. Pre-loaded with 70-80 tools, with more to be added; a dissemination plan is ongoing.
- Initial version launched and presented at a conference with feedback. Then gathered interested stakeholders to take part in user testing, and integrated findings into the system.
- Can click on different parts of the LHS wheel to access different resources.
Lessons Learned
- Reason LHSs can fail is uncontrolled complexity within the organization. Difficult to create a system that will meet all pressures. Technology that needs to be heavily integrated or experimental adds complexity. Need realistic expectations of adoption. If LHS threatens clinical professional identity, will nurture resistance and complexity. If too much complexity across too many domains, will be difficult to succeed.
- Need to operate at the scale you have control over. The core cycle of Data, Knowledge, and Practice will be the same. Provider organization is a classic level to do an LHS because there can be lots of control within.
- Practice to Data to Knowledge is in essence a QI model, but organizations often struggle scaling up QI. The infrastructure makes the difference and allows QI to be scaled up – e.g. informatics and digital platforms.
- LHS Toolkit might help an organization that has the resources and determination for an LHS. Run lots of experiments and monitor closely with rapid cycle evaluation. Quickly discard the ones that don’t work and learn from them. For the ones that work, amplify, scale and put resources behind them. Harness experimentation that is already going on and learn from it.
Tammy MacLean, RN, PhD
Research Associate, WCH Centre for Wise Practices in Indigenous Health and Centre for Digital Health Evaluation (WIHV)
“Frameworks should look into ‘how’ rather than ‘what.’ So, the process helps us engage with communities. Each community will have its own approaches and frameworks. How as an LHS do we identify ourselves?”
Background
- Canada has a harmful history with Indigenous health. Similar to residential schools, Indian hospitals were oppressive. Racially segregated healthcare and enforced hospitalization. Patients physically restrained, underwent experimental treatments (painful and disabling).
- Provision of healthcare for Indigenous peoples today falls under the 1939 Supreme Court ruling that federal government has legal responsibility for Indigenous health.
- Environmental scan of 37 resources to understand frameworks, guidelines, and toolkits in the LHS literature that advance health and address anti-Indigenous racism. Guidelines were reviewed and critiqued from an Indigenous perspective. The literature has some limitations in its applicability to Indigenous health.
Highlights
- Accountability Checklist for healthcare providers: for organizations thinking about LHS.
- See Dr. MacLean’s paper for more details: https://onlinelibrary.wiley.com/doi/full/10.1002/lrh2.10376
Lessons Learned
- Difficult to change culture in an organization, especially with micro-cultures.
- OCAP training should be mandatory for those who make decisions about patient data.
- Need ongoing maintenance of training. Information dumps are not helpful to inform practice on a daily basis.
- Within frameworks, there needs to be recognition that definitions of progress/success are diverse depending on who is interpreting them. Power dynamics: who is speaking for whom, and how?
Joshua C. Rubin, JD, MBA, MPH, MPP
Program Officer, Learning Health System Initiatives, University of Michigan
Executive Director and Vice-President, Board of Directors, Joseph H. Kanter Family Foundation
“When patients can access their own data, they can transform their care.”
Background
- The vision: every decision affecting health should be informed by actionable knowledge of what works best. This became the basis of the LHS. Empower and increase the capacity of individuals to make choices and to transform those choices into actions and outcomes.
- Need to build a communication system to get knowledge to the bedside.
Highlights
- Learning Loops in UofM. Forming community around problem of interest, being thoughtful about how to collect data and mobilize knowledge. Multiple cycles of continuous improvement. Hub of data scientists, clinicians, researchers, etc. Need people to ask good questions.
- What will the health system look like in the future? Will it be more equitable, with greater patient and clinician satisfaction? Empower people to think for themselves, have data, knowledge and tools they need to make decisions.
- LHS Collaboratory at UofM: Forged connections across campus, across disciplines and skills; in-house consultants; meeting place to discuss and showcase different learning health projects.
- UofM built a Legal, Social and Ethical Implications unit for LHS to de-identify data and bring a patient’s personal data back to improve their own care. Get more data by consent; Protect privacy but put the patient at the centre of what they want to come from their data.
Lessons Learned
- Need the culture in place at the organization level to become an LHS, or else there will be resistance to learning.
- To achieve agreement on building an LHS, people need to understand it, identify its components, and know how to achieve it: every patient’s data available for study; best-practice knowledge fed back quickly; learning as part of the culture and infrastructure. The community has a maturity model. Hoping for collaboration with other interested institutes.
Dr. Laura Desveaux, PhD, PT
Scientific Director, Institute for Better Health, Trillium Health Partners
Leadership Fellow, International Women’s Forum
“The least effective way to get someone to change their behaviour is to tell them to do it differently.”
Background
- Health system and policies in the past 50 years haven’t worked. Need a new system for a healthier community. Learn from what works and what doesn’t.
- LHS is a union of the health system and research, involving knowledge translation, and rapid cycle learning and evaluation.
Highlights
LHS Framework: a learning engine in which learning and health system gears intersect.
Lessons Learned
- Before embarking on an LHS, need to identify a clear vision of what the organization wants to achieve; Everyone needs to be aligned on the concept and how to operationalize an LHS to have productive discussions.
- Expect to see iterations over time. Trial and error. Need humility and willingness to fail. Our job is to learn things, not to know things. Be open about failures and losses, as well as successes. Requires people to be vulnerable and willing to admit what they don’t know. Will lead to organizational resilience.
- Don’t confine LHS to a single institution. Partner with system more broadly: formal healthcare sectors and others that impact on health (e.g., education). Local context is unique to a certain health system – specific constraints, resources unique to that organization.