
Support skill development in the flow of work with AI video
AI has made learning accessible and on-demand.
"If I'm in the middle of work and need to learn something, I ask ChatGPT. It's often a good enough starting point to get a useful explanation or push my thinking."
That's how many people are learning now: using AI tools already embedded in their tech stack to get real-time help with real problems.
This reflects how learning actually happens on the job.
What's changed is what the business now expects L&D to enable: situations where people can practice, get feedback, and build skills in the flow of work.
The learning and development trends shaping 2026 are responses to that expectation.
Before looking at individual trends, it helps to step back and look at the pressures L&D teams are responding to.
Trend 1: Using AI for coaching and practice at scale
AI has already made content production faster. The more meaningful shift is how L&D teams are starting to use AI to support coached practice and feedback at scale.
This shows up when learning is designed around situations people actually face and the decisions they need to make in those moments.
In these designs, enablement takes the form of a coaching loop:
- A learner is placed in a realistic situation they are likely to encounter at work
- They respond by making a decision or taking an action
- Feedback clarifies what effective handling looks like
- The learner tries again, with variation
Human judgment remains central. L&D teams define expectations, set guardrails, and stay accountable for quality and outcomes, even as interaction scales.
Bringing this into your organization
Identify situations that lend themselves to scenario-based learning. These are often where AI coaching can support practice and feedback that improve on-the-job performance.
Trend 2: Treating skills as performance outcomes
L&D teams are starting to ground skills more directly in observable behavior: how someone handles a situation, makes a decision, or responds under pressure.
Instead of relying solely on skill labels or self-assessments, teams are paying closer attention to repeated patterns in how people perform in situations that matter to the business. Those patterns increasingly serve as evidence when thinking about development, readiness, and role progression.
This shift shows up in a few early but consistent ways:
- Skills are described through situations and behaviors. Skill definitions reference where a skill shows up in work and what effective handling looks like in those moments.
- Development is designed around demonstration. Practice and feedback are anchored in situations where skills are visible, making improvement easier to observe over time.
- Readiness and mobility draw on evidence from work. Skills begin to inform decisions about progression, role movement, and scope based on how consistently someone handles key situations.
A common example: managerial capability
Rather than treating people management as a broad skill, teams look at how managers handle recurring moments such as performance conversations or pushback. Over time, patterns in those interactions become evidence of readiness and development needs.
Over time, this is reshaping how skills libraries and career ladders are used. Skills libraries act as behavioral reference points, while career ladders signal readiness for what comes next, supporting mobility and succession in ways leaders already recognize.
Bringing this into your organization
Identify roles where leaders already make judgment calls about readiness or progression. Clarify the situations and behaviors leaders look for in those decisions, then design practice and feedback that help people demonstrate readiness more consistently.
Trend 3: Designing learning in the flow of work
Designing learning in the flow of work starts by identifying where work breaks down and where people get stuck.
L&D teams are increasingly starting from those moments, working with managers and functional leaders to identify recurring situations where performance varies, errors repeat, or judgment is inconsistent.
This work often concentrates on recurring patterns leaders already recognize as performance risks:
- Breakdowns in handoffs and coordination
- Process-heavy decisions under time pressure
- Inconsistent application of standards or policies
- Feedback that arrives too late to help
When L&D designs for these moments, people respond more effectively the next time, and that improvement shows up directly in performance.
Bringing this into your organization
Identify situations that lend themselves to scenario-based learning. These are often strong entry points for in-flow practice and feedback.
Trend 4: Measuring learning through application and impact
Most organizations already track learning activity. Completion rates. Attendance. Satisfaction scores like NPS. These metrics help L&D understand reach and perception, but they rarely answer the questions leaders are asking.
Those questions are about decisions. Who is ready to take on more responsibility? Where is performance breaking down? What should we reinforce, adjust, or scale next?
As learning moves closer to work and skills are grounded in observable behavior, measurement is shifting toward evidence leaders can actually use.
Teams are paying closer attention to what changes in the work itself and using those signals alongside traditional learning data. Completion and satisfaction still matter, but they no longer carry the whole story.
Whatβs emerging is a different measurement posture:
- Evidence is tied to specific situations. Measurement starts from recurring moments that matter, such as customer escalations, approval loops, safety incidents, or handoffs, where improvement is visible when learning is effective.
- Signals come from systems leaders already trust. Manager observations, quality reviews, rework rates, customer outcomes, and operational metrics become part of how learning impact is understood, not a separate reporting layer.
- Measurement informs action, not just reporting. Evidence is used to decide where to invest next, who needs support, and which practices are ready to scale, supporting readiness, mobility, and performance decisions.
On their own, though, completion data and satisfaction scores rarely explain whether learning is changing outcomes the business cares about. That gap is why many L&D teams are adding work-based evidence, such as manager observations, quality reviews, and operational metrics, to their existing measurement and reporting.
The goal is to help leaders see whether learning investments are influencing business outcomes in the situations that matter most.
Bringing this into your organization
Start with a business outcome leaders already care about. Work backward to identify what would need to change in how people handle those situations, then look for evidence of improvement after practice and feedback are introduced.
Trend 5: Governing AI with human judgment
AI is now speaking, responding, and recording inside learning workflows.
Avatars deliver messages on behalf of the organization. AI coaches provide feedback. Practice sessions generate transcripts, responses, and performance signals. As this happens, questions of ownership and responsibility become unavoidable.
L&D teams are clarifying what happens across the learning loop: what the system says, what gets stored, what is reused, and what data is collected along the way.
Teams are working through issues such as:
- Who owns what the AI says and shows. Avatars and AI coaches represent the organization, so scripts and scenarios require clear approval before being used broadly.
- How learner data is handled. Transcripts and performance signals raise questions about retention, access, and review.
- How internal knowledge and IP are used. AI systems draw on internal policies and examples, requiring coordination with IT and Legal.
- What learners understand about AI use. Clarity about when AI is providing guidance and where human judgment applies helps maintain trust.
As AI-supported learning scales, governance becomes part of learning design itself, shaping responsibility, trust, and alignment as systems evolve.
Bringing this into your organization
Identify where AI is already active in learning workflows. Clarify ownership of content approval, data retention, and IP use in those moments.
What to do with these trends
These trends don't call for a reset. They call for focus.
The teams making progress aren't adopting everything at once. They're choosing a small number of situations that matter, designing learning around those moments, and paying attention to what changes.
AI helps scale this work. Clarity makes it effective.
Less coverage. More capability.
Less activity reporting. More help for real decisions.
That's where L&D earns its seat.
A learning challenge for L&D leaders
Before adding a new tool, program, or metric:
- Pick one situation where outcomes vary or work breaks down
- Define what better handling looks like
- Create one opportunity to practice, with feedback
- Decide what evidence from work would show improvement
- Be clear about who owns quality, data, and judgment if AI is involved
Do this once. Then do it again.
That's how trends turn into practice, and how L&D helps organizations work better at scale.
About the author
Amy Vidor, Learning and Development Evangelist
Amy Vidor, PhD is a Learning & Development Evangelist at Synthesia, where she researches emerging learning trends and helps organizations apply AI to learning at scale. With 15 years of experience across the public and private sectors, she has advised high-growth technology companies, government agencies, and higher education institutions on modernizing how people build skills and capability. Her work focuses on translating complex expertise into practical, scalable learning and examining how AI is reshaping development, performance, and the future of work.

What are the most important learning and development trends for 2026?
The most important learning and development trends for 2026 reflect a shift in expectations rather than a surge of new tools. As AI becomes a normal part of L&D workflows, organizations are increasingly focused on skills, performance, and measurable impact.
Instead of prioritizing content volume or completion rates, mature L&D teams are designing learning systems that build capability in context, support performance in the flow of work, and hold up under scale. AI is an enabler in this shift, but differentiation now comes from learning design quality, governance, and how closely learning is tied to business outcomes.
How is AI changing learning and development in 2026?
AI is changing learning and development by reducing the cost and time required to create and adapt learning content, while also enabling more personalized and responsive learning experiences. In practice, this means faster iteration, better localization, and new opportunities for coaching, practice, and reinforcement.
However, the biggest change is not speed. In 2026, AI is pushing L&D teams to rethink how learning supports skill development and performance. Teams that see the most impact use AI to support diagnosis, guided practice, and feedback.
Does using AI in learning undermine critical thinking or skill development?
This is a common concern, but research suggests the impact of AI on learning depends on how it is designed and used.
Recent research shows that AI can improve learning outcomes when it provides personalized, high-quality examples and scaffolding. In these cases, AI supports skill development rather than replacing it. The key finding is that AI can strengthen learning when it functions as a coach β helping learners see good examples, practice decisions, and reflect.
For L&D teams, this reinforces an important design principle: AI should support thinking, practice, and feedback.
What does L&D maturity mean in practice?
L&D maturity refers to how effectively a learning function translates activity into real capability and performance. Less mature teams tend to focus on delivering content, responding reactively to requests, and measuring success through participation or satisfaction.
More mature teams design learning systems around skills and outcomes. They integrate learning into work, use data to inform priorities, partner closely with the business, and apply AI deliberately rather than experimentally. In 2026, maturity is increasingly defined by whether L&D can demonstrate impact and support the organization through ongoing change.
What should L&D teams measure beyond completion rates?
While completion rates and satisfaction scores still have a place, they are no longer sufficient indicators of success on their own. In 2026, L&D teams are expanding measurement to include indicators such as skill proficiency, role readiness, internal mobility, time to competence, and observable behavior change.
The goal is not to measure everything, but to measure what matters for performance. Mature teams focus on metrics that reflect whether learning is helping people do their jobs more effectively and adapt as roles evolve.
When does in-person learning still matter?
AI-enabled and asynchronous learning have expanded what can be delivered at scale, but they do not replace all forms of in-person learning. Face-to-face learning remains especially valuable when trust, leadership, identity, or complex human dynamics are central to the outcome.
In-person sessions are often most effective for activities like leadership development, conflict resolution, culture building, and deep collaboration. AI and video-based learning work best alongside these moments by preparing people in advance, reinforcing skills afterward, and providing consistent support between live interactions.
How can L&D partner with other functions to drive business impact?
As AI becomes embedded in learning systems, L&D cannot operate in isolation. Driving business impact increasingly requires close partnership with functions such as IT, security, legal, HR, and business leadership.
These partnerships help ensure that learning initiatives are secure, ethical, scalable, and aligned with real organizational needs. In 2026, one of the clearest signs of L&D maturity is the ability to work cross-functionally to design learning systems that balance performance, responsibility, and human judgment.











