Most corporate training teams moved to virtual classrooms during the pandemic. They picked Zoom, Teams, or Webex. They called it virtual classroom training. Then they spent the next five years wondering why completion rates stayed flat and new hires still took five months to ramp.
The tool was the problem. Not the format.
Virtual classroom training works. The evidence on that is not ambiguous. But most organizations are not running virtual classroom training. They are running meetings with slides and calling it learning. The distinction is not semantic. It is the difference between training that changes behavior and training that checks a box. Here is what the data shows, and why the gap between what virtual classroom training could deliver and what most L&D teams are delivering is larger than anyone has bothered to calculate.
A 2025 study by Training magazine, Class Technologies, and Microsoft surveyed 661 L&D professionals and found that 72% cited learner engagement as their top obstacle in virtual instructor-led training, even as nearly all organizations reported using it.[1]
That number is telling, but not for the reason most people think. The engagement problem is not a training design problem. It is a tooling problem. The same study found that L&D leaders want engagement analytics, enhanced breakout rooms, built-in assessments, and better participation monitoring.[1] Those are not features you add to a meeting platform. Those are features of a purpose-built virtual classroom.
A virtual classroom, as most organizations use the term, is a video call with a presenter. One person talks. Everyone else is muted. Cameras are off. The facilitator is presenting into a grid of black squares and hoping someone is still listening. That is not a virtual classroom. That is a webinar with attendance tracking.
A virtual classroom, in the sense that produces measurable outcomes, is a structured environment where learners solve problems in small groups, respond to prompts, work through scenarios with peers, and get real-time feedback from an instructor who can see exactly who is participating and who is not.
The difference matters because the outcomes are not close.
The largest meta-analysis of undergraduate STEM education to date, led by Freeman et al. and published in the Proceedings of the National Academy of Sciences, analyzed 225 studies comparing traditional lecturing to active learning. The findings: exam performance improved by roughly 6% under active learning, and students in traditional lecture courses were 1.5 times more likely to fail. Average failure rates were 33.8% in lecture-based courses versus 21.8% in active learning courses.[2]
Those results are from higher education, but the mechanism is the same in corporate training: people learn by doing, not by watching. The question is whether your virtual classroom is designed to make learners do things.
Engageli’s teaching and learning research, run across live corporate deployments, found that active virtual sessions produce 54% higher test scores than the same content delivered passively. Learner talk time is 13x higher. Non-verbal engagement is 16x higher.[5] Those are not engagement metrics. They are predictors of whether the training actually transferred.
For a sales team of 200 quota-carrying reps, that gap in knowledge retention shows up directly in close rates, deal size, and time to first deal. The 54% is not a feel-good number. It is a revenue variable.
This is not a criticism of Zoom. Zoom is excellent at what it was built for: connecting people in real time for conversations. But a conversation and a learning experience are structurally different. A meeting needs a shared screen and a chat box. Training needs small-group collaboration, real-time assessment, participation data, and an environment that makes it harder to hide than to participate.
The 2025 Training magazine / Class / Microsoft study reinforces the point. When asked what enhancements they most wanted for their video conferencing tools to improve VILT, L&D professionals said: assessments (41%), engagement analytics (40%), enhanced breakout rooms (39%), and participation monitoring tools (37%).[1] Those are not incremental improvements to a meeting platform. They describe a different category of product.
Zoom does not have persistent small groups. It has breakout rooms that a host manually creates, assigns, and monitors one at a time. There is no built-in analytics dashboard showing which learners are engaged and which checked out ten minutes ago. There is no table architecture that keeps five people working together across an entire session without the facilitator having to rebuild the room every time.
Engageli’s internal data from corporate deployments shows active participation rates of 62.7% on its platform compared to roughly 5% in standard video conferencing tools used for training.[7] That is not a rounding error. That is a 12x gap in the percentage of learners who are actually doing something besides watching.
When L&D teams say virtual training does not work, what they usually mean is: virtual training on a meeting platform does not work. They are correct. But the conclusion many draw, that in-person is the only format that delivers, is where the math breaks down.
U.S. companies spent $102.8 billion on training in 2024–2025, an average of $874 per learner.[4] Of those training hours, 28% were delivered via instructor-led classroom and 24% via virtual classroom or webcasting, down from 27% the prior year. The slight decline in virtual delivery does not mean organizations are abandoning it. It means they are getting more selective about when and how to use it.
The standard justification for in-person training is that it produces better outcomes. For active in-person training with a skilled facilitator, that has historically been true. Retention rates for well-run in-person sessions sit around 85–90%.
Here is what most CLOs do not put into the cost model: active virtual classroom training, run on a purpose-built platform with the right facilitation, produces equivalent retention at a fraction of the cost.
Engageli’s total cost of ownership analysis puts the cost per learner for active virtual training at roughly one-tenth the cost of active in-person delivery. The math is structural. An in-person session with a skilled facilitator caps at 25–40 learners. Active virtual sessions on Engageli handle 120–150 learners with the same level of small-group interaction, because the table architecture scales the facilitation model. To reach 1,000 learners, you need 25–40 in-person sessions or 7–9 virtual ones.[6]
That is not a marginal efficiency gain. That is a structural cost difference.
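As a sanity check, the session arithmetic can be reproduced in a few lines. The capacity ranges are the ones quoted above; this is an illustrative back-of-envelope sketch, not a cost model, and the variable names are my own:

```python
import math

def sessions_needed(learners: int, capacity: int) -> int:
    """Sessions required to reach all learners at a given per-session capacity."""
    return math.ceil(learners / capacity)

learners = 1_000
in_person_caps = (25, 40)    # skilled-facilitator classroom range cited above
virtual_caps = (120, 150)    # active virtual range cited above

in_person = sorted(sessions_needed(learners, c) for c in in_person_caps)
virtual = sorted(sessions_needed(learners, c) for c in virtual_caps)

print(f"In-person sessions needed: {in_person[0]}-{in_person[1]}")  # 25-40
print(f"Virtual sessions needed:   {virtual[0]}-{virtual[1]}")      # 7-9
```

The gap compounds at scale: every additional thousand learners adds 25–40 in-person sessions to schedule and staff, versus 7–9 virtual ones.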
And the retention numbers hold. Active virtual training with AI-reinforced follow-up produces 85–90% retention, on par with the best in-person programs.[5] The variable is not the modality. It is whether the learner is doing something or watching someone else do something.
People Untapped, a leadership development firm, ran a global program through Engageli for over 1,200 learners. They measured a 21-point improvement in matrix leadership skills.[9] That program was not a series of webinars. It was structured around active virtual sessions where learners worked in persistent small groups, completed real-time exercises, and received AI-generated feedback between sessions.
DeVry University moved to Engageli and measured a 7% improvement in pass rates, double the number of A grades, and a 155-basis-point gain in student persistence.[8] Those are not soft engagement metrics. They are the kind of numbers a Provost can present to a board, or a CLO can use to defend a budget.
At the University of Nicosia, poll engagement hit 92%.[11] At Coventry University, participation reached 100% in RSI-compliant sessions.[10] These numbers are measured in production environments, with real learners, at institutions that publish their outcomes.
The pattern across these cases is consistent: the platform provides the structure for active participation, the facilitator uses that structure to run sessions where learners cannot passively observe, and the analytics layer gives both the facilitator and the organization visibility into what actually happened.
The platform is the instrument. The facilitator plays it.
This is not an argument that switching platforms is simple. It is not.
A 2026 report by Class Technologies and Training magazine, based on responses from 545 L&D professionals, found that 98% of organizations now use VILT but only 21% report the highest levels of success. The gap is not about adoption. It is about execution.[3]
Running live virtual classroom training well takes real facilitator skill. It is harder than running a webinar. The facilitator is managing activities, watching participation data, adjusting pacing, and building enough trust that people will speak up in front of strangers on a screen. Organizations that invest in the platform without investing in facilitator development will underperform.
Scheduling is a real friction point for global teams. Live sessions require time-zone coordination and calendar space, and those logistics get harder at scale. The on-demand and asynchronous components of a platform matter here because they extend the learning beyond the live window without losing the active component.
Not every learner has a reliable enough connection to fully participate. Bandwidth and device equity are constraints that do not disappear because the platform is good. Any honest business case for virtual classroom training needs to account for the learners who cannot access it at full fidelity.
These costs are real. They are still smaller than the cost of flying 500 people to a conference center four times a year. And they are dramatically smaller than the cost of running passive virtual training that produces no measurable behavior change and calling that line item money well spent.
The data on that is clear. A meta-analysis of 225 studies shows active learning reduces failure rates and raises exam scores.[2] Engageli’s own research shows 54% higher test scores, 13x more learner talk time, and 16x higher non-verbal engagement in active virtual sessions.[5] Named institutions running real programs — DeVry, Coventry, University of Nicosia, People Untapped — are measuring the difference in pass rates, persistence, and skill lift.
The question is how long your organization can afford to keep running the alternative. According to the 2025 Training Industry Report, companies are spending $874 per learner on training that, in most virtual formats, still struggles with the engagement problem that 72% of L&D professionals identify as their top challenge.[4] Every quarter spent on passive virtual training is a quarter of ramp time that did not improve, quota attainment that did not move, and retention risk that did not get addressed.
Those costs do not appear on the L&D budget. They appear on the P&L, where they are much harder to explain.
Virtual classroom training is not a technology decision. It is a business decision with a specific, calculable return. The organizations that have made it are measuring the difference. The ones that have not are still presenting into silence.
References
[1] Training magazine, Class Technologies, Microsoft (2025). “The Virtual Training Paradox: High Confidence, Low Engagement.” Survey of 661 L&D professionals. businesswire.com
[2] Freeman, S. et al. (2014). “Active learning increases student performance in science, engineering, and mathematics.” PNAS, 111(23), 8410–8415. pnas.org
[3] Class Technologies, Training magazine (2026). “The State of Live Virtual Training in 2026.” Survey of 545 L&D professionals. class.com
[4] Training magazine (2025). “2025 Training Industry Report.” trainingmag.com
[5] Engageli internal research: Teaching & Learning Research (54% test score improvement, 13x talk time, 16x non-verbal engagement; 85–90% retention with AI-reinforced follow-up).
[6] Engageli TCO white paper (cost per learner ratios, session scaling data).
[7] Engageli internal data, corporate deployments (62.7% vs. 5% participation rates).
[8] DeVry University case study (+7% pass rate, 2x A grades, +155bp persistence).
[9] People Untapped case study (+21 points matrix leadership, 1,200+ learners).
[10] Coventry University case study (100% participation).
[11] University of Nicosia case study (92% poll engagement).