• AI Isn’t the Problem. Reckless Publishing Practices Are

    The knee-jerk adoption of AI across the publishing industry is quickly revealing a troubling absence of principles, guardrails, and accountability. While AI promises powerful capabilities—automated content creation, personalised recommendations, and streamlined workflows—the way it’s being deployed often feels less like innovation and more like a freefall. And the looming crash will spare no sector, least of all education and publishing, where trust is paramount.

    But let’s be clear: AI is not inherently the villain here. The real issue lies in how organisations are integrating it into their workflows, or more accurately, failing to integrate it responsibly. What we’re seeing is innovation on autopilot, where decisions are made with little oversight, bias creeps in unnoticed, and accountability vanishes into the ether. This isn’t a technology problem; it’s a leadership failure, and it’s symptomatic of deeper systemic flaws in how industries approach governance in the digital era.

    The Publishing Industry’s AI Gold Rush

    The publishing sector, particularly in educational contexts, has been quick to embrace AI solutions. From algorithmically generated textbooks to adaptive learning platforms that promise customised experiences for every student, the narrative of AI in publishing is one of unbridled efficiency and scalability. But dig beneath the surface, and you’ll find a precarious lack of oversight. Where are the frameworks to ensure these systems don’t reinforce existing inequities? Who’s auditing the data sets that underpin AI algorithms for bias? And perhaps most importantly, who’s taking responsibility when things inevitably go wrong?

    Consider the implications for educational publishing, where content created or curated by AI could shape the knowledge and perspectives of millions of students. If bias slips into these systems—whether through unexamined training data or flawed algorithms—it won’t just skew a single textbook; it could warp the foundational education of an entire generation. Yet, few publishers appear to be addressing these risks head-on. Instead, development teams are left to roll out AI initiatives without meaningful guidance or the expertise to evaluate ethical concerns. The result is a Wild West approach to innovation, where the speed of adoption outpaces the establishment of safeguards.

    Governance Isn’t a Luxury—It’s a Necessity

    Effective governance isn’t about stifling innovation; it’s about enabling organisations to innovate responsibly. Yet, many publishing and education companies treat governance as a bureaucratic hurdle rather than a fundamental necessity. This mindset is shortsighted at best and reckless at worst. Without clear policies, training, and accountability structures, organisations normalise chaos and risk eroding the very trust they depend on.

    The absence of governance is particularly glaring in the realm of AI ethics. For instance, who determines whether content generated by AI is accurate, unbiased, and appropriate for its intended audience? Such decisions require more than technical expertise; they demand leadership capable of setting clear standards and holding teams accountable. Yet, leadership in these sectors often seems content to delegate responsibility to the technology itself—a dangerous abdication that leaves critical ethical questions unanswered.

    And let’s not forget the data. AI systems are only as good as the information fed into them, yet data governance remains a blind spot for many organisations. Educational publishers, in particular, are custodians of sensitive student data. How is this data being protected? How is it being used? And are students and educators even aware of how their information is being leveraged? Without rigorous standards for data privacy and security, the risk of exploitation or breach grows exponentially—a risk that could have devastating consequences for both individuals and institutions.

    The Cost of Inaction

    The longer organisations avoid setting principles for AI use, the more they risk normalising a culture of irresponsibility. And the consequences won’t just be financial. If publishers and education companies fail to build trust in their use of AI, they may find their credibility irreparably damaged. Students, educators, and readers will not forgive systems that fail them, nor will they tolerate organisations that prioritise efficiency over ethics.

    Moreover, regulatory scrutiny is inevitable. Governments worldwide are beginning to recognise the need for oversight in AI applications, particularly in sensitive industries like education and publishing. Companies that fail to proactively establish governance frameworks may find themselves playing catch-up when new regulations come into force. Worse, they may face public backlash if their AI systems are exposed as harmful or exploitative.

    Leadership Before Tooling

    Responsible AI use doesn’t start with technology—it starts with leadership. The organisations that will thrive in this new era are those willing to ask hard questions and make principled decisions before deploying new tools. What are the ethical implications of this technology? How will it impact the people who rely on us? Are we prepared to take responsibility for its failures? These are not questions for developers alone; they are questions for leaders at every level of the organisation.

    For publishing and education companies, this is an opportunity to lead, not just react. By establishing robust governance frameworks, training teams in ethical AI practices, and fostering a culture of accountability, they can turn AI into a tool for building trust rather than eroding it. The question is whether they’ll seize that opportunity—or let it pass them by.

    The publishing industry has long been a custodian of knowledge and culture. If it wants to maintain that role in the age of AI, it must do so with principles firmly in place. Because innovation without ethics isn’t progress—it’s peril. And the consequences of getting this wrong are too great to ignore.

  • Challenges and Implications of EdTech Growth in Australia

    EdTech’s Explosive Growth: Are We Building Solutions or Selling Slogans?

    The Australian EdTech market is surging, with forecasts placing its value at USD 3.9 billion and climbing. Platforms are scaling, investment is flowing, and the sector is awash with buzzwords like “personalisation” and “adaptive learning.” On paper, the ecosystem seems primed to revolutionise education. But beneath the glossy product sheets and investor optimism lies a critical question: who is this technology actually serving?

    This surge exposes a fundamental tension in EdTech. While 60% of Australian educators reportedly prioritise personalisation, the term itself has become a catch-all marketing pitch for features that often fail to address the realities of everyday teaching. In practice, “personalisation” frequently translates into algorithm-driven dashboards and automated assessments—tools that may look impressive in demos but struggle to deliver meaningful support in the chaos of a real school day.

    This disconnect isn’t new. It’s a feature of the EdTech landscape, not a bug. For decades, tech vendors have touted solutions that promise to “save teachers time” while simultaneously offloading administrative burden onto educators in the form of data entry, troubleshooting, and content management. What’s often missing is an honest discussion about whether these tools genuinely reduce cognitive load or simply add to it under the guise of innovation.

    The Hidden Costs of “Solutions”

    The core issue here isn’t just poor interface design or a lack of understanding about the rhythm of a school day (though those are critical flaws). The problem is structural. EdTech companies are incentivised to sell features that align with funding priorities—buzzwords like “AI-driven personalisation” or “adaptive learning pathways”—rather than tools that address the less glamorous, but far more impactful, needs of teachers and students.

    For example, consider the vast investment in adaptive learning systems. These platforms often promise to tailor content to individual learners based on algorithmic assessments. But what happens when the “personalised” pathways don’t align with curriculum requirements or the teacher’s broader learning objectives? What happens when a glitch compromises the integrity of the data—or worse, when a lack of transparency means educators can’t even verify how those pathways are being calculated? These systems often shift responsibility for learning outcomes away from educators and toward opaque algorithms, which raises serious questions about accountability.

    Moreover, the promise of saving time is undermined by the reality of implementation. Teachers are often expected to integrate these tools into their workflows with minimal support, leading to what can only be described as a second job: troubleshooting technology. This is particularly acute in under-resourced schools, where the gap between the promise and the reality of EdTech becomes even starker.

    Power Dynamics and Vendor Priorities

    The EdTech market’s growth isn’t just a story of innovation; it’s also a story of consolidation. As platforms scale, the power dynamics between vendors and institutions shift. Schools and universities increasingly find themselves locked into proprietary ecosystems that dictate not only how learning is delivered but also how data is managed. And herein lies another critical blind spot: the privacy and security implications of this trend.

    When platforms claim to “personalise” learning, what they’re really doing is collecting vast amounts of data—on students, teachers, and even institutional practices. That collection is pitched as a pathway to better outcomes, but the data is just as likely to be used for purposes that fall outside the scope of education, such as marketing or behavioural profiling. Australian regulators have been slow to catch up with these practices, leaving schools vulnerable to both breaches and exploitation.

    It’s worth asking: how many educators truly understand where their data—let alone their students’ data—is going? Are schools equipped to audit their EdTech vendors for compliance with Australian privacy laws? Or are they simply taking these companies at their word, trusting that the promises of “security by design” and “GDPR compliance” mean something more than a checkbox on a sales pitch?

    What Should EdTech Be Asking?

    To be clear, this isn’t an argument against technology in education. Done well, EdTech can transform classrooms, empower educators, and open up new pathways for students. But the industry needs to start asking better questions—not just about what teachers need today, but about the long-term implications of its tools.

    Here are a few questions that vendors and institutions alike should be asking:

    Who benefits most from this technology? Is the primary beneficiary the teacher, the student, or the vendor’s bottom line?

    What happens when the technology fails? Does the classroom grind to a halt, or are there robust contingencies in place?

    How transparent is the data pipeline? Can educators easily understand and audit what the platform is doing with their information?

    What’s the real cost of integration? Beyond the upfront price, what are the hidden costs in terms of training, maintenance, and long-term vendor lock-in?

    The Risk of Noise Without Insight

    EdTech’s promise has always been to make learning more accessible, engaging, and effective. But if the industry continues to prioritise sales pitches over substance, it risks becoming little more than noise—a cacophony of dashboards and adaptive pathways that look impressive but fail to resonate where it matters most: in the classroom.

    Educators are right to demand more. They’re right to challenge platforms to prove their worth—not just in terms of feature lists, but in terms of outcomes, usability, and trust. And they’re right to insist that technology should serve the realities of teaching, not the ambitions of venture capitalists. Until EdTech truly centres teachers and students in its design and strategy, its growth will remain a hollow victory.

  • Systemic Failures in Accessibility in EdTech and Digital Publishing

    Accessibility Is Not a Checkbox but a Test of Integrity

    The statistic that 94.8% of top websites fail basic accessibility standards is not just sobering—it’s a mirror held up to an industry that has repeatedly prioritised convenience and aesthetics over equity and usability. For education technology and digital publishing, this failure is more than just an oversight; it’s a systemic exclusion that undermines the very foundations of their mission. If education is meant to empower, then the tools and platforms designed to deliver it should not be shutting out the very people they claim to serve.

    The Accessibility Illusion: Marketing vs Reality

    Accessibility is often reduced to a checkbox—a compliance exercise performed to pass audits or avoid legal liability. Vendors tout accessibility as a feature, but in reality, it’s rarely integrated into the DNA of their products. Missing alt text, low-contrast interfaces, and unlabelled forms are not just design flaws; they’re barriers that signal to users with disabilities that their needs are peripheral, if considered at all.

    What’s troubling is the industry-wide tendency to treat accessibility as an afterthought. The promises made during product demos and marketing campaigns often crumble under the weight of real-world use. A platform might claim to meet WCAG (Web Content Accessibility Guidelines), but those claims frequently unravel when actual students, educators, and researchers attempt to navigate its labyrinth of poorly designed interfaces and inaccessible content.

    The Stakes Are Higher in Education

    In the context of education, these failings take on a heightened urgency. For students, inaccessibility isn’t merely frustrating—it’s a barrier to participation, learning, and ultimately, opportunity. When an educational platform excludes students with disabilities, it’s not just a technical oversight; it’s a betrayal of the promise of equity.

    Consider the ripple effects. When students can’t access their course materials or interact with their learning environments, the implications go far beyond their individual experience. Educators are forced to spend time creating workarounds, institutions face reputational damage and potential legal challenges, and the digital divide grows wider. And let’s not forget the emotional toll on students who are repeatedly confronted with systems that implicitly tell them they don’t belong.

    The Myth That Accessibility Is a Burden

    The recurring argument against prioritising accessibility is that it’s resource-intensive. But this assumes that accessibility is additive rather than foundational—a false dichotomy that has been debunked time and again. Inclusive design doesn’t just benefit those with disabilities; it improves usability for everyone. High-contrast text helps users in bright sunlight; descriptive alt text enhances content comprehension for all; well-labelled forms streamline navigation for every user.
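    The contrast claim, in particular, rests on a precise, testable standard: WCAG 2.x defines contrast as a ratio of relative luminances, and AA conformance requires at least 4.5:1 for body text. A minimal sketch of that calculation (the formula is taken from the WCAG definition; the function names and example colours are illustrative):

```python
def relative_luminance(rgb):
    """WCAG 2.x relative luminance for an sRGB colour given as 0-255 ints."""
    def channel(c):
        c = c / 255
        # Linearise the sRGB channel value per the WCAG definition.
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Contrast ratio between two colours; WCAG AA requires >= 4.5 for body text."""
    lighter, darker = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

# Black on white is the maximum possible ratio, 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
# Light grey on white fails AA for body text.
print(contrast_ratio((170, 170, 170), (255, 255, 255)) >= 4.5)  # False
```

    A check like this costs a designer seconds, which is rather the point: the “resource-intensive” framing rarely survives contact with how small most fixes actually are.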

    Accessibility upgrades are not burdens; they’re investments in better design. And yet, many organisations still treat them as optional line items to be deferred or ignored altogether.

    The Business Implications of Exclusion

    From a business perspective, failing to prioritise accessibility is shortsighted. The global disability market represents a significant segment of consumers, not to mention the growing body of regulations mandating accessible digital experiences. Inaccessible platforms risk alienating users, exposing organisations to legal action, and damaging their brand reputations.

    In education technology and publishing, the stakes are even higher. These sectors operate in spaces where public scrutiny is intense, and the ethical imperative to serve all users—not just the able-bodied majority—is non-negotiable. Institutions and vendors that fail to address accessibility risk losing the trust of their stakeholders, including students, parents, educators, and policymakers.

    Where Do We Go From Here?

    The question isn’t whether accessibility should be prioritised—it’s how organisations can make it an integral part of their operations. And the answer isn’t always sweeping reforms; sometimes, the smallest upgrades can make the biggest difference. Adding alt text to images, improving colour contrast, or properly labelling forms are steps that don’t require massive budgets or extended timelines. What they do require, however, is intention.
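    Several of these small upgrades can even be checked mechanically. Below is a minimal sketch, using only Python’s standard library, of an audit that flags two of the failures mentioned above; a real audit would use a dedicated tool such as axe or WAVE, and this simplified version does not account for legitimately empty alt text on decorative images:

```python
from html.parser import HTMLParser

class AccessibilityAudit(HTMLParser):
    """Flags two common WCAG failures: images with no alt attribute
    and form inputs without an associated <label>."""
    def __init__(self):
        super().__init__()
        self.issues = []
        self.labelled_ids = set()
        self.input_ids = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "alt" not in attrs:
            self.issues.append("img missing alt text")
        if tag == "label" and "for" in attrs:
            self.labelled_ids.add(attrs["for"])
        if tag == "input" and attrs.get("type") not in ("hidden", "submit"):
            self.input_ids.append(attrs.get("id"))

    def report(self):
        # Return a fresh list so repeated calls stay consistent.
        issues = list(self.issues)
        for input_id in self.input_ids:
            if input_id not in self.labelled_ids:
                issues.append(f"input without label (id={input_id})")
        return issues

audit = AccessibilityAudit()
audit.feed('<img src="cover.jpg"><label for="q">Search</label>'
           '<input id="q" type="text"><input id="email" type="text">')
print(audit.report())  # ['img missing alt text', 'input without label (id=email)']
```

    The point is not that a twenty-line script replaces an accessibility practice; it is that the most common failures are so mechanical that there is no excuse for shipping them.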

    Institutions and organisations should also be asking harder questions of their vendors: Are accessibility claims backed by third-party audits? Is accessibility considered in every stage of product design, or is it patched in after the fact? What mechanisms exist for users to report accessibility barriers, and are those reports acted upon promptly?

    The Ethical Imperative

    Ultimately, accessibility is not just about compliance or avoiding lawsuits—it’s about integrity. It’s a measure of whether organisations truly value inclusivity in their actions, not just their words. For education technology and publishing, it’s a test of whether they can live up to their stated missions of broadening access and fostering learning for all.

    If the industry continues to treat accessibility as a checkbox, it will fail not only its users but also its own potential. However, if organisations embrace accessibility as a central pillar of their design philosophy, they won’t just improve user experience—they’ll begin to bridge the very divides they claim to address.

    The real question isn’t “What’s the smallest accessibility upgrade your team can make today?” It’s “Why hasn’t this been prioritised already?” Until accessibility is treated as a fundamental requirement rather than an optional extra, the statistics will remain a warning—and a damning one at that.

  • Impact of Algorithms on Publishing in the Attention Economy

    The Quiet Decline of the Publisher in the Age of Algorithms

    If you’re a publishing executive, you might want to sit down. Your competition isn’t who you think it is. It’s not the publisher across town or the digital upstart trying to snag your market share. It’s an algorithm, armed with a ring light, a slick editing interface, and a platform that doesn’t care about chapters or page numbers. While you’re debating font sizes and layout grids, your audience is learning from TikTok creators who deliver bite-sized lessons in seconds—or YouTubers who have mastered the art of modular, bingeable content.

    Let’s cut to the chase: publishing is losing relevance in the education market, and it’s not because it’s failing to produce quality content. It’s because it’s failing to adapt to how people consume information now. The attention economy doesn’t reward static formats or slow responsiveness; it rewards immediacy, stickiness, and above all, modularity. And that’s where publishing, as an industry, is faltering.

    The Attention Economy: A Game Publishers Aren’t Playing

    Publishing’s traditional model—linear narratives, static layouts, and rigid intellectual property protections—is fundamentally mismatched with the way modern learners behave. Today’s students don’t wait for a PDF to load or flip through pages to find what they need. They swipe through short-form videos, click through interactive quizzes, and absorb information in fragmented, on-demand bursts. Platforms like TikTok and YouTube thrive because they’ve mastered the art of modularity, breaking down complex topics into digestible, engaging chunks that can be consumed on the fly.

    Publishers, on the other hand, still cling to formats that prioritise the creator’s organisation of information over the learner’s experience of consuming it. A textbook chapter assumes you’ll want to read 30 pages on a single topic in one sitting. A TikTok video assumes you’ll give it 15 seconds. One is built for the classroom; the other is built for the bus ride or the study break. Guess which one wins when attention is scarce?

    EdTech vs. Publishing: A Battle Already Lost?

    EdTech vendors have been quick to capitalise on this shift. Platforms like Kahoot!, Quizlet, and Duolingo have embraced modular content delivery, gamification, and mobile-first design. These companies understand that the competition isn’t other educational tools—it’s the platforms learners are already addicted to. By contrast, publishers seem to think they’re still fighting a war over intellectual property rights or pricing models when the battlefield has moved entirely.

    The failure to adapt isn’t about technology—it’s about mindset. Publishers often tout their expertise, their carefully curated content, and the rigour of their editorial processes as their competitive edge. But none of that matters if learners don’t engage with the material. And engagement isn’t about your credentials; it’s about your ability to deliver content in the formats your audience expects.

    Modularity Isn’t Optional

    The modularity gap isn’t just a design issue; it’s a survival issue. Publishers need to stop thinking in terms of chapters, units, and editions and start thinking in terms of snippets, assets, and APIs. A learner today doesn’t care about the “completeness” of your textbook—they care about whether they can find the exact piece of information they need at the exact moment they need it.

    This shift requires more than just creating a digital version of your print product. It demands a fundamental rethinking of how educational content is structured, delivered, and monetised. And this is where most publishers stumble. Modular content isn’t just about breaking information into smaller chunks; it’s about designing those chunks to be repurposed, recombined, and redistributed across multiple platforms. It’s about meeting learners where they already are—not dragging them to where you think they should be.
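    To make the contrast concrete, here is a hypothetical sketch of what “thinking in assets” might look like: each content unit carries its own metadata (topics, time cost), so any platform can recombine units on demand rather than serving a fixed chapter. The structure and field names are illustrative, not a reference to any real publisher’s schema:

```python
from dataclasses import dataclass, field

@dataclass
class ContentAsset:
    """A self-describing unit of learning content, small enough to be
    repurposed and redistributed independently of any one product."""
    asset_id: str
    title: str
    body: str                      # the actual content, format-agnostic
    topics: list = field(default_factory=list)
    duration_minutes: int = 5      # a learner's time cost, not a page count

def assemble(assets, topic, time_budget):
    """Recombine matching assets into an on-demand sequence for one topic,
    stopping when the learner's time budget would be exceeded."""
    sequence, used = [], 0
    for asset in assets:
        if topic in asset.topics and used + asset.duration_minutes <= time_budget:
            sequence.append(asset)
            used += asset.duration_minutes
    return sequence

library = [
    ContentAsset("a1", "Photosynthesis in 3 minutes", "...", ["biology"], 3),
    ContentAsset("a2", "Cell respiration overview", "...", ["biology"], 4),
    ContentAsset("a3", "Metaphor and simile", "...", ["english"], 3),
]
playlist = assemble(library, "biology", time_budget=10)
print([a.asset_id for a in playlist])  # ['a1', 'a2']
```

    Notice what the textbook model cannot do here: serve a seven-minute biology sequence to one learner and a three-minute one to another from the same underlying assets.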

    Algorithms as Gatekeepers

    Here’s the uncomfortable truth: the platforms are already winning, not because they produce better content, but because they control the algorithms that determine who sees what. For all their talk of innovation, publishers remain beholden to distribution models that are increasingly irrelevant. The middleman isn’t a bookstore anymore; it’s a recommendation engine. And if your content isn’t designed to thrive in that ecosystem, it’s going to be buried—no matter how good it is.

    This creates a troubling dynamic for educators and learners alike. While platforms prioritise engagement metrics that maximise ad revenue, they often neglect educational rigour or accuracy. This leaves publishers with a difficult choice: adapt to the algorithm and risk diluting their standards, or cling to traditional models and risk irrelevance.

    Can Publishing Innovate Its Way Out?

    The million-dollar question is whether publishers can innovate fast enough to stay in the game. Historically, the industry has been slow to adopt new technologies, often waiting until disruption is unavoidable before making changes. But the pace of change in the attention economy doesn’t allow for dithering. Publishers need to invest in modular design, mobile-first interfaces, and AI-driven personalisation—not as an afterthought, but as their primary strategy.

    Yet innovation isn’t just about technology; it’s about culture. Publishers need to shed their legacy mindset—the belief that authority comes from controlling the narrative—and embrace the fact that learners want control over how they access content. This means letting go of rigid formats, proprietary platforms, and outdated assumptions about how people learn.

    The Bottom Line

    The publishing industry is at a crossroads, and the path forward isn’t easy. Competing with algorithms requires more than just digital transformation; it requires a fundamental shift in how publishers think about their role in education. Are they content creators, curators, or facilitators? Are they even relevant in an ecosystem dominated by platforms that value engagement over expertise?

    The uncomfortable answer is that many publishers aren’t even in the same game anymore. Until they stop thinking in chapters and start thinking in snippets, they’ll continue to lose ground—not to other publishers, but to the algorithms that have already captured the attention economy. The question isn’t whether publishing can compete; it’s whether it can adapt before it’s too late.

  • Impact of Legacy Systems in Education and Publishing Technology

    The Real Cost of Legacy Systems in Education and Publishing Technology

    Legacy systems are often portrayed as the stubborn remnants of a bygone era—outdated, cumbersome, and ill-equipped to meet the demands of modern education and publishing industries. But the truth is far more insidious: these systems are not just passive obstacles but active saboteurs. They don’t just hold organisations back; they redirect time, money, and morale into a never-ending maintenance cycle that prioritises survival over strategy. And that’s a much larger problem than most leaders are willing to admit.

    The Quiet Stranglehold of Tech Debt

    It’s tempting to frame the issue as one of technical debt—a manageable backlog that can be addressed with the right funding and project timeline. But legacy systems don’t merely slow innovation; they actively erode the foundations on which it’s supposed to thrive. In education and publishing technology, this is particularly dangerous because these sectors serve as gatekeepers to knowledge, culture, and learning. When the infrastructure underpinning these industries is outdated, the ripple effects are felt not just by internal teams but by students, educators, authors, and readers.

    The most glaring problem isn’t necessarily the cost of maintaining legacy systems, though that’s substantial. It’s the organisational inertia they create. Teams spend their days patching vulnerabilities and creating workarounds instead of solving meaningful problems. Over time, they stop believing things can change, and when belief evaporates, so does ambition. This isn’t just a technical issue; it’s a cultural one. How do you foster innovation when the tools at hand are designed to prevent it?

    The Publishing Sector’s Chronic Dependency

    In publishing technology, legacy systems are often tied to deeply entrenched workflows that date back decades. Whether it’s content management platforms that struggle to integrate with modern APIs or distribution systems that can’t handle the complexity of global digital rights, the sector’s reliance on outdated infrastructure is profound. And while vendors promise shiny new solutions, they often fail to acknowledge the elephant in the room: a vast majority of organisations can’t afford the disruptive overhaul required to replace these legacy systems.

    Instead, you see a patchwork approach: bolt-on features, middleware, and endless duct tape. These temporary fixes may keep the lights on, but they also deepen the dependency on systems that should have been retired years ago. Worse, they reinforce vendor lock-in, as organisations find themselves tethered to proprietary solutions that make migration exponentially harder.

    Education’s Critical Pain Point

    For education technology, the stakes are even higher. Schools and universities are tasked with delivering 21st-century learning experiences, yet many are operating on systems designed for the last century. Student information systems, learning management platforms, and administrative tools often lack the flexibility and interoperability required to adapt to evolving pedagogical needs.

    Beyond the technical constraints, the security implications are chilling. Legacy systems often lack robust encryption protocols, leaving sensitive student data—grades, medical records, behavioural reports—vulnerable to breaches. In an era of escalating cybersecurity threats, this isn’t just an inconvenience; it’s a liability.

    And here’s the kicker: the longer these systems remain in place, the more expensive and complex their eventual replacement becomes. Organisations are essentially digging a deeper hole every year, all while claiming they don’t have the budget to climb out of it.

    The Morale Crisis

    Perhaps the most overlooked consequence of clinging to legacy systems is the toll it takes on morale. Teams tasked with maintaining these platforms often feel trapped in a cycle of diminishing returns. They’re not building; they’re firefighting. And when firefighting becomes the default mode of operation, burnout quickly follows.

    In education, this has a direct impact on the quality of teaching and learning. In publishing, it affects the ability to deliver timely, innovative content to audiences. Across both sectors, it erodes trust—trust in the institution, trust in leadership, and trust in the promise of technology itself.

    Breaking Free

    So, how do organisations escape this quagmire? The answer isn’t as simple as “replace the system.” Transitioning away from legacy infrastructure requires a strategic vision that prioritises long-term growth over short-term cost-cutting. It means asking tough questions:

    • What would our organisation look like if our systems actually supported innovation?
    • How can we ensure new platforms are future-proof and interoperable?
    • What are the security risks we’re quietly accepting, and how can we mitigate them?
    • Are we ready to confront the cultural inertia that has normalised survival mode?

    Most importantly, it requires leadership willing to invest in change—not just financially but philosophically. Because the longer organisations cling to outdated systems, the harder it becomes to imagine a world without them.

    Final Thoughts

    Legacy systems are doing their job—they’re just not working for you. They’re keeping your teams busy, your costs high, and your options limited. In the education and publishing technology sectors, this represents more than a missed opportunity; it’s a failure to meet the needs of learners, readers, and creators.

    Breaking free isn’t easy, but it’s necessary. Because no one wants to do great work on broken foundations—and no one should have to. The question isn’t whether you can afford to replace your legacy systems; it’s whether you can afford not to.

  • When Tech Claims to Solve Burnout but Risks Scaling the Problem

    The Australian teaching profession is in crisis, and the statistics are as grim as the stories they represent. Three-quarters of educators report feeling burned out, nearly half are considering leaving in the next year, and only one in five believes they have the time to do their job properly. These numbers aren’t just a wake-up call—they’re a clear signal that our education system is failing the very people it depends on. Teachers aren’t leaving because they’ve stopped caring; they’re leaving because the system is making it impossible to care without self-destruction.

    Into this maelstrom steps education technology, promising solutions that ostensibly aim to “support” teachers. One widely promoted effort is typical: a company claiming to use AI to alleviate administrative burdens and create “space” for teachers to reconnect with their core mission. It’s a noble pitch, but as with so many edtech promises, the devil lies in the details. And those details reveal a troubling pattern: tools that claim to help educators often end up scaling the very problems they aim to solve.

    Burnout Isn’t Just About Workload—It’s About Power

    Before diving into the tech itself, it’s crucial to understand the root causes of burnout. Yes, hours spent on administrative tasks and juggling multiple roles contribute heavily, but burnout isn’t just a function of workload—it’s a symptom of powerlessness. Teachers are often asked to implement top-down mandates, adopt unproven technologies, and conform to bureaucratic processes that strip them of agency. Adding another layer of AI-driven automation risks exacerbating this dynamic, particularly if teachers aren’t involved in shaping how these tools are implemented.

    What the LinkedIn post fails to address is the deeper systemic issue: the deprofessionalisation of teaching. When technology pitches itself as a saviour, it often frames teachers as passive recipients of solutions rather than active collaborators. This approach risks further disempowering the workforce it claims to support. If AI tools are designed without meaningful input from educators, they’re not solving burnout—they’re reinforcing the conditions that cause it.

    AI as Automation: Help or Hindrance?

    The argument that AI creates “space” for teachers is seductive. In theory, automating repetitive administrative tasks should free up time for educators to focus on teaching. But in practice, the implementation of AI tools often introduces new complexities. Consider the learning curve required to adopt these platforms, the troubleshooting involved when systems inevitably fail, and the additional data entry burdens that arise as schools attempt to interface human workflows with machine logic. Far from reducing workload, many tech solutions simply shift it.

    Then there’s the question of what happens to the data these systems collect. Education technology vendors have a long history of treating schools as data goldmines, monetising student and teacher information under the guise of “personalisation” or “efficiency.” Even if a tool genuinely reduces administrative burdens, it may come with hidden costs in the form of privacy risks or algorithmic bias. Teachers, already stretched thin, are unlikely to have the bandwidth to interrogate these risks adequately. And that’s precisely what vendors are counting on.

    Scaling the Problem vs. Scaling the Solution

    It’s worth interrogating the broader implications of scaling “solutions” that focus solely on efficiency. Efficiency doesn’t fix a broken system; it often entrenches it. If AI tools primarily serve to help teachers manage unreasonable workloads, they’re not addressing the root cause—they’re normalising it. What happens when the systemic issues driving burnout remain unchallenged but are masked by a veneer of technological “progress”? Teachers might temporarily feel relief, but the structural pressures will remain, lurking beneath the surface until the next crisis hits.

    The LinkedIn post asks, “What would education look like if the system worked for teachers instead of wearing them down?” It’s a critical question, but one that AI alone cannot answer. A system that works for teachers requires more than automation—it demands a rethinking of how education is organised, funded, and governed. That means addressing inequities in school budgets, reducing class sizes, and empowering teachers with more autonomy, not less.

    Where Do Institutions Go From Here?

    For decision-makers in schools and educational organisations, the takeaway is clear: treat edtech promises with scepticism, especially when they claim to address burnout. Before adopting any new platform, ask hard questions about its implementation, data practices, and long-term impact on teacher agency. Insist on transparency from vendors—not just about what their tools do, but about what they don’t do. And most importantly, centre teachers in the decision-making process. If they’re not genuinely involved, the solution isn’t a solution at all.

    If the current trajectory continues—where tools are introduced to patch the symptoms of burnout rather than addressing its root causes—the education sector risks deepening the crisis rather than alleviating it. Technology can be part of the solution, but only if it’s deployed thoughtfully, transparently, and in collaboration with the people it’s meant to serve.

    Because if we keep scaling the problem, we’ll eventually run out of teachers willing to keep caring. And no AI tool can replace that.

  • Challenges of AI Integration in Education and Publishing

    AI in Education and Publishing: The Real Roadblock Isn’t the Technology, It’s the Workflow

    The breathless optimism around AI in education and publishing seems to have hit an inflection point. Not because the technology has plateaued—it hasn’t—but because organisations have failed to adapt their workflows to truly integrate AI. The promise was irresistible: AI could revolutionise how we teach, learn, create, and publish. Yet, for most institutions, the actual implementation appears more like a clunky add-on than a transformative leap forward.

    This isn’t a failure of artificial intelligence itself but a failure of the systems in which it’s deployed. And it’s a familiar story. From the early days of the internet in classrooms to the rise of digital platforms in publishing, the pattern repeats: technology is introduced with grand visions, but the surrounding infrastructure remains stubbornly rooted in outdated habits. The result is predictable inefficiency—a lot of noise with very little signal.

    The “Bolt-On” Syndrome: Treating AI as a Tool, Not a Transformation

    The fundamental issue is that many organisations treat AI as a tool rather than a structural shift. It’s bolted onto existing workflows in the hope that it will magically improve outcomes. Need to streamline content creation? Just add AI-powered tools to your editorial process. Want to personalise learning? Plug in an adaptive learning algorithm without rethinking lesson plans or assessment frameworks.

    But this approach is flawed. AI isn’t just another piece of software; it fundamentally changes the dynamics of decision-making, production, and engagement. Without reengineering workflows to accommodate this shift, AI’s potential remains untapped. Worse, the mismatch creates operational chaos. Teams find themselves juggling legacy systems and AI platforms that don’t play well together, leading to inefficiencies that negate any gains the technology might have offered.

    The Teams That Are Moving Forward

    The organisations that are successfully navigating this transition aren’t treating AI as an afterthought—they’re baking it into their systems from the ground up. This requires a willingness to rethink not just workflows but also organisational culture. What happens when AI becomes the core of your operations, rather than a peripheral enhancement?

    Take publishing, for example. If AI is central to content creation, editorial teams may need to shift their focus from writing to curating and validating machine-generated outputs. This isn’t a minor adjustment; it’s a complete overhaul of how work is done. Similarly, in education, AI-driven personalised learning platforms demand an entirely new approach to curriculum design—one that anticipates constant feedback loops and adapts teaching methods accordingly.

    The Structural Challenges of AI Integration

    The reality is that rebuilding workflows for AI integration isn’t just a technical challenge; it’s a deeply structural one. Institutions need to confront several hard truths:

    Data Dependency: AI thrives on data, but many organisations have chaotic data practices. Whether it’s student records in education or market analytics in publishing, the lack of clean, organised data hampers AI’s effectiveness. Worse still, the rush to implement AI can lead to risky shortcuts, compromising privacy and security.
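The "clean, organised data" point above is concrete enough to sketch. A minimal illustration (all field names hypothetical, not drawn from any real system) of the kind of data-quality audit that should precede any AI pipeline: before a model ever sees institutional records, check for the gaps and duplicate identifiers that quietly degrade its outputs.

```python
# A minimal data-quality audit sketch. Field names are hypothetical;
# the point is the habit, not the schema.

REQUIRED_FIELDS = {"student_id", "year_level", "enrolment_date"}

def audit_records(records):
    """Return counts of basic quality problems in a list of record dicts."""
    issues = {"missing_fields": 0, "duplicate_ids": 0}
    seen_ids = set()
    for record in records:
        # Flag records missing any required field (or holding None).
        if any(record.get(field) is None for field in REQUIRED_FIELDS):
            issues["missing_fields"] += 1
        # Flag repeated identifiers, a common artefact of merged legacy systems.
        sid = record.get("student_id")
        if sid is not None and sid in seen_ids:
            issues["duplicate_ids"] += 1
        seen_ids.add(sid)
    return issues

records = [
    {"student_id": "S001", "year_level": 7, "enrolment_date": "2024-01-29"},
    {"student_id": "S002", "year_level": None, "enrolment_date": "2024-01-29"},
    {"student_id": "S001", "year_level": 8, "enrolment_date": "2023-01-30"},
]
print(audit_records(records))  # → {'missing_fields': 1, 'duplicate_ids': 1}
```

Trivial as it looks, most organisations rushing into AI skip even this step, and every downstream recommendation inherits the mess.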

    Human Resistance: Change management is often overlooked in discussions about AI. People are creatures of habit, and asking teams to fundamentally alter their workflows invites resistance. Without investing in training and cultural change, even the most sophisticated AI systems will fail to gain traction.

    Vendor Lock-In: The AI landscape is dominated by a handful of major players who offer enticing "end-to-end solutions." But these platforms often come with strings attached—data ownership, interoperability restrictions, and long-term costs. Institutions need to carefully evaluate whether they’re designing their workflows around the technology or around the vendor’s business model.

    Regulatory Lag: Governments and regulators are still playing catch-up when it comes to AI in education and publishing. Clear policies on data privacy, algorithmic bias, and accountability are sorely needed. In the absence of regulation, institutions risk making costly mistakes that could harm their reputations and, more importantly, their stakeholders.

    The Long-Term Implications

    If the current trends continue—if AI remains a bolt-on rather than a baked-in component—we risk stagnation in sectors that desperately need innovation. Education won’t see the personalised learning revolution that was promised; instead, it will grapple with fragmented systems that alienate educators and students alike. Publishing won’t experience the creative renaissance enabled by AI; it will struggle under the weight of half-integrated solutions that inhibit agility.

    But if institutions can take the harder path—rebuilding workflows, investing in data infrastructure, and fostering cultural change—the potential is transformative. AI could enable education systems that adapt to each learner’s needs in real time. It could empower publishing teams to scale creativity without sacrificing quality. The key is not to ask, “How can AI improve what we’re already doing?” but rather, “What would our system look like if AI were at its core?”

    The Questions Institutions Should Be Asking

    For decision-makers, the imperative is clear: stop treating AI as a silver bullet and start asking the hard questions about workflow design and structural integration. Specifically:

    • What data practices need to change to make AI truly effective?
    • How will AI reshape roles and responsibilities within teams?
    • Are we designing workflows around a technology or around our institutional goals?
    • What risks—privacy, security, bias—are we unintentionally introducing?
    • Are we building systems that can adapt as AI continues to evolve?

    The Bottom Line

    AI hasn’t plateaued, but many workflows have. Until institutions confront the systemic issues that prevent meaningful integration, the promise of AI will remain just that—a promise. It’s time to stop plugging in AI and hoping for the best. Instead, we need to rethink the systems in which it operates, ensuring that the technology isn’t just supported but structured to succeed. Because in the end, AI isn’t the roadblock—it’s the roadmap.

  • Power Dynamics in EdTech and Teacher Support

    EdTech’s Simplistic Narratives: The Power Dynamics Behind “Helping Teachers”

    Syllabyte’s pitch, like many others in the education technology sector, is built upon a familiar narrative: a teacher turned entrepreneur identifies classroom pain points, then launches a product aimed at solving them. It’s a story that resonates because it’s rooted in lived experience, but it’s also one that often oversimplifies the deeper systemic issues at play in education and publishing. The question isn’t whether such solutions are helpful—they often are—but what happens when these tools become integral to the infrastructure of teaching and learning.

    The founder’s post highlights the perennial struggle educators face in sourcing “proper content” for classrooms. It’s a valid concern. Teachers spend countless hours curating resources, adapting materials, and trying to meet diverse student needs within rigid curriculum frameworks. But to frame this as a supply-chain problem solvable through streamlined content platforms misses the broader systemic forces shaping this issue. Why are teachers so often left to fend for themselves in the first place? Why hasn’t the publishing industry, or indeed education systems themselves, adequately addressed the gaps? These aren’t rhetorical questions—they point to deeper problems that a single edtech product cannot fix.

    The Consolidation of Power in the Content Ecosystem

    One of the most pressing concerns here is the growing centralisation of educational content under tech platforms. Tools like Syllabyte inevitably centralise decision-making about what constitutes “proper content.” This isn’t inherently wrong; curation can be a valuable service. But it’s worth questioning what happens when the gatekeepers of “proper content” are private vendors rather than educational professionals or institutions. Who decides which publishers get a seat at the table? What happens to niche or dissenting perspectives that don’t fit neatly into the platform’s algorithmic recommendations?

    This trend mirrors what we’ve seen in other industries—streaming services in entertainment, for instance, where content diversity often suffers under the weight of platform consolidation. In education, the stakes are higher. Content isn’t just entertainment; it shapes minds, values, and worldviews. When the decision-making power shifts from educators to tech vendors, the risks of bias, exclusion, and cultural homogenisation grow exponentially.

    Marketing Promises vs. Classroom Realities

    Edtech solutions often enter the market with lofty promises: less work for teachers, more engaging materials for students, and better outcomes for all. But the reality of classroom implementation is rarely so straightforward. Teachers are already overburdened by administrative tasks, underfunded resources, and the pressure to meet standardised testing requirements. Adding another platform to manage—no matter how user-friendly—often exacerbates these burdens rather than alleviating them.

    Furthermore, the assumption that digital tools automatically improve teaching and learning is flawed. Digital content can be as ineffective as its analogue counterparts if it fails to engage students meaningfully or align with the curriculum. The promise of “making it easier” for teachers and publishers glosses over the complexities of pedagogy, the nuances of student engagement, and the critical role of teacher autonomy. What educators need isn’t just easier access to content but tools that respect their professional expertise and adapt to the messy realities of classroom life.

    Data Privacy: The Silent Cost of Convenience

    What’s notably absent from the founder’s pitch—and indeed, from much of the edtech discourse—is any mention of data privacy and security. Platforms that connect educators and publishers inevitably collect vast amounts of data: user behaviour, content preferences, classroom demographics, and more. This data is a goldmine for vendors but a potential minefield for schools and educators. Are these platforms transparent about how data is stored, shared, and monetised? Are teachers and students being adequately protected from breaches or misuse?

    The edtech industry has a long history of treating data as an afterthought, focusing on convenience and growth over security. Yet, as schools increasingly rely on digital tools, the consequences of poor data practices become harder to ignore. A breach doesn’t just compromise individual privacy—it undermines trust in the very systems that are meant to support education. Institutions adopting platforms like Syllabyte should be asking hard questions about privacy policies, encryption standards, and compliance with regulations like Australia’s Privacy Act.

    The Bigger Picture: What Happens Next?

    If tools like Syllabyte succeed in their mission to streamline content access, what happens next? Will publishers and platforms become the de facto arbiters of educational materials, sidelining educators in the process? Will schools become overly reliant on third-party vendors for critical resources, leaving them vulnerable to price hikes, service disruptions, or corporate buyouts? And what happens to the diversity of educational content when market dynamics inevitably favour the largest, most profitable publishers?

    These are not idle concerns. The edtech sector has a history of favouring scale over substance, often prioritising growth metrics over meaningful educational outcomes. Institutions need to think critically about the long-term implications of embedding such tools into their systems. Are they solving immediate pain points at the expense of creating deeper dependencies? Are they trading short-term convenience for long-term vulnerability?

    Supporting Educators Beyond Tech Solutions

    The founder’s call to “support the system” is admirable, but it’s worth asking what true support looks like. Is it about building more tech tools, or is it about addressing the systemic issues that make those tools necessary in the first place? Teachers don’t just need easier access to content; they need professional respect, adequate funding, and policies that prioritise education over corporate interests.

    Perhaps the real innovation lies not in creating another platform but in advocating for structural change—better funding models for schools, more equitable access to resources, and a stronger commitment to teacher autonomy. Until these deeper issues are addressed, the promises of edtech will remain, at best, a partial solution to a much larger problem.

  • Challenges of Traditional Publishing Processes

    The Tyranny of Process: Why “The Way Things Are Done” Holds Publishing Back

    In the publishing world, the mantra of “fall in love with why things are done, not how they’re done” resonates deeply—and for good reason. The industry has spent decades meticulously refining processes and workflows, often to the point of fetishisation. But as digital transformation continues to reshape the sector, clinging to established methods has become less a sign of diligence and more a dangerous form of inertia.

    It’s not hard to see why this happens. Publishing has long been a bastion of tradition, where the “right way” often meant the tried-and-true way. Contracts were inked on paper. Book launches followed predictable timelines. Marketing campaigns were designed around brick-and-mortar retail cycles. Even as digital tools infiltrated the ecosystem, they were often shoehorned into these legacy workflows rather than used to fundamentally rethink them.

    And herein lies the problem: defending the process for its own sake is the fastest way to make a business irrelevant. In an era defined by rapid technological shifts, “how things are done” matters far less than “why they’re done”—yet many organisations still fail to grasp this distinction.

    The Process Trap

    What’s most insidious about clinging to outdated methods is that it’s rarely framed as resistance. “That’s how we’ve always done it” is often couched in terms of quality control, risk management, or preserving best practices. But the reality is that this mindset stifles innovation, punishes adaptability, and perpetuates inefficiency.

    Consider the rise of self-publishing platforms and subscription-based models like Kindle Unlimited, which have rewritten the rules of distribution and monetisation. Traditional publishers have spent years trying to retrofit these models into their existing workflows, often with mixed results. Why? Because instead of asking, “Why do readers choose these platforms?” they’ve focused on how to replicate the mechanics of success without challenging their own assumptions.

    The same phenomenon plays out in education publishing, where legacy systems built around print distribution are struggling to adapt to the demands of digital-first classrooms. Here, the tyranny of process is particularly damaging. Schools and educators increasingly need flexible, personalised tools that can integrate seamlessly with their broader edtech ecosystems. Yet many publishers continue to churn out static, PDF-like “digital textbooks” that are little more than glorified replicas of their print counterparts.

    When process becomes dogma, purpose gets lost in the shuffle.

    Adaptability Is the Only Constant

    The advice to “fall in love with why things are done” is ultimately a call for adaptability—and adaptability is the publishing industry’s Achilles heel. This is an industry that historically thrived on gatekeeping: gatekeeping authors, gatekeeping access, gatekeeping formats. But as barriers to entry collapse thanks to technology, the only way forward is to shift from gatekeeping to enabling.

    This is easier said than done. Enabling requires humility—the willingness to admit that the methods we’ve spent decades perfecting might no longer work. It requires asking uncomfortable questions: Who benefits from this process? Who is excluded? Does this workflow serve the reader, or does it merely serve the publisher?

    More importantly, enabling requires a radical focus on purpose. If the purpose of publishing is to connect creators with audiences, then the industry needs to build workflows that prioritise connection over control. That might mean rethinking how intellectual property is licensed, how content is packaged, or how platforms are monetised.

    The Cost of Doing Nothing

    The risk of ignoring this shift is existential. For proof, look no further than industries that have already been disrupted. The recorded-music industry clung to physical distribution until streaming platforms like Spotify made those models irrelevant. Film studios resisted digital-first distribution until Netflix forced their hand. In both cases, the companies that adapted early not only survived but thrived, while those that clung to process found themselves sidelined.

    Publishing is now facing its own Spotify moment. The shift to digital-first consumption, the rise of AI-generated content, and the growing demand for personalised experiences are all converging to force the industry’s hand. But instead of embracing these changes, many publishers remain bogged down by workflows that are built for a world that no longer exists.

    Looking Forward

    The senior editor’s advice—fall in love with the “why”—is more than a platitude. It’s a survival strategy. Organisations that embrace this mindset will be the ones to lead the industry into its next chapter. They’ll recognise that workflows are tools, not sacred artefacts. They’ll invest in technologies that amplify purpose rather than entrench process. And they’ll build cultures that reward questioning rather than compliance.

    For decision-makers in publishing, the question isn’t whether change is coming—it’s whether their organisations will be ready when it arrives. Falling in love with the “why” is one way to ensure they are. But ignoring it? That could be fatal.

  • Over-Engineering in Education Technology

    Opinion: The Cult of Over-Engineering and Its Hidden Costs in Education Technology

    It’s a familiar refrain from engineers and technologists: “We over-engineer because we must.” The logic is rooted in past experience—decades of solving complex problems under constraints where redundancy and robustness were paramount. But this mindset, while admirable in its attention to detail, has become a double-edged sword, particularly in sectors like education technology. Over-engineering isn’t just about excessive complexity; it’s about misplaced priorities and a failure to understand the environments where these solutions will be deployed.

    Take education, for example. Engineers often approach the classroom like a factory floor, imagining a neatly organised system of inputs and outputs. They build software and devices that promise precision, scalability, and efficiency, assuming schools operate with the same predictability as production lines. But education is messy. It’s human-centric, context-laden, and deeply resistant to the kind of standardisation that many engineers crave. Over-engineered solutions often fail to account for this, leaving educators with tools that are cumbersome, opaque, or outright unusable.

    The Hidden Costs of Complexity

    At first glance, over-engineering might seem harmless—a quirk of perfectionism that results in fancy features or bulletproof functionality. But the reality is far more concerning. When tools are unnecessarily complex, they bring hidden costs that ripple through classrooms, institutions, and even the broader education system.

    Usability vs. Complexity
    Over-engineered platforms often alienate their end-users: teachers and students. A “simple” grading tool with layers of analytics, dashboards, and customisation options might look great on paper, but if it takes an hour to set up and another hour to troubleshoot, it’s a net loss for educators already stretched thin. Simplicity isn’t a nice-to-have; it’s a necessity when the primary users are not technologists but everyday humans.

    Security Overhead
    More features often mean more attack surfaces. Over-engineering can introduce unnecessary vulnerabilities, from overly complex data structures to poorly integrated third-party APIs. In education, where sensitive student data is involved, the consequences of a breach are catastrophic—not just for institutions, but for individuals whose privacy has been compromised.

    Distracting from Core Goals
    Education technology should enhance learning. It should help teachers teach and students learn. But when engineers prioritise bells and whistles over functionality, the focus shifts from pedagogy to gadgetry. Features like AI-driven insights or blockchain credentialing sound impressive, but do they actually improve learning outcomes? Too often, the answer is no.

    What Drives the Over-Engineering Epidemic?

    It’s tempting to blame engineers themselves for this trend, but the truth is more complex. Over-engineering in education technology is often spurred by structural issues within the industry:

    Marketing Pressure
    Vendors are under immense pressure to differentiate their products. A simple, streamlined tool doesn’t make a splash at conferences or stand out in sales pitches. So they add features—often unnecessary ones—to make their offerings look more advanced. The result? Tools that prioritise visual appeal over practical utility.

    Procurement Dynamics
    Education institutions often select tools based on feature checklists rather than usability. Vendors respond by building products that tick more boxes, even if those features are rarely used. This creates a vicious cycle where complexity is rewarded, not penalised.

    The Myth of Scalability
    Engineers love scalability—a system that can handle a small classroom today and a nationwide rollout tomorrow. But scalability often comes at the expense of simplicity. A platform optimised for massive deployments may become unwieldy in smaller, more intimate settings, which still make up the majority of educational environments.

    Breaking the Cycle: Simplicity as Strategy

    The solution to over-engineering isn’t just about stripping away unnecessary features; it’s about changing the incentives driving this behaviour. Here’s how:

    Reprioritising Usability
    Vendors should start by asking: What do teachers and students actually need? Usability testing and real-world pilots must become central to the development process, not afterthoughts.

    Regulating Data Practices
    Over-engineered systems often collect more data than necessary under the guise of “analytics.” Governments and institutions need to enforce stricter regulations around data collection, ensuring that tools are designed with privacy in mind.
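"Designed with privacy in mind" can be made concrete as data minimisation: an explicit allow-list of fields a tool is permitted to transmit, with everything else dropped by default rather than collected "just in case". A minimal sketch, with hypothetical field names:

```python
# Data-minimisation sketch: only fields on an explicit allow-list
# ever leave the institution; everything else is stripped by default.

ALLOWED_FIELDS = {"assignment_id", "score", "submitted_at"}

def minimise(record, allowed=frozenset(ALLOWED_FIELDS)):
    """Strip a record down to its approved fields before export."""
    return {k: v for k, v in record.items() if k in allowed}

full_record = {
    "assignment_id": "A17",
    "score": 82,
    "submitted_at": "2024-03-04T10:02:00",
    "student_name": "redacted-at-source",   # never needed by an analytics vendor
    "home_postcode": "redacted-at-source",  # nor this
}
print(minimise(full_record))
```

The design choice worth noting is the default: the burden of proof sits on each field earning a place on the allow-list, not on someone later remembering to remove it.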

    Shifting Procurement Mindsets
    Institutions must move beyond feature-driven purchasing decisions. They should demand proof of effectiveness—does the tool demonstrably improve learning outcomes, reduce teacher workload, or bolster student engagement?

    Learning from Simpler Systems
    Engineers should look to the success of deliberately simple tools, like Google Classroom, or genuinely low-tech ones, like physical whiteboards, as reminders of what truly matters in education. Technology is a means, not an end.

    The Big Picture

    Over-engineering isn’t just an engineering problem; it’s a cultural one. It reflects a disconnect between technologists and the environments they’re designing for. In education technology, the consequences of this misalignment are amplified—cost overruns, wasted time, and tools that fail to deliver on their promises.

    If simplicity is the antidote, then it’s time for vendors, engineers, and institutions alike to embrace it as a guiding principle. The future of education technology shouldn’t be defined by the ambition of the engineers building it, but by the needs of the educators and learners it aims to serve. Anything less is just noise.