Core Matters’ Position on AI: The Good, The Bad, & The Ugly

Artificial intelligence (AI) is everywhere right now, promising speed, efficiency, and a competitive edge. But for many business leaders, the conversation feels polarized: AI is either the future of work or the beginning of the end.

At Core Matters, we don’t believe in hype or fear. We believe in clarity. That’s why we sat down with Ryan Englin, our Founder and CEO, to talk candidly about AI: the good, the bad, and the ugly.

As an employee engagement expert who works daily with companies that hire, onboard, and retain real people, Ryan offers a grounded perspective on where AI truly helps, where it hurts, and how leaders can use it as a tool without losing trust, culture, or common sense.

Here’s the interview:

1. Why did Core Matters decide to take a public position on AI instead of avoiding the conversation like many companies in this space?

Because avoiding the conversation doesn’t protect leaders. It leaves them exposed.

At Core Matters, we’ve always believed our responsibility is to help business owners make better decisions, not just feel comfortable. AI is already impacting hiring, communication, marketing, and leadership whether people like it or not. Pretending it’s not happening doesn’t slow it down. It just creates a bigger gap between the leaders who understand it and the ones getting blindsided by it.

AI is a tool, and like any tool, it can either amplify good leadership or expose bad leadership.

2. When most leaders hear “AI,” they feel either overwhelmed or skeptical. How do you define AI in plain language for ops-heavy business owners?

I keep it simple: AI is an assistant, not a decision-maker.

It’s software that can process information faster than a human and help you draft, organize, analyze, or summarize. But it doesn’t know your people, your company culture, or your standards unless you teach it. It doesn’t replace leadership, but it can certainly help make things easier.

For ops-heavy businesses, that means AI can save time on things like organizing interview notes, summarizing feedback, or cleaning up communication. But it should never be the thing deciding who you hire, how you lead, or what your values are. If you treat it like a replacement for thinking, you’ll get lazy results. If you treat it like an assistant and give it very specific tasks, it can be incredibly helpful.

3. Before we get into the good, bad, and ugly, what’s the biggest misconception you see leaders making about AI right now?

That AI will fix what leadership hasn’t. A lot of leaders are hoping AI will solve their hiring problems, engagement problems, or communication problems. It won’t. It will just make those problems happen faster and at scale.

If you don’t have clear expectations, AI won’t create them.

If your company culture is broken, AI won’t repair it.

If your leaders avoid hard conversations, AI won’t magically make them better communicators.

The truth is, AI doesn’t create clarity. It exposes whether clarity already exists. And for leaders willing to do the work, that’s powerful. For leaders looking for a shortcut, it’s dangerous.

The Good: Where AI Actually Helps

AI is a powerful tool when it’s used with intention. In the right hands, it reduces noise, creates consistency, and gives leaders back time to focus on people instead of paperwork. In this section, Ryan shares where AI genuinely adds value in hiring, onboarding, and employee experience without replacing the human judgment that great leadership requires.

4. Where do you see AI adding real value today in hiring, onboarding, or employee experience?

AI adds the most value anywhere leaders are losing time to busywork instead of leadership.

In hiring, it can help clean up job ads, create better interview questions, summarize candidate notes, and spot gaps in the process. But it should never decide who’s the right fit. Fit is about values, expectations, and effort. That’s people work.

In onboarding, AI can help reinforce clarity. Things like summarizing SOPs, creating checklists, or helping managers prepare for 2/4/12 check-ins. That support helps leaders show up more prepared instead of winging it.

And in employee experience, AI can surface patterns: where communication breaks down, where engagement is slipping, where people are getting stuck. But the moment still belongs to the leader. AI can tell you where to lean in, but it can’t replace how you lead when you do.

5. How can AI help leaders create clarity, not confusion, inside their organizations?

Most confusion inside organizations comes from leaders explaining things ten different ways to ten different people. AI can help standardize messaging, document expectations, and translate what’s in a leader’s head into something repeatable and clear.

But here’s the key: AI doesn’t create clarity on its own. Leaders still have to decide what matters, what good looks like, and what’s non-negotiable. AI just helps communicate it more consistently and reinforce it over time.

If a leader isn’t clear, AI will just scale the confusion faster.

6. How should leaders think about AI when it comes to communication with employees? What should never be automated, and what can be?

My rule is simple: If emotion, trust, or accountability is involved, it should not be automated.

Hard conversations. Performance feedback. Recognition. Conflict. Career discussions. Those moments require presence, tone, and empathy. Automating them tells employees, “You’re not worth my time,” even if that’s not the intent.

Where AI can help is behind the scenes. Drafting messages, organizing thoughts before a conversation, summarizing meetings, or helping leaders prepare what they need to say clearly and respectfully. That’s a huge win.

The Bad: Common Pitfalls & Misuse

Not every AI implementation is progress. When leaders use AI to move faster without fixing broken systems, or to avoid hard leadership work, it can create confusion, erode trust, and amplify existing problems. Here, Ryan breaks down the most common mistakes he sees companies make and why “efficient” doesn’t always mean “effective.”

7. What are the risks of using AI to move faster without first fixing broken systems?

AI doesn’t fix broken systems. It accelerates them.

If your hiring process is inconsistent, AI will make it inconsistently fast. If your onboarding is confusing, AI will scale that confusion across every new hire. Speed without structure doesn’t create progress; it creates chaos.

We see this all the time: leaders adopt AI to “save time,” but what they’re really doing is avoiding the harder work of defining expectations, building processes, and training leaders. AI just makes the cracks show up sooner and often more publicly.

8. How can over-reliance on AI damage trust with candidates or employees, even when intentions are good?

Trust is built when people feel seen, heard, and respected. AI can support that, but it can also quietly undermine it.

Candidates can tell when communication feels automated. Employees can tell when feedback is templated. When people feel like they’re interacting with a system instead of a leader, they start questioning how much they actually matter.

Even with good intentions, overusing AI sends the message: efficiency matters more than you do. And once trust erodes, engagement follows right behind it.

9. What warning signs tell you a company is using AI as a shortcut instead of a strategy?

There are a few giveaways. One is when leaders can’t clearly explain their expectations without AI’s help. If you need a prompt to define what “good performance” looks like, that’s not an AI problem. That’s a leadership problem.

Another sign is when AI-generated content replaces conversations instead of supporting them. Automated feedback, generic recognition, or mass communication with no context or follow-up: that's a shortcut, not a strategy.

And the biggest red flag? When speed becomes the goal instead of outcomes. Strategy asks, “Does this make us better?” Shortcuts ask, “Does this make us faster?” AI magnifies whichever question you’re asking.

The Ugly: What Leaders Aren’t Talking About

Some of the biggest dangers of AI aren’t loud or obvious. Poorly applied AI can quietly weaken culture, outsource accountability, and distance leaders from their teams. In this section, Ryan tackles the uncomfortable truths about AI and the long-term consequences for companies that hand over judgment instead of strengthening leadership.

10. How can poorly implemented AI quietly erode culture, accountability, or leadership confidence?

Poorly implemented AI doesn’t usually blow things up overnight. It slowly dulls the edge of leadership.

When leaders start relying on AI to write messages, give feedback, or explain expectations without truly owning the content, accountability gets blurry. Employees aren’t sure what’s real, what’s automated, and what actually matters. Company culture weakens because it needs human moments: tone, timing, and follow-through, not templates.

Over time, leaders also lose confidence. If you stop practicing hard conversations, you don’t get better at them. If AI becomes your voice, you forget how to use your own. That’s how leadership slowly turns into management-by-software instead of leadership-by-example.

11. What are the long-term consequences for companies that use AI to avoid hard conversations, coaching, or leadership development?

Eventually, those companies lose leaders, and they often don’t realize it at first.

You end up with managers who can forward messages but can’t coach. Systems that look polished on the surface but feel hollow to the people inside them. High performers leave because they’re not being challenged or developed, and low performers stay because no one is willing to address the gaps.

Long-term, the business becomes fragile. When things are easy, it feels fine. When pressure hits, there’s no leadership muscle to handle it. AI didn’t cause that weakness; it just hid it long enough for the damage to compound.

Practical Guidance: Using AI the Right Way

AI isn’t a magic bullet. It’s a tool. The difference between companies that benefit from AI and those that regret it comes down to how intentionally it’s implemented. Ryan shares practical guardrails, decision frameworks, and leadership principles to help businesses use AI responsibly while protecting trust, clarity, and employee experience.

12. If a leader wants to start using AI responsibly, what must be in place before they turn it on?

Before you turn on AI, you need clarity.

Clear expectations. Clear roles. Clear standards. Clear values. If you can’t explain what “great” looks like without a tool helping you, AI will only amplify the confusion.

Leaders also need ownership. AI should never be the reason something was said, decided, or sent. If you’re not willing to stand behind it, don’t automate it. Responsible use starts when leaders see AI as support, not replacement.

13. How should companies decide where AI stops and people start?

The dividing line is responsibility.

AI can handle preparation, organization, and reinforcement. People need to own judgment, relationships, and accountability. If a moment requires trust, emotion, or consequences, it belongs to a person, not a chatbot.

When companies get this right, AI strengthens leadership. When they get it wrong, they accidentally train their people to stop looking to leaders and start looking at a screen. That’s a dangerous shift.

14. If you could give one piece of advice to leaders who want to “AI-proof” their company, what would it be?

Don’t focus on AI. Focus on leadership fundamentals.

A strong company culture is built on clear expectations, consistent follow-through, and real relationships. Those things can’t be automated.

AI-proof companies don’t fear technology because they’re grounded in clarity and trust.

15. If an ops-heavy business owner only remembers one thing from this conversation, what should it be?

AI won’t fix your business, but it will amplify it.

Used well, it gives leaders time, clarity, and consistency. Used poorly, it magnifies every weakness in your systems and leadership. The difference isn’t the tool. It’s the leader behind it.

A Final Thought for Leaders

At the end of the day, AI will either give you more time to lead well or expose the places where clarity, accountability, and trust are missing. That choice is still human.

If this conversation sparked questions about how AI fits into your hiring, onboarding, or employee experience, start there. Get clear. Fix the fundamentals. Then decide where a tool can support the work you’re already committed to doing.

And if you want help thinking through that responsibly, that’s the work we do every day at Core Matters: helping leaders build businesses where people and systems work together.
