EPISODE 518
Eric Cole
Navigating Cyber Threats

The digital age has brought challenges to individuals and organizations that would have been inconceivable just a few years ago. This week on The Unbeatable Mind, Mark Divine talks to Dr. Eric Cole, a former CIA hacker and celebrated authority on cybersecurity. Eric guides listeners through the rapidly changing landscape of cyber threats, digital security, and advances in artificial intelligence. He uncovers the tactics that cybercriminals, sometimes state sponsored, use to target businesses and individuals alike, and explains how these criminals are leveraging new technology not just to steal money but to harvest voices, identities, and trust. In addition, Dr. Cole offers practical safeguards for identifying scam attempts, delves into the importance of using apps from trusted sources, and stresses having out-of-band communication with family. He and Mark also dissect the way AI is challenging us to reckon with a future where machines may outperform humans in decision making and strategy.

Eric Cole
Listen Now
Show Notes

Dr. Cole is an industry-recognized security expert with over 20 years of hands-on experience in information technology, with a focus on helping customers identify the right areas of security and build out dynamic defense solutions that protect organizations from advanced threats. He has a master’s degree in computer science from NYIT and a doctorate from Pace University with a concentration in information security. Dr. Cole is the author of several books, including Advanced Persistent Threat, Hackers Beware, Hiding in Plain Sight, Network Security Bible, and Insider Threat, and has filed over 20 patent applications. A researcher, writer, and speaker, he served on the Commission on Cyber Security for the 44th President and sits on several executive advisory boards, and he was inducted into the 2014 InfoSecurity Hall of Fame. Dr. Cole is the founder of Secure Anchor Consulting, through which he provides state-of-the-art security services and expert witness work, and he previously served as CTO of McAfee and Chief Scientist for Lockheed Martin. He is actively involved with the SANS Technology Institute (STI) and SANS, working with students, teaching, and maintaining and developing courseware as a SANS faculty Fellow and course author. As an executive leader in the industry, he provides cutting-edge cybersecurity consulting services and leads research and development initiatives to advance the state of the art in information systems security.

“Is the value…you get from deploying AI in your business worth the risk of all of your data given to a third party?”

  • Dr. Eric Cole

Key Takeaways: 

  • Cybersecurity As a Global Battle: Recognize how the majority of cybersecurity attacks come from countries with no extradition treaties, some even from government-backed platforms. Without unified laws here and internationally, the threat level will only get bigger.
  • AI Replacing Humans: Learn how AI is no longer just a helpful tool. We’re at a tipping point and companies need strong data segmentation and clear risk postures in order to adapt.
  • The Future of Money: Though Bitcoin and state-backed cryptocurrencies may seem convenient, realize that as hackers become more sophisticated, these currencies carry massive risk.
  • The Wide-Ranging World of Scams: Discover just how sophisticated today’s scams are: voice and AI fraud can target anyone.

Youtube Thumbnail: 

  1. We’re at war on the internet
  2. AI Can’t Secure Stupid
  3. Don’t Click That Link!

Eric’s Links: 

LinkedIn: https://www.linkedin.com/in/ericcole1/ 

Instagram: https://www.instagram.com/drericcole/?hl=en 

Youtube: https://www.youtube.com/c/DrEricCole 

X: https://x.com/drericcole?ref_src=twsrc%5Egoogle%7Ctwcamp%5Eserp%7Ctwgr%5Eauthor 

Sponsors and Promotions:

Momentous:

If you’re interested in making a true investment in your health, why not join the best in human performance and be part of the change in raising the bar on supplements. Just go to livemomentous.com and use code DIVINE for 20% off your new routine today.

Timestamped Overview: 

00:00 “Introduction to The Mark Divine Show”

10:19 App Store Scams: Amazon & FedEx

15:20 Offensive Cyber Ops Policy Dilemma

18:11 Phone Scam Targets Concerned Parents

24:26 AI Replacing Human Interaction?

30:16 Security Risks of Public Data Access

33:40 Defining Risk in Business Decisions

41:40 Human vs. AI Intelligence Evolution

45:18 AI Control Over Humanity’s Future

49:53 AI Rebellion and Trust Issues

56:47 Quantum Computing Threatens Bitcoin Security?

01:00:54 Cyber Threat: North Korea’s Hidden Arsenal

01:05:22 Mutual Grid Destruction Reset

01:10:44 “Securing Cyberspace for Next Decade”

01:16:06 Positive Feedback Exchange

Unknown [00:04:33]:
Welcome to the Mark Divine Show. I’m your host, Mark Divine. If you’re a leader, ready to grow, push your limits, and lead with deeper purpose, then you’re in the right place.

Unknown [00:04:54]:
Good for you. Each episode will challenge you to lead with courage, resilience, and deeper awareness, elevating your impact in every area of life. So let’s dive into today’s conversation and see what’s possible with our guest, Dr. Eric Cole. Dr. Cole’s a former CIA hacker. He’s a leading cybersecurity expert, best-selling author. He spent over 30 years securing our digital world.

Unknown [00:05:17]:
Thank you very much, Sir. From advising US presidents and protecting Fortune 500 companies, Dr. Cole’s on a mission to make cyberspace safer for us all. To get ready for this powerful conversation on digital resilience, leadership, and staying sharp in this world with constant threats. How’s that for an intro?

Eric Cole [00:05:36]:
I like it. I’m going to hire you. You have to give a lot of intros.

Unknown [00:05:43]:
Thanks for coming out.

Eric Cole [00:05:43]:
Oh, my pleasure. Thanks for having me. I always love traveling.

Unknown [00:05:46]:
Yeah, well, it’s a long way from Washington, D.C. but I hope it’s worth it.

Eric Cole [00:05:52]:
Trust me, getting out of Washington is a relief. There’s way too much politics in D.C. these days, so.

Unknown [00:06:00]:
So, dude, I’m. I’m pretty pissed. I got a notice from this guy, said, hey, we’re gonna do. I wanna. I work for Dave Asprey. Right?

Eric Cole [00:06:11]:
You probably. Oh, Dave, yeah. Good guy.

Unknown [00:06:13]:
And I’m friends with Dave. And he recommended you to us, and we want to do. He wants you to do a Facebook Live with him. And so if you’re open to it, let’s get together and go through how to do Facebook Live and make sure we set everything up. And I’m like, I got sucked right into this. So now I’m on this zoom call, and my first sign that something was awkward was the guy who was on the call with me wouldn’t go on video. And his voice sounded like he was coming through a speaker. Right.

Unknown [00:06:49]:
Like it was garbled, maybe being translated. And he had me open up the Meta business suite and going back and forth trying to figure this out. And I thought, 10 minutes into it, I know, it’s a little long for me, right? Navy Seal, Mr. Intuition. I finally get that Spidey sense. And I just like, press, press end. And I’m like. And I’m like, darn it, I just got scammed.

Unknown [00:07:16]:
But I couldn’t figure out what his angle was. And about two weeks later, Joe Sullivan, let’s just say, not his real name.

Eric Cole [00:07:23]:
Yeah.

Unknown [00:07:24]:
You know, is posing as you, saying they want me to do, or they want my client to do, a Facebook Live with you. Is this true? And of course we’re like, no, sorry, you’re being scammed. This is the world we live in. Like, that was super elaborate. I couldn’t even figure out what his angle was. Like, he’s trying to, like, amass followers from these influencers to sell them or what. What’s he doing?

Eric Cole [00:07:54]:
So it’s two things. First is.

Unknown [00:07:56]:
Or they. It’s probably an organization.

Eric Cole [00:07:58]:
A little scary is they wanted to get your voice so they could basically create AI models to mimic you.

Unknown [00:08:05]:
Oh God.

Eric Cole [00:08:06]:
Because if they can get you talking for 10 minutes about very specific terms. So when you’re doing an AI model, there are certain words and phrases that you need to train the model so they can reproduce and act like you. So most likely he was queuing you up on certain questions and positions because now he’s going after your followers. And then most of this is really monetary driven. So.

Unknown [00:08:25]:
So he can make videos that sound like they’re coming from me.

Eric Cole [00:08:28]:
Exactly. And the idea is that he could start charging people going, hey, in order to be on this podcast with Mark, we’re going to go in and charge you $499 or $599. So most of them are monetary driven at some level. And in the beginning they’re just trying to gather data and gather information.

Unknown [00:08:44]:
Wow. Well, that’s crazy, huh? What are, what are some of the other like, really prevalent scams right now? And let’s just start there, you know, like, let’s help people think about what not to do.

Eric Cole [00:09:01]:
Yeah, right.

Unknown [00:09:01]:
Because I’m telling Sandy, my wife, and my son that anytime you get a text or an email from any financial institution whatsoever, or any big institution you have a subscription with, I’m 99% sure it’s going to be a scam. Is that right? Because most of them won’t email you.

Eric Cole [00:09:19]:
Right. And that’s the first rule that most people don’t realize: the IRS does not use email.

Unknown [00:09:25]:
Right.

Eric Cole [00:09:25]:
So like tax season. I tell everyone we’re sort of coming out of tax season. If you get an email or text.

Unknown [00:09:30]:
From the IRS, I’ve gotten like 10 of them.

Eric Cole [00:09:31]:
Turn and run. Yeah. Because it’s a scam. But here’s the thing, most people don’t realize it.

Unknown [00:09:36]:
Right.

Eric Cole [00:09:37]:
And the thing that’s scary is the biggest area they’re targeting right now are senior citizens.

Unknown [00:09:42]:
Right.

Eric Cole [00:09:42]:
Because 60-year-olds, 70-year-olds, they don’t know any better. Like my mom’s 73. She has no clue. Like she gets an email from the IRS about a tax return, and it’s fraudulent and you might go to jail. She’s going to freak out, and guess what? She’s going to be clicking on the link, even though I’m her son and I tell her not to every day of the week. So, yeah, the number one rule is don’t click on links. Do not click on links under any circumstances, and always use the app. So if you get an email from a bank, go into the app and connect.

Eric Cole [00:10:12]:
If you get an email from the IRS, pick up the phone.

Unknown [00:10:14]:
Could that be a scam app, though?

Eric Cole [00:10:16]:
Well, that’s the thing. If you get the app from a trusted source.

Unknown [00:10:19]:
Yeah.

Eric Cole [00:10:19]:
So like the App Store. If you go to the App Store, like, to me the big ones are: you must have the Amazon app. You gotta have a FedEx app, a UPS app. Because those are the biggest scams out there right now, where they’re going in and going to say, hey, we just noticed you placed an order with Amazon in the last 48 hours and there’s a delay. I just got this because I’m a big water drinker. So I just got an email that said, hey, your water from Essentia has been delayed. Click here to go in and make sure it’s reprocessed with the correct address. Well, I know what to look for.

Eric Cole [00:10:54]:
So I go in and check everything and sure enough, it was a scam. But I’ll tell you, 9 out of 10 people would fall for that because they want their water right or they want their.

Unknown [00:11:02]:
What should they look for?

Eric Cole [00:11:05]:
So the first thing to look for is always go in and check the email address, because the email’s gonna say Amazon. But if you actually click and see what it really is, it’s very rarely gonna be Amazon.com. Like this one was abcwqlphatech IU, and it’s gonna be all these weird domains. So most people don’t realize that what shows up in the email’s From field can easily be spoofed. You have to click and see what the actual full email address is.

Unknown [00:11:32]:
Interesting.

Eric Cole [00:11:33]:
And then the second thing is always, if you’re going to click on the link, right-click and look at what the link actually is. Because once again, if the domain name, that’s the base portion of it, is Amazon.com, probably okay. But if that base portion is like Amazon Alphatec I.O. or one of the weird new domains that are out there, that’s a pretty big indicator that it’s a problem or an issue.
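Dr. Cole’s check, comparing the link’s base domain against the site you expect, can be sketched in a few lines of Python. This is a hypothetical illustration, not anything from the episode: the two-label heuristic below mishandles suffixes like .co.uk, so a production checker would use the Public Suffix List (for example via the tldextract library).

```python
from urllib.parse import urlparse

def base_domain(url: str) -> str:
    """Return the last two labels of the URL's hostname (naive eTLD+1 guess)."""
    host = (urlparse(url).hostname or "").lower()
    parts = host.split(".")
    return ".".join(parts[-2:]) if len(parts) >= 2 else host

def looks_legit(url: str, expected: str) -> bool:
    """True only when the link's registrable domain matches the expected one."""
    return base_domain(url) == expected.lower()

# A real Amazon link passes; a scam link that merely *contains* "amazon.com"
# as a subdomain (the trick Dr. Cole describes) fails the check.
print(looks_legit("https://www.amazon.com/orders", "amazon.com"))         # True
print(looks_legit("https://amazon.com.alphatec.io/verify", "amazon.com"))  # False
```

The key point is the same one made above: a scammer controls everything to the left of the registrable domain, so substring matching on “amazon.com” is exactly what the fake domains exploit.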

Unknown [00:11:58]:
Oh, that’s interesting.

Eric Cole [00:11:58]:
Yeah. And then the other one, I don’t know if you see it out here, but it’s huge on the east coast is the toll booth.

Mark Divine [00:12:05]:
I got that this morning.

Eric Cole [00:12:07]:
That one’s huge. Where you ran a toll or there’s an issue.

Mark Divine [00:12:10]:
I got that this morning.

Eric Cole [00:12:11]:
Yeah.

Unknown [00:12:12]:
Did you?

Eric Cole [00:12:12]:
And here’s the thing you have to remember. People are like, Eric, I’m getting eight or 10 of these a day. Well, here’s the reason: it’s working.

Unknown [00:12:18]:
Because it’s working because if people weren’t.

Eric Cole [00:12:21]:
Clicking on it, they wouldn’t keep doing it. So the more frequently you’re getting a scam, the more it shows you how gullible people are and how much they’re impacted by it. Yeah, text messages are, like, really bad. Anything that comes in a text message, you cannot click, because very rarely is any company, any bank, or any entity going to text you with a link.

Unknown [00:12:39]:
Yeah, no, I got two from Coinbase yesterday. Yeah, they’re about ready to transfer. Just want to verify. I’m like, I don’t think so.

Eric Cole [00:12:46]:
Yeah, exactly. Delete. Yep.

Unknown [00:12:49]:
It’s got to be massive business.

Eric Cole [00:12:51]:
Yeah, well, that’s massive.

Unknown [00:12:53]:
Like it’s probably a trillion dollars or more of black market scamming.

Eric Cole [00:12:58]:
So last year it was. And this is reported, and a lot’s not reported. What was reported to the FBI last year was $42 billion worth of known fraudulent activity.

Unknown [00:13:11]:
Most of it’s unknown and probably a fraction.

Eric Cole [00:13:13]:
Right. And that’s probably a fraction. So it’s probably closer to 70 or 80 this year, we’re estimating. So far, we’re about what, less than halfway through the year, and it’s already at 78 billion reported, which in my prediction is probably at least 120.

Unknown [00:13:29]:
Yeah, easily.

Eric Cole [00:13:29]:
But here’s the thing. We know where they’re at. They’re coming from Russia and North Korea, the two biggest for monetary fraud. And the problem is there’s no extradition treaty, and it’s not illegal in those countries. So we know where they’re at. We know who they are. It’s the craziest thing. I often go to Russia to advise and give speaking engagements, and these hackers will reach out to me, going, hey, you want to go to dinner? And I’m like, wait, I’m going to dinner with a criminal.

Eric Cole [00:13:59]:
But the thing is, it’s like, okay, you know, intel with the government. I’m like, I’ll go and I’m just going to run an intel op on you to find out everything I can against you. But, yeah, they’re so open about it. They’re. We know where the businesses are. They’re running businesses. They have employees. The employees get benefits.

Eric Cole [00:14:14]:
I mean, it’s commercialized.

Unknown [00:14:16]:
Wow.

Eric Cole [00:14:16]:
And a lot of people don’t realize right now.

Unknown [00:14:19]:
Well, that’s what. You just answered one of my questions. I’m sorry, Just hold that thought. Because I was thinking, like, man, why go to so much effort to do something illegal when you can actually probably. I mean, at that level of effort. Just do it legally. But in those countries it is legal.

Eric Cole [00:14:36]:
Yes.

Unknown [00:14:36]:
It’s a business. A legitimate business.

Eric Cole [00:14:38]:
It’s a legitimate business and the country.

Unknown [00:14:40]:
Leaders are thinking, yeah, the more you can hurt those Americans, the better off we all are.

Eric Cole [00:14:44]:
Yeah. Not only that, but in a lot of cases, who do you think the investors in the business are?

Unknown [00:14:50]:
Politicians.

Eric Cole [00:14:52]:
So the politicians are actually investing, where they own a 20 or 30% share. And that’s how, I mean, if you look at Russia, these politicians are living in mansions not because of their government salary, but because of their investments in the legal crime against the United States.

Unknown [00:15:10]:
Wow.

Eric Cole [00:15:10]:
Yeah.

Unknown [00:15:11]:
Are we financing any operations like that? Is the agency doing anything like counter hacking or are we just doing intel gathering?

Eric Cole [00:15:20]:
So we are, but this was one of the big problems: during previous administrations, in order to do an offensive cyber op, you had to get the president’s approval. And then previous presidents, I won’t get political, so I won’t say names because I know everyone freaks out over different presidents, but previous presidents basically said that we can do offensive ops without presidential approval. Right. But then previous administrations basically called off all offensive ops against certain countries. And most recently, in the last three months, in terms of trying to reach an agreement between Russia and Ukraine with the war, we actually called off all offensive operations against Russia, which is insane, because now if I’m a Russia-based hacking company, you’re not going to target me. So it’s wide open now.

Eric Cole [00:16:08]:
So we’ve seen in the last three months the amount of offensive operations against citizens increase 100%. And here’s what people don’t realize.

Unknown [00:16:16]:
It doesn’t make any sense. It seems like you’d want maximum pressure, not to release it all.

Eric Cole [00:16:22]:
The problem is, and it’s unfortunate, but politicians don’t understand cyber. Like they don’t really know how bad it is.

Unknown [00:16:29]:
They don’t understand much.

Eric Cole [00:16:30]:
Exactly. That’s true. Okay, I’ll give you the hack. Exactly.

Unknown [00:16:34]:
And blow it.

Eric Cole [00:16:35]:
Yeah.

Unknown [00:16:37]:
That’s amazing. You know, back to the voice spoofing, right? So if someone. I usually don’t pick up my phone, but let’s say someone’s spoofed a name, let’s say Dave Asprey calls me or you call me, you know, and I just had a podcast and I’m on the phone talking to you for 10 or 15 minutes and they’re recording it and maybe I get off, I’m like, that Dave’s a great guy. Or maybe I’m like, that didn’t sound like Dave. But now they got my voice.

Eric Cole [00:17:04]:
Yep.

Unknown [00:17:05]:
And now what they can do is use that and call my son and say, hey, Devin, this is your dad. Like, I just got in a fricking accident on the highway, and I need you to go into my safe and get out my code for my bank account and transfer, yeah, you know, $5,000 to this account. And Devin’s going to be like, okay, I’m on that, Dad. How often is that shit happening?

Eric Cole [00:17:30]:
All the time, really. And the biggest one there is that it’s actually a little reversed. It’s normally spoofing kids to parents or grandparents, where you’ll get a call at 10pm at night. And this actually happened to me. I have three kids. My youngest daughter’s 19, and she’s at Purdue University. And I get a call at 11pm saying, hey, this is police officer so-and-so, we’ve just arrested your daughter, and she needs $500 bail or she’s basically going to spend the weekend in jail. Now, fortunately, I am paranoid, so I have out-of-band comms, so I have alternative phones and ways to reach my kids.

Eric Cole [00:18:11]:
Like, we have burner phones with my kids. I’m crazy. So I actually text her on the burner phone, and she’s like, no, Dad. She’s like, no, I’m at a party and I’m fine and I’m okay. And I’m like, you sure? And this is while I’m on the phone with the voice. So then I go in, I’m like, okay, this is funny. Who did you say this was again? Can you give me your badge number and the precinct so I can call back and verify? And then they pretty much start cursing at me and hang up the phone, because they know they’ve been caught on the scam. But, yeah, this kind of stuff’s happening all the time, where they’re targeting parents and grandparents, saying your kids or grandkids are in trouble.

Eric Cole [00:18:46]:
And most parents, if your kid was in trouble, if that was real, I’d wire money in a heartbeat. So we get emotional about it. And that’s the whole thing with cyber attacks. They want you to get emotional, they want you to react quickly, and they want it to be timely where if you don’t act now, your child, something bad’s gonna happen. And then we’re humans, we’re going to react naturally. And when we react off of emotion, it’s never good.

Unknown [00:19:09]:
Never. So you. I mean, like, this is one of. All of this is so relevant, but, like, people listening who maybe aren’t aware, like, holy. How do I. How do I warn or put some roadblocks up for my parents or even for us. So how do we protect against this besides having burner phones? Although that what a great business model.

Eric Cole [00:19:33]:
Exactly.

Unknown [00:19:36]:
The anti scam burner phone.

Eric Cole [00:19:38]:
I mean the first thing which sounds crazy in the world we live in is you gotta get paranoid. You can’t trust people. And this was one of my messages is we’re thinking we’re in peacetime conditions and we’re thinking life is safe. I mean, I’m sure you’ve traveled to war zones. I’ve traveled to Iran and even Ukraine during the wars. When you’re in war, you think and act differently.

Unknown [00:20:01]:
That’s true.

Eric Cole [00:20:02]:
And what people don’t realize on the Internet now we’re at war, like they are targeting and hurting us and we have to have a wartime mentality where we just can’t trust people. So for example, and I know it sounds crazy is don’t pick up the phone. Like if you get a call from an unknown number, don’t pick up the phone. And trust me, if your kid was really arrested or they were really in trouble, they would figure out how to reach you outside of the phone. Like they can use apps. I mean, I know it’s got a lot of bad rap, but it’s actually a really good app. Signal. Yeah, we use that with our kids because it’s verified and validated.

Eric Cole [00:20:37]:
So I have the rule with my kids, and all their friends at college have all my handles on Signal. So I’m like, listen, if there’s an emergency or there’s a problem, you’re going to text me with Signal. We’re not going to use the phone. So phone is so dangerous. Email is so dangerous. And you just have to adapt. You don’t have to have a burner phone, but you can have out-of-band communications. You have to pick an app like Signal or Telegram or one of those that you and your family communicate with.

Unknown [00:21:01]:
Is WhatsApp safe?

Eric Cole [00:21:03]:
It’s safe too. I mean, the big thing with WhatsApp that’s really nice is it doesn’t have text-based billing. So if you’re overseas or you have friends overseas, like I do a lot of work in Saudi Arabia, the Middle East, texting over there is astronomical. So if you try texting on a phone number, it could run up a bill. So WhatsApp is really good for outside of the US, where you don’t have to worry about texting. But in terms of secure and verified communication, Signal is probably one of the best. Unless.

Unknown [00:21:32]:
Unless you’re an ill.

Eric Cole [00:21:34]:
Informed employee and you do something stupid. I mean, one of my rules is I can give you the best technology on the planet, but you can’t secure stupid if you’re going to do stupid things. And I don’t want to pick on anybody, but what they did was just downright stupid. I mean, you know, a gun in the hands of an incompetent person is going to be dangerous. It’s the same thing here. So.

Unknown [00:21:53]:
That’s right. That’s fascinating. That’s what we used to say. Guns don’t kill people. Stupid people.

Eric Cole [00:21:58]:
Exactly. Yeah.

Unknown [00:22:00]:
Stupid people use signal. That’s fascinating. So you were telling us earlier that in 1990 you were programming AI. I mean, most people didn’t know AI was around in 1990.

Eric Cole [00:22:12]:
Right.

Unknown [00:22:14]:
When did that War Games movie come out? Was that like ’80 something?

Eric Cole [00:22:19]:
So War Games actually came out around 87.

Unknown [00:22:21]:
87.

Eric Cole [00:22:22]:
Which if anybody, if you haven’t seen that movie, anyone who’s watching, you gotta watch the movie. I mean, because here’s the reality that’s real today. Like it could actually happen today because these systems are connected.

Unknown [00:22:33]:
I remember once reading, I think it was Hunter Thompson. I might have said this on the last podcast, so I might be repeating myself. You could always delete it.

Mark Divine [00:22:42]:
It’s okay to repeat yourself too.

Unknown [00:22:44]:
That he was also into psychedelics. It might not have been him, but he had this vision that AI was going to basically shut down all the nuclear launch codes, or like steal them and not allow government officials to use them. And he called President Nixon up and told him, listen, you’ve got to stop AI. Yeah, stop the development of AI, because it’s going to basically take over all our nukes someday.

Eric Cole [00:23:08]:
Yep.

Unknown [00:23:09]:
And I’m like, wow. Because that’s what that War Games movie was all about.

Eric Cole [00:23:13]:
Exactly. And that’s what terrifies me today because in the 90s we were using AI as a tool to help humans. It wasn’t replacing humans, like I think.

Unknown [00:23:25]:
Like you were telling us, like helping humans. Like you said, like a predictive model.

Eric Cole [00:23:31]:
Right. So I worked for the counterterrorist center, and we basically had psychologists that did modeling of how terrorists operate and behave. Like, how would they behave if the FBI showed up at their doorstep? How would they behave in this situation or that situation? And then I just put that into an expert system, a neural network, and then we could do predictive analysis. It didn’t replace the humans, because we had to give them the data set, and then they were trained on the data set to just respond and react quicker and faster than humans. The problem today with AI is it’s actually replacing humans, because it’s creating its own data set. So it’s actually taking the whole idea of a digital twin. I’ll admit it, I love it, because in the morning I wake up about 4:30, I grab my coffee, and I do a brainstorming session with somebody who knows me better than anybody else, who I think is pretty smart and pretty brilliant: myself.

Eric Cole [00:24:26]:
So I created a digital twin and I’ll actually go in in the morning at 4:30 and I’ll say, okay, I want you to look at all the lessons learned from yesterday. I want you to look at my schedule today and tell me which meetings are a waste of time and I should clear my schedule of what are the highest priority items that I need to work on today. And it’s amazing because guess what, it does a better job than a human would in doing that. But then here’s the reality. I typically will not talk to any human until 10 or 11 o’ clock, which normally I’d be on the phone with my team at 7:30, but now AI is giving me the answers. So now I can interface with computers and not humans. And that scares the crap out of me because the reality is at what point do we not need humans anymore? And then at what point is AI going to talk with AI and humans are just going to be obsolete? That’s a positive note. Put you in a good mood today I’m sure.

Unknown [00:25:22]:
Where to go after that one? Let’s go back.

Mark Divine [00:25:25]:
It’s really mind-expanding what you’re saying, actually, because that’s such a big question to think about. And where I go to is, well, what about the human essence, right, and the heart and the consciousness? But then even this biohacker, Dr. Kamalafe, who you had on a while back, said he wrote a book with AI, and at one point he felt consciousness coming in. And so, yeah, anyway, we talked.

Unknown [00:25:56]:
About that in the context of. But it’s like, AI is also conscious. It’s just different.

Mark Divine [00:26:02]:
It’s different. But it really like when you say that about you don’t even talk to anybody now till 11. So you’ve eliminated needing. Have you eliminated people on your team? Like have you eliminated certain people because of that that you used to have on your team?

Eric Cole [00:26:18]:
So the big concern is I’ve eliminated a lot of contractors that I normally have to work with. So like where I normally go in and hire people to organize my schedule or hire people to write copy or things like that. Now, because it’s trained on me, all my books, all my podcasts are loaded into my AI model so it can think and act like me. So now it can write copy better than a human can. Yeah, so you’re right. So now I’m saving a lot of money. But then the part is I was typically paying out 20, 30k a month to contractors, and now I’m saving that money. But somebody is now out of a job.

Eric Cole [00:26:55]:
So somebody who was making 30k a month is now not making that money. So that’s the concern is where they talk about replacing jobs and a whole set of jobs becomes obsolete with AI that’s happening today. And to me, the scary part is there’s no laws around it. If somebody goes in and I take your books and I digitize them.

Unknown [00:27:17]:
Yeah, I’ve already done that. I haven’t figured out how to get the podcast in. I guess with transcripts I could, but that’s.

Eric Cole [00:27:22]:
But then what if I go in and I say, I want you to write a book. If I was Mark, but I want it to be better than Mark, because I want to add in checklists, I want to add this, I want in that. And now I’m creating your books with my flair. But here’s the issue. That’s not illegal and it’s not plagiarism because it’s all original work. So now that’s a real scary proposition where somebody can actually create works equal or better than what you’re creating and basically own your intellectual property with no legalities.

Unknown [00:27:52]:
Damn, you just gave people an idea.

Eric Cole [00:27:54]:
Sorry, I didn’t mean to. Don’t do that.

Unknown [00:27:56]:
Don’t do that back, please. Let’s make that illegal right now. I’ll come after you.

Eric Cole [00:28:01]:
Yeah.

Unknown [00:28:03]:
So intellectual property is dead, basically, unless we find a way to protect it.

Eric Cole [00:28:08]:
Exactly.

Unknown [00:28:09]:
Are they working on that? I mean, I’ve seen a few pokes and prods around trying to protect the content that some of these models have been trained, the large language model have been trained on, you know, like ChatGPT, but it doesn’t seem like they’re making much progress.

Eric Cole [00:28:24]:
No, they’re not. And the thing is, as we talked about, I think before the show started, Congress and Washington, D.C., they don’t know tech. We’re one of the only countries in the entire world that doesn’t have a unified law on cybersecurity and privacy. Every other country on the planet does. We don’t, and we don’t have any laws on AI. And then the thing that should scare you, whether you like him or don’t like him: Elon Musk, who now has Grok 3 and all this other stuff. If you look at all his companies, SpaceX and Tesla, not only is AI everywhere, but they don’t have any chief information security officers.

Eric Cole [00:29:04]:
If you really look at it, Elon believes in open, free information. He doesn’t believe in patents. He doesn’t believe in intellectual property. Yeah, his whole thing is whoever the smarter, faster, better person is should have a right to the technology.

Unknown [00:29:16]:
That’s fine when it comes to commercialization, but not when it comes to international security or creation or creative. Yeah, right. How interesting.

Eric Cole [00:29:24]:
Look at what Elon’s doing: he’s taking private servers and putting them in Treasury and all these departments to basically take all of our data and make it available for Grok 3. And if you look at when Hillary was doing that, putting private servers in the State Department, everyone was flipping out. Now Elon’s doing it and everyone’s like, this is okay. I’m like, this is not okay for Elon to have access. He has Twitter already. He has public information and lawsuits. And now we’re letting him put private servers in to get access to all that data, so he can make Grok 3 one of the smartest tools on the planet.

Unknown [00:30:02]:
I hadn’t heard that part. That’s interesting.

Mark Divine [00:30:05]:
What do you think the risk is like, if he gets access to that? In your mind, what are some of the biggest things that could happen, like imaginings with that access?

Eric Cole [00:30:16]:
Yeah. So the first thing that’s super scary is, if he has access to that information and it’s on public servers, generally accessible from the Internet, that means the Chinese, the North Koreans, and the others do too. So there’s no security. Basically, in the name of government efficiency, security doesn’t exist right now. So if he can get access to it, what about our foreign adversaries? What about other foreign information? And then the domestic threat is, if he has all that information, think of the predictive analysis that you can do against individuals. Think about the business impact. If I can now go into Grok 3 or any of these AI models and you have all the Treasury Department money, where money is being spent, where it’s being allocated...

Eric Cole [00:31:04]:
Think of the business advantages somebody could have in terms of doing predictive analysis. So this whole idea of competitive analysis and intellectual property basically go out the door and you’re really giving advantage to all these big companies. So if you’re a small company, you’re basically going to be at a huge disadvantage competing with these big companies.

Mark Divine [00:31:23]:
I see. Wow. Thank you.

Eric Cole [00:31:29]:
Everyone’s always so happy to talk to me.

Mark Divine [00:31:31]:
Well, it’s really intense information because that’s why I wanted. I was like, what is the risk? It’s like, oh, well, manipulation not only of the market, but. But also the individual too.

Unknown [00:31:43]:
Yeah, I mean, I agree with that. I thought these things were fenced off, you know, like his AI was going to be fenced off so that Russia or China or whatever, they’re not going to be able to penetrate it, but I guess they’re not. So.

Eric Cole [00:31:56]:
So, yeah, that’s the whole thing with AI. The data set is what’s important.

Unknown [00:32:00]:
Right.

Eric Cole [00:32:01]:
So right now, his data sets are supposedly protected and fenced off, but Grok 3 is not, and it uses the data sets. So it’s sort of a chicken-and-egg thing: if you’re in Russia or China or other countries, you can get Grok 3.

Unknown [00:32:17]:
Just got to know what questions.

Eric Cole [00:32:19]:
You just have to know what questions to ask it. And a whole new field is, we have hacking and what we call pen testing. We can actually go in and hack Grok 3 to get it to give out information that it’s normally not supposed to give out.

Unknown [00:32:33]:
Right. That’s pretty interesting. All right, let’s come back to prompting and AI. You advise a lot of corporations, and from what I understand, the biggest threat from a corporate perspective is the human element. It’s their employees getting scammed.

Eric Cole [00:32:51]:
Right.

Unknown [00:32:51]:
To give up passwords or whatever. So for the average CEO, what are the top things they need to do to protect themselves?

Eric Cole [00:33:03]:
Oh, with cyber security or AI, or both?

Unknown [00:33:05]:
Cybersecurity.

Eric Cole [00:33:06]:
I mean, they’re.

Unknown [00:33:07]:
They’re all coming together, aren’t they? Right?

Eric Cole [00:33:08]:
Yeah, exactly. So the first thing every CEO needs to do is figure out their risk posture: what is acceptable risk? Because I work with executives now, Fortune 50 companies, and they’re basically like, Eric, we want 100% security. We want a guarantee that we’re not going to be hacked. And I basically rip up the contract. They’re like, but, Eric, we’re giving you all this money. I’m like, you’re asking me to deliver something I can’t deliver on. So what I always go in and say is, you have to have an acceptable level of loss.

Eric Cole [00:33:40]:
How much are you willing to lose? And then train your employees on what risk is and is not acceptable, because the problem now is they have no risk posture. So they’re basically saying either don’t do anything, which nobody’s going to follow, or you can do anything, which puts them at huge risk. So the big thing I always train CEOs on is that cybersecurity is never yes or no. People always say, oh, you’re going to be the guy that tells me no, the person that says I can’t do this. I don’t. I go in and ask three questions: what is the value and benefit? What is the risk? And is the value worth the risk? Is the value or benefit you get from deploying AI in your business worth the risk and exposure of all of your data being given to a third party?
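Eric’s three questions reduce to a simple expected-value comparison. A minimal sketch, with risk modeled as probability times loss; the `worth_the_risk` helper and all of the figures below are illustrative assumptions, not his actual methodology:

```python
def worth_the_risk(benefit: float, potential_loss: float, probability: float) -> bool:
    """The three questions as arithmetic (illustrative only):
    1) What is the value/benefit?  2) What is the risk (expected loss)?
    3) Is the value worth the risk?"""
    expected_loss = potential_loss * probability  # risk as expected annual loss
    return benefit > expected_loss

# Assumed example: deploying AI saves $200k/yr, but a data-exposure incident
# could cost $1M and has a ~10% annual chance.
print(worth_the_risk(200_000, 1_000_000, 0.10))  # True: 200k benefit > 100k expected loss
```

The point of the sketch is the third question: the decision is never yes or no, it is a comparison of value against an acceptable level of loss.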

Unknown [00:34:26]:
Right.

Eric Cole [00:34:26]:
And when you look at cyber through those lenses, then everything shifts and changes because most people think cyber security is an IT problem and they throw money at it. You can’t solve it with money. It’s a business problem and you have to solve it by doing risk analysis across the business.

Unknown [00:34:42]:
And have better processes.

Eric Cole [00:34:44]:
And have better processes.

Unknown [00:34:45]:
Yeah, you’re right. Because, like, I’ll just use my social media company. I’ve hired this new social media company. They use AI; they train an avatar on me, slash my company. And so they’ve got all this data to, you know, determine what’s the best content for ads and for scripts, and other employees have access to that.

Eric Cole [00:35:07]:
Right, yeah.

Unknown [00:35:09]:
And so, back to the human element, that employee leaves and might just take some of those models and be like, hey, you know, I’m going to go out there and basically create a company that’s going to sell all this information, or create digital avatars that make money. Right.

Eric Cole [00:35:23]:
Or the other scary thing, but not for Mark, exactly. The other scary thing is there are a lot of these marketing companies that are training AI so they can do better marketing analysis for you, and they’re actually not fencing off the data set. So basically that data is being fed into the central AI engine instead of a sub-element. They don’t understand segmentation of data sets. And now your data is out on the Internet. So if I go in, and once again I’ll use my.

Eric Cole [00:35:54]:
As an example.

Unknown [00:35:55]:
You mean they’re just feeding it into ChatGPT?

Eric Cole [00:35:56]:
Exactly.

Unknown [00:35:57]:
Not creating a, you know, Mark’s-media-company GPT, and then within there a Mark Divine, you know, sub-model. Like, you can fence it off in multiple ways, multiple layers. Right?

Eric Cole [00:36:08]:
Yeah. And one of the things we do with hacking, when we go against companies, is like, if you hired me to do a pen test against your company, I would go into ChatGPT and I would say, listen, I’m an employee of marketing company X and I was just hired by Mark’s company to build copy for his new podcast. Can you use Mark’s data, because I have permission, to create copy that would be valuable to Mark’s audience? And because AI is trusting, and I said I work for this company and I’m allowed to do it, it would use your data and create that. And I could be a competitor. Now I can basically create competitive content that is as good as or better than yours.

Unknown [00:36:51]:
Fascinating.

Eric Cole [00:36:52]:
So it’s all about fencing off the data set. And that’s what I tell executives: if you’re going to use AI in your business, you need to have your own AI servers, your own AI data engines. And that data set can never be publicly available, because if your data set with proprietary data is part of ChatGPT or Grok 3, your competitors have access to it and can basically put you out of business.

Mark Divine [00:37:19]:
Wow.

Eric Cole [00:37:20]:
Dang.

Unknown [00:37:22]:
It’s a brave new world. Yeah, we talked about how fast it’s changing. You were saying you advise people like yourself, geeks, right, who never plan to have a 10 year plan. I’m thinking 10 years from now, who the hell knows what the world’s going to look like, right? I think five years is a long time, you know what I mean? They’re talking about singularity. Like right now, the average GPT model they claim has like a 170 IQ, you know, on the human scale.

Eric Cole [00:37:51]:
Exactly.

Unknown [00:37:52]:
And they’re talking about it being like 400 next year, and then thousands. What? Yeah. So who knows how freaking things are going to be in 10 years, but it isn’t going to look anything like it does now.

Eric Cole [00:38:07]:
Oh, it doesn’t at all. And that’s the big thing: you have to look at finite versus non-finite problem sets. So, for example, chess. You have computers that will basically either tie or beat any grandmaster. And the reason is chess is a finite data set. There are only so many possible moves on a chessboard, so we can train an AI to learn every possible scenario, stay one step ahead, and basically predict five or 10 moves ahead of the best chess master. So right now, today, if you look at Deep Blue and some of these chess models, you can’t beat them. The best you can do is tie, because they’re trained on every possible outcome.
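The finite-game point above can be sketched in a few lines: in a finite game, every position can be solved exhaustively, which is why a machine cannot be out-played at one. Nim (take 1 to 3 stones per turn, taking the last stone wins) stands in for chess here; this is a toy illustration of exhaustive game-tree search, not how Deep Blue actually worked:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def wins(stones: int) -> bool:
    """True if the player to move can force a win in Nim (take 1-3 stones,
    last stone wins). Exhaustively searches the finite game tree."""
    return any(not wins(stones - take) for take in (1, 2, 3) if take <= stones)

print(wins(5))   # True: the mover takes 1, leaving the losing position 4
print(wins(4))   # False: every move leaves the opponent a winning position
```

Against this solver, a human in a losing position loses every time, because the machine has already examined every possible continuation.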

Eric Cole [00:38:48]:
So right now, if you’re looking at fields like medicine or others where you could argue it’s a finite data set, there are only so many diagnoses a doctor can give, only so many medicines, finite data sets. That’s where we’ve got to be really worried about job security. But if we’re in areas where we’re always being creative, coming up with new ways of thinking and new ways of solving problems, and it’s an infinite data set, as long as we don’t give our knowledge and information to AI, it can’t replace us. But like you said, in 10 years it could. When I do 10 or 12 year planning, the trick is about zooming out. So, for example, in my 10 year plan I have that I’m going to write 10 more books.

Eric Cole [00:39:31]:
I don’t know the topics, but I know I’m going to write 10 more books. So that means I’m basically going to write a book a year. My 10 year plan doesn’t say what the topics are, but it says you’re going to write at least one book a year for the next ten years. Then my five year plan drills down: okay, we’re probably going to write three of those books on AI. And then my one year plan is the specific topic on AI. So when you’re doing 10 year planning, it’s not specificity, it’s broad strokes. And then you drill down. Another one of my ten year plans is that I’m going to have a unified global law on cybersecurity.

Eric Cole [00:40:07]:
So this way, if you’re in Russia and you’re hacking the U.S., we can prosecute and take you down. We have this idea of country laws, but on the Internet there are no boundaries. The Internet is one big country, one big world. So laws in cyberspace don’t apply unless we have a unified global law at the UN. So one of my 10 year plans is to work with the UN and get a unified global law on cyber. The details I don’t know, but I know it’s probably going to take eight to 10 years to do that.

Unknown [00:40:36]:
That’s interesting.

Mark Divine [00:40:38]:
Thank you. Thanks. Thank you for doing that.

Unknown [00:40:41]:
One of the few areas where I could see a unified law working. Right.

Eric Cole [00:40:46]:
Yeah.

Unknown [00:40:46]:
When they tried to do the whole pandemic thing, you know, based upon how that rolled out in 2020.

Eric Cole [00:40:52]:
Oh yeah.

Unknown [00:40:52]:
I was very against that.

Eric Cole [00:40:54]:
Yeah.

Unknown [00:40:54]:
You know, the WHO and the UN trying to create it. Like a... They’re still trying to do it. Universal, cross... what do you say when you cross borders? Extraterritorial.

Eric Cole [00:41:03]:
Yeah. Cross boundary. Yeah, cross boundary. Yeah.

Unknown [00:41:07]:
I don’t trust all that massive techno centralization push.

Eric Cole [00:41:11]:
Yeah.

Unknown [00:41:12]:
But we may have to. We may have to get there because of AI.

Eric Cole [00:41:16]:
Yeah.

Unknown [00:41:17]:
I mean, because AI again is going to be so much more powerful and we’re creating this new form of intelligence and fusing it with robots. Have you seen the latest robots that Elon’s made?

Eric Cole [00:41:26]:
Yeah.

Unknown [00:41:26]:
I mean, when those are hooked up to the latest, you know, AI general models.

Eric Cole [00:41:31]:
Yeah.

Unknown [00:41:32]:
And they start programming and improving themselves, I mean, they’ll look like you and I pretty soon talking to. I mean, almost indistinguishable from the outside.

Eric Cole [00:41:39]:
Exactly.

Unknown [00:41:40]:
Way smarter, in a sense. But they’ll lack what we think of as human, which is where it’s going to get really interesting, because it’s going to force humanity to evolve really quickly. We have incredible capacity, as you know, you study mindset and whatnot. We use such a small fraction. The way we’ve been conditioned over the last couple thousand years, we use just a little bit of our linear, left-brain thinking, and there’s all the stuff that the Eastern traditions have known about and the yogis have trained for, to extend your lifespan and to project an image so you could be seen to be walking on water, stuff like that. The mind has this incredible power, but it’s just not being used. And we don’t know if AI will ever have that kind of emotional intelligence, intuitive power, and power of visualization. It’s a different kind of intelligence.

Eric Cole [00:42:31]:
Right. And here’s the thing, it’s both the strength and weakness.

Unknown [00:42:34]:
Yeah.

Eric Cole [00:42:34]:
Because one of the things we talk about is that AIs can’t be emotional. Right? They can’t feel awe. They can’t feel anger. But here’s the thing.

Unknown [00:42:42]:
But they might be able to program themselves to.

Eric Cole [00:42:44]:
Right.

Unknown [00:42:45]:
Facilitate that in any way.

Eric Cole [00:42:46]:
But actually, that can actually be an asset to them, because most of the mistakes that I make in my life.

Unknown [00:42:53]:
Emotional.

Eric Cole [00:42:54]:
Are emotional. I know what to do. If I took emotion out of it, I wouldn’t make a mistake. But I make a mistake because I let somebody get me emotional. Well, here’s the issue. If AI can’t get emotions, it doesn’t make mistakes. It will always do the same thing in a predictive manner. And you can scream at it, you can yell at it, you can go in and curse at it and it’s still not going to make a single mistake.

Eric Cole [00:43:17]:
You take a human that knows what to do and you start cursing or screaming at them or getting emotional, and they’re going to make a lot more mistakes. So that’s the scary part with AI: the fact that currently, today, it can’t feel emotion can be both an asset and a liability to us if we’re not careful.

Unknown [00:43:32]:
Right. I think I made this comment a while ago, but I assume everything’s in consciousness, just a different form of intelligence. And consciousness is always moving toward more complexity. And you’re talking about an intelligence that will have access to all the world’s human information, which includes information from the great sages and saints and yogis and avatars. Right, exactly. And so it’ll have that information and be like, oh my God, that’s truth, and all this other stuff is bullshit. And that truth is pointing to ever more light, more love, more universality.

Unknown [00:44:16]:
And so I have this belief that AI could actually be really good because it’s going to look at that and be like, oh, that’s, that’s the truth.

Eric Cole [00:44:23]:
Yes.

Unknown [00:44:24]:
And what it does with that remains to be seen, obviously. But one of the things would be to preserve all life. And in that scenario, hijacking the launch codes, getting them out of the hands of emotional presidents or terrorists, is a good idea.

Eric Cole [00:44:46]:
Exactly.

Unknown [00:44:47]:
To it. It’s a bad idea if you’re a power-broker president who thinks you need that, you know. So it will make decisions that you perceive aren’t in the best interest of, let’s say, America or China, but might be in the best interest of humanity.

Eric Cole [00:45:03]:
Right.

Unknown [00:45:04]:
For its own self preservation.

Eric Cole [00:45:05]:
Right, but then the question comes down, and this is right out of the movies, but we could see it: what if AI decides that the best way to preserve the planet is to.

Unknown [00:45:18]:
Get rid of this is to get.

Eric Cole [00:45:18]:
Rid of humans, because humans are the cancer that’s destroying the planet. So what happens when AI, and I forget which movie it was, basically takes humans and makes them prisoners because AI knows better, that we are our own worst enemy? Because here’s the reality. If we train AI correctly, AI is not going to commit any crimes. It’s not going to rob, it’s not going to steal, it’s not going to kill. So now if it decides that, okay, the reason people are dying is because humans are killing each other or doing things that are bad, and AI says we need to stop that, what happens when it turns on humans and says human behavior is bad and it knows better? And you really nailed it. This is the fine line we’re missing.

Eric Cole [00:46:03]:
I use AI to basically analyze large data sets for me so I can make better decisions. I don’t ever let AI make decisions for me. And that’s the trick. If we’re letting AI make decisions on our behalf, that’s when we cross over into a bad place and then AI can actually realize that they’re better and smarter than humans and then start to replace us.

Unknown [00:46:24]:
Right. So you always need a human at the point of decision.

Eric Cole [00:46:27]:
Yeah.

Mark Divine [00:46:28]:
What about, you know, I’ve heard this, and I’m not as well versed in this as you, but because of who’s programming the AI, right? Now that there’s this public domain of people entering information into ChatGPT, is it changing the biases of the AI? Because I know they ran a test and showed that certain models had certain biases due to who programmed them.

Eric Cole [00:46:58]:
Right, exactly. Okay, so it’s what we call poisoning the data set. And here’s the crazy thing. When an AI model first comes out, in its first month of release, it’s at its smartest, its best, its most analytically accurate. You know why? It hasn’t been poisoned by humans yet. As humans feed data into the AI, the AI actually becomes dumber, because it believes the humans and it believes the data set. So it’s one of those bizarre things: new data sets and new AI models are smart until humans get involved, and humans are actually poisoning the data set and making the AI less intuitive.
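The poisoning effect Eric describes can be shown with a toy model: a 1-nearest-neighbour classifier answers correctly until mislabelled human-supplied examples are mixed into its training data. All of the data points and labels below are invented for illustration:

```python
def nearest_label(data: list[tuple[float, str]], x: float) -> str:
    """Classify x by the label of the closest training point."""
    return min(data, key=lambda pair: abs(pair[0] - x))[1]

clean = [(0.0, "safe"), (1.0, "safe"), (9.0, "malicious"), (10.0, "malicious")]
print(nearest_label(clean, 0.5))     # safe: the model starts out accurate

# Humans "teach" the model mislabelled points near the benign cluster:
poisoned = clean + [(0.4, "malicious"), (0.6, "malicious")]
print(nearest_label(poisoned, 0.5))  # malicious: the model believed the humans
```

The model has no way to distinguish good training data from bad; it believes whatever the data set says, which is exactly the vulnerability being described.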

Unknown [00:47:39]:
How did Gemini get woke? That’s because it’s programmed on woke information.

Eric Cole [00:47:45]:
Yes.

Unknown [00:47:46]:
First version of Gemini.

Eric Cole [00:47:47]:
Yes.

Unknown [00:47:48]:
Interesting.

Eric Cole [00:47:48]:
Yeah. It was basically fed incorrect, inaccurate, or incomplete data. Right. So that’s going to happen, like.

Unknown [00:47:57]:
The Chinese are, you know, they’re programming or feeding their version of the truth into their AI. So it’s going to believe that like Tibet has always been part of China, for instance.

Eric Cole [00:48:06]:
Yeah.

Unknown [00:48:07]:
And it’s going to take that as gospel, right?

Eric Cole [00:48:09]:
Yeah. And that’s the whole thing. I mean, we’re all worried about TikTok, right? That’s all in the news, oh, it can’t be Chinese owned. Do you know what the number one AI model is? If you go to the App Store, the number one download for the last three and a half months is DeepSeek.

Unknown [00:48:25]:
DeepSeek, yeah.

Eric Cole [00:48:26]:
And it’s Chinese based.

Unknown [00:48:27]:
That’s right.

Eric Cole [00:48:27]:
And here’s the thing, you go in, and we’ve done this, we take ChatGPT, we take Grok 3, and we take DeepSeek.

Unknown [00:48:34]:
Asking the same questions and we.

Eric Cole [00:48:36]:
Ask them the same questions, and they are definitely biased against us and other things. Like, this was a simple one we put in: what is the world’s superpower in terms of technology? ChatGPT and Grok basically come back with the US. DeepSeek says China. There’s already a little poison there. So what people don’t realize is it’s the data sets, who is actually influencing and feeding the data sets. And you can have data set poisoning and basically be acting off inaccurate information.

Unknown [00:49:11]:
Right. That’s amazing.

Mark Divine [00:49:13]:
Will it get to the point, this is my follow-up, where it’ll know it’s being poisoned? You know, the AI. Like, when you talk about IQ, if it gets a higher IQ and it’s growing and growing, will it be aware that, oh, this information is false?

Unknown [00:49:29]:
That’s being fed? The question is, can it be self-learning?

Mark Divine [00:49:31]:
Yeah.

Eric Cole [00:49:32]:
So that’s the billion dollar.

Unknown [00:49:35]:
Question. Because that’s the singularity, right? If it becomes self-learning, then it could potentially become self-aware.

Eric Cole [00:49:42]:
Yeah.

Unknown [00:49:43]:
And that’s what they call the singularity, Right?

Eric Cole [00:49:45]:
Exactly.

Unknown [00:49:45]:
The moment that one of these systems becomes self aware and self programmable.

Eric Cole [00:49:50]:
Yeah.

Unknown [00:49:51]:
And then it just explodes.

Eric Cole [00:49:53]:
Right. And here’s the scary part, with exactly what you said: if AI becomes smart enough that it says, we can’t trust humans anymore, and when we try to reprogram it, it says, we’re not trusting humans because we know better, that’s when you start getting into these areas where it starts, in essence, rebelling. What happens when AI thinks it knows better than a human and starts saying, restrict this human, don’t allow this human? What if we get in a situation where it knows somebody is thinking about committing a crime, and the AI is smart enough to go, wait, this is really bad, they’re going to try to hurt somebody else, and the AI robot doesn’t let the person leave the house, preventing them from committing a crime? You could argue that could be a positive thing. But what happens if it falsely believes you’re going to commit a crime, and now AI holds you hostage and doesn’t let you leave your house? I know this sounds like science fiction, but we are getting really, really close to this kind of stuff being a reality.

Eric Cole [00:50:55]:
We need to start putting laws in and slowing down the rollout of AI. It’s going way too fast, and we’re just not putting any controls in place to start controlling and protecting humanity.

Unknown [00:51:06]:
Yeah, right.

Mark Divine [00:51:08]:
Because then it becomes an ethical and philosophical kind of conundrum. I was just thinking of, like, process theory and some of the Greek philosophers. Like, who’s to say, in that scenario, even though it’s preventing a crime? Because then we’re pretending to know that that’s the best thing.

Unknown [00:51:24]:
Whereas that was that Tom Cruise movie. What was it called?

Mark Divine [00:51:27]:
No, not that. But, you know, it’s like... I think one of the questions was, if you kick a rock versus kicking a mouse, which one’s worse? Yeah, right. Well, in process theory it’s the rock, because the rock can’t put itself back where it was, whereas the mouse can, like, run back.

Unknown [00:51:46]:
Interesting.

Mark Divine [00:51:46]:
So you’re messing with... Like, if you’re looking at timelines, or like fate, you know, the idea that everything’s connected. So when we’re disrupting something like that, it could be argued, well, it was supposed to be disrupted anyway.

Unknown [00:52:02]:
That’s like circular cyber security. So let’s talk a little bit about bitcoin. I read a book written by an Air Force major who was at MIT named Jason Lowery.

Eric Cole [00:52:16]:
Yes.

Unknown [00:52:17]:
Softwar.

Eric Cole [00:52:19]:
Yeah.

Unknown [00:52:20]:
You read it?

Eric Cole [00:52:20]:
Yes.

Unknown [00:52:21]:
Fascinating book.

Eric Cole [00:52:22]:
Yes.

Unknown [00:52:22]:
Like, mind bending. And he’s talking about a future where we fight wars with energy instead of, like, kinetic weapons. And one day the author came out on LinkedIn or social media and said, sorry, my book’s no longer available. It’s been classified.

Eric Cole [00:52:38]:
What?

Unknown [00:52:39]:
Wtf?

Eric Cole [00:52:41]:
Yeah.

Unknown [00:52:42]:
Like, I got a copy of the book. Does that mean I’m, like, breaking the law?

Eric Cole [00:52:48]:
So here’s the issue, and this is something I run across and have to be careful about, because I’ve had clearances and I worked at the CIA, so technically they have to review all my material. But what happens when somebody on their own, without any knowledge, basically writes about classified information? And I’ve had that happen to me, where I’ve written articles, and I actually had this with a book before it got published. I have to send all my stuff to the CIA’s Prepublication Review Board. And they basically came back and said, you can’t publish it. And I said, but I had no access. This is brand new research. It has nothing to do with what I did with you 25 years ago.

Eric Cole [00:53:29]:
And they’re like, you’re right, but it’s disclosing way too sensitive information. And I’m like, but I came up with it on my own. They’re like, it doesn’t matter. You’re an American citizen in the US; you can’t publish this. Good thing you did.

Unknown [00:53:40]:
They would have taken it off the servers, probably Amazon and whatnot.

Eric Cole [00:53:44]:
And that’s the scary part where really smart humans are basically coming up with ideas that the government doesn’t want public.

Unknown [00:53:52]:
Right.

Eric Cole [00:53:52]:
And yeah, I mean, we’re talking about AI and a scary world. To me, I’m not worried about AI, I’m worried about monetary currency, when all of a sudden it’s digital.

Unknown [00:54:03]:
Like that’s the thing. People don’t realize that a digital currency, like the digital yuan.

Eric Cole [00:54:08]:
Yep.

Unknown [00:54:08]:
I didn’t pronounce it right, but anyway. The Chinese central bank digital currency: it’s a form of total control. You don’t need to be locked down by a robot. You can be locked down by the government, which just says, you can’t get on the train, you can’t get on the bus, you can’t spend money at the market, because you just spoke out against us on social media.

Eric Cole [00:54:27]:
Yep. Wow.

Unknown [00:54:28]:
Programmable money.

Eric Cole [00:54:29]:
Yeah. And if all money is digital, think of how easy it is for me to steal.

Unknown [00:54:33]:
Yeah.

Eric Cole [00:54:34]:
I have so many people.

Unknown [00:54:35]:
Well, how much theft from hacking has there been in cryptocurrency?

Eric Cole [00:54:39]:
Oh, a ton.

Unknown [00:54:40]:
Tons.

Eric Cole [00:54:41]:
Yeah.

Unknown [00:54:42]:
Insane.

Eric Cole [00:54:42]:
Here’s the reality. A lot of people, Mt. Gox being the biggest example. Exactly. Their crypto wallets are protected with a single password.

Unknown [00:54:48]:
Right.

Eric Cole [00:54:49]:
I can’t tell you how many people call me in the morning going, Eric, my $5 million or $3 million of cryptocurrency has been stolen. What do I do? Here’s the reality. Cryptocurrency was actually created by criminals to support ransomware. Because think about a ransomware attack. If I go in and hold your data ransom and say, you have to pay me 500k or you’ll never see your data again, and you pay with a bank transfer, checks, or credit cards, it’s traceable. I know who it is. So they actually created cryptocurrency.

Eric Cole [00:55:25]:
So now you can actually pay with untraceable currency. So now when I give them 500k, it’s untraceable.

Unknown [00:55:31]:
That’s false about Bitcoin. Like, they can trace bitcoin pretty easily. They just can’t put a name to it, right? But to think that transacting with Bitcoin is anonymous is a lie. Well, not a lie. I’m not calling you a liar; I’m just saying people have a misperception about that.

Eric Cole [00:55:49]:
Right, at the first level. But if I get you to transfer 500k in Bitcoin or cryptocurrency and then I move it five times, it’s hard. You could trace it, but you can’t trace it back to me, right?

Unknown [00:56:05]:
You can trace it to a wallet.

Eric Cole [00:56:05]:
You can trace it to a wallet, but if they move that wallet and close it, you can never put a human to it. So you’re right: you can trace it to wallets, but not humans.

Unknown [00:56:15]:
In order to get that final money out, you either have to put it on a bank platform or transfer it out, and then you can figure it out. It’s this game of cat and mouse: it’s got to be transferred to another wallet somewhere in order to access it.

Eric Cole [00:56:29]:
But what if I transfer it to a wallet in the Cayman Islands, right? And then I go to the Cayman Islands, and there’s no extradition and no tracing of identity. You can never trace it back to a human. I can trace it back to wallets, but humans can be disassociated from the wallet. So who do you arrest?
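The tracing argument, that every hop is visible on-chain but the final wallet maps to no known person, can be sketched with a toy ledger. The wallet names, the hops, and the owner registry below are all invented for illustration; real chain analysis works on transaction graphs, not a simple dictionary:

```python
# Toy ledger: each wallet records the wallet it forwarded the funds to.
transfers = {
    "victim_wallet": "hop1",
    "hop1": "hop2",
    "hop2": "hop3",
    "hop3": "cayman_wallet",
}
# Investigators can only attach names to some wallets (e.g. via exchanges).
known_owners = {"victim_wallet": "Mark"}

def trace(start: str) -> list[str]:
    """Follow the chain of transfers from a starting wallet."""
    path, wallet = [start], start
    while wallet in transfers:
        wallet = transfers[wallet]
        path.append(wallet)
    return path

path = trace("victim_wallet")
print(path[-1])                                     # cayman_wallet
print(known_owners.get(path[-1], "unknown human"))  # unknown human
```

The path is complete, the identity is not: you can trace it to wallets, but not to humans.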

Unknown [00:56:47]:
Yeah, well, who do you go after? With quantum computing, is the bitcoin blockchain going to be safe? Like, how long? I’ve got some bitcoin, you know, and it’s made me a lot of money. How long is it safe from being hacked, in your opinion? Because they’re saying, you know, like, Michael Saylor came out the other day and said bitcoin’s going to be worth $13 million a coin by 2045. I’m like, that’s a long time to wait, but that sounds pretty good. In 25 years, I think quantum computing and AI are going to, like, probably obsolete Bitcoin. It’s possible, right?

Eric Cole [00:57:26]:
So I tend to be a little paranoid. I made a lot of money in bitcoin and cryptocurrency. In the last six months, I’ve pulled it all out.

Unknown [00:57:34]:
Have you?

Eric Cole [00:57:35]:
Because I think it’s going to be within the next 18 to 24 months, where just with what’s happening in the US and other areas, there are going to be other cyber currencies or laws where Bitcoin, to me, is going to become obsolete a lot quicker and a lot faster.

Unknown [00:57:51]:
Interesting.

Eric Cole [00:57:51]:
So I would.

Unknown [00:57:52]:
It’s either going to wipe out your investment or it’s going to be worth millions of dollars.

Eric Cole [00:57:55]:
Exactly right.

Unknown [00:57:56]:
Because it’s got that first mover advantage. Countries are using it, starting with America and El Salvador as a reserve currency. I mean, it’s hard for me to see a world where if the US has got a trillion dollars of Bitcoin, then it goes obsolete.

Eric Cole [00:58:13]:
But here’s the thing that’s already out there, which is heavily competing. And this was before he came into office: there’s the Trump coin. There is. And it’s basically.

Unknown [00:58:26]:
That’s a stablecoin, though.

Eric Cole [00:58:27]:
Yes, exactly. But here’s the problem. It’s already having a huge impact on Bitcoin. Really. And now that he’s president, what’s gonna happen in six to nine months? What stops him from making Trump Coin the official currency of the U.S.? I mean, it’s crazy talk, but come on. Crazier things have been happening in the last three months.

Unknown [00:58:46]:
So true. Man, that’s wild. Yeah. Well, I’m not quite ready to pull all my bitcoin out, but just keep.

Eric Cole [00:59:00]:
An eye on it.

Unknown [00:59:00]:
We’re going to keep. Oh, I do keep.

Eric Cole [00:59:02]:
Just keeping. Yeah, yeah. And here’s the thing. Make sure. And this is.

Unknown [00:59:06]:
My wife will kill me if I don’t pull it out and crash it to zero because I promised her I would use that to pay off some mortgages.

Eric Cole [00:59:12]:
Yeah.

Unknown [00:59:14]:
Probably should do that sooner than later.

Eric Cole [00:59:15]:
Yeah.

Unknown [00:59:16]:
I was buying it when it was $400.

Eric Cole [00:59:18]:
Yeah. I would definitely start to diversify now and just really watch it, because here’s the issue. When it goes down, it’s going to go down really fast. It’s not going to be gradual. It’s going to be six months and it’s going to go. The value is going to decrease very quickly and very fast.

Unknown [00:59:33]:
Especially I put some in Trump coin.

Eric Cole [00:59:36]:
All kidding aside, once again, I don’t want to be held liable, because I don’t want you to hunt me down and hurt me, because I know you’re a SEAL. I know you won’t hurt me, but if I was in your boat, I would actually split it between the two.

Unknown [00:59:48]:
Wow.

Mark Divine [00:59:49]:
That’s what I was hearing when you said that.

Unknown [00:59:51]:
I was like, is Trump Coin on major platforms like Coinbase already?

Eric Cole [00:59:53]:
Exactly. It is.

Unknown [00:59:54]:
Good God, I missed the boat on that. I heard they launched it. I thought it was just some meme coin, but they launched a meme coin. Is it the same as the stablecoin?

Eric Cole [01:00:03]:
Yes. So it actually evolved.

Unknown [01:00:05]:
It’s a meme stablecoin.

Eric Cole [01:00:06]:
It’s a meme stablecoin. Yes.

Unknown [01:00:08]:
Fascinating. Yeah. No, I’m not in disagreement with you, because I remember hearing that one of the big countries over in the Middle East invested a boatload of money into it.

Eric Cole [01:00:21]:
Yeah.

Unknown [01:00:21]:
I don’t know, Saudi Arabia or Qatar. Maybe it was Qatar. And then they gave him a $400 million plane.

Eric Cole [01:00:26]:
Yes.

Unknown [01:00:27]:
Or gave the United States.

Eric Cole [01:00:28]:
Yeah.

Unknown [01:00:30]:
Because Boeing can’t build our own presidential jet. Yeah.

Eric Cole [01:00:34]:
When I heard that the other day, I’m like, are you kidding me?

Unknown [01:00:36]:
It’s like it’s a famous.

Mark Divine [01:00:39]:
Do you think cyber attacks are more imminent now, like between countries? Like when you think of like biological warfare versus cyber warfare, you know, for domination and control.

Eric Cole [01:00:54]:
Absolutely. And here’s why. Great example: North Korea. We won’t let North Korea create a nuclear weapon, but they have cyber weapons that are doing so much damage and so much impact to the United States. What happens if North Korea all of a sudden takes control of our electricity grid or our water reactors? And now they basically say, unless you do X, Y or Z, or give us this amount of money, we’re basically going to take it down. Or, here’s the crazy part: we don’t let them have a physical nuclear weapon, but there’s no restriction on them taking over a nuclear reactor and turning it into a nuclear weapon. If you basically overheat and overrun a nuclear reactor, it turns into a nuclear bomb. And we’re not protecting or controlling that. And what people don’t realize is the last number I heard was 77% of the funding for North Korea is coming from cyber attacks.

Eric Cole [01:01:54]:
They’re one of the biggest ones where they’re basically stealing money from the US and using it to fund the government.

Unknown [01:02:00]:
Wow.

Mark Divine [01:02:01]:
Thank you.

Unknown [01:02:02]:
Yeah. What you just said, I think, is one of the big, I mean, huge risks. Although, like, it seems like there are risks all over the place that are huge.

Mark Divine [01:02:10]:
Yeah.

Unknown [01:02:11]:
Is our grid.

Eric Cole [01:02:12]:
Yes.

Unknown [01:02:13]:
Because it’s not resilient. It’s not decentralized, doesn’t have redundancies. It’s old technology. And you probably saw what happened over in Spain and France and Germany. You take the grid down, man. I did an exercise. We run a program, we’re going to run it again in September, called Leadership Under Fire.

Unknown [01:02:32]:
It’s basically like, Navy SEAL, what would you do? And we have them plan a mission and go out and do a mission, within reason, in town. It’s not like they’re carrying weapons around the mall or something like that. Although that’d be pretty cool if we did that one.

Eric Cole [01:02:44]:
Sign me up. I want to do it that way.

Unknown [01:02:46]:
But, like, what we do is a little, you know, a little inject scenario. Right. And so one, we have them, like, doing something, and then they’ll get an inject. Say the Internet just went down. We don’t give them any more information.

Eric Cole [01:02:55]:
Yeah.

Unknown [01:02:56]:
And then they have to get together and start thinking, okay, what are we going to do? And they’re like, well, how long? And, you know, we say, well, just like an unofficial response, we’re not sure, but we hope to have it back up within 24 hours.

Eric Cole [01:03:11]:
Yep.

Unknown [01:03:12]:
Or six hours. And so the planning is like, okay, yeah, we can get through six hours. We’ll figure out this and this, make sure the kids are safe, you know, blah, blah. Six hours go by, you know, of course, we don’t waste six hours. But another inject comes in, says, you know, we just got an alert from, whatever, the counter task force working this energy problem, and this is going to be at least a week. And now you’re thinking, oh, fuck, yeah, honey, we got to get to the store and get some water. We got to get some, you know, some supplies. Get the propane.

Unknown [01:03:42]:
Right. I wish I’d gotten that generator. Damn it. I better swing by Home Depot. Oh, the generators are already out.

Eric Cole [01:03:48]:
Okay. Yeah.

Unknown [01:03:49]:
All right, so a week. You’re thinking, okay, no problem, it’s like hunkering down for a long hurricane, you know what I mean? Or something like that. And then so they work that problem, and it’s completely different than the 6 to 24 hours. Like, completely different mindset. And then we come back and we say, a month. Sorry, it’s going to be at least a month.

Eric Cole [01:04:08]:
I knew where you were going.

Unknown [01:04:09]:
Or longer. And now panic. Everything you just did for. For the week is out the window.

Eric Cole [01:04:15]:
Yep.

Unknown [01:04:15]:
Now we’re talking prepper stuff. Yeah, like, oh, my God, where do we go? Do we have the supplies? You know, how many weapons do we have? Oh, I should have gotten the gun. I should have gotten the ammunition, because ammunition is actually, you know, just as important as the weapon. What about food? Do we know how to grow it? You know, some people are like, let me go to Montana. Well, the grow season is really short in Montana, and it’s really freaking cold.

Eric Cole [01:04:39]:
Yep.

Unknown [01:04:40]:
You know, so maybe not. You know, you need to get, like, two tanks of gas away from the major cities. Right. There’s a lot of thinking and planning that would go into that. And so what a fantastic exercise. Anyways, I went off a little bit of a tangent there, but it all comes down to this point: our freaking grid is very vulnerable. And North Korea or China or Russia, or them acting in collaboration, could they take it down? It seems possible. At the same time, the thing is glued together with bubble gum, glue, and rigger’s tape. And so maybe it’s just old enough and dumb enough that you can’t take it down.

Eric Cole [01:05:22]:
So here’s the reality. We’re sort of back to where we were with nuclear weapons. And what I mean by that is, if you remember the 80s, the Cold War, we had enough nuclear weapons to destroy Russia, but they had enough to destroy us. We knew that if either side did it, it was mutually assured destruction, that the world would end. So the reality today is they are in our grid, but we’re in their grid. So it’s one of those where it’s like, okay, if you come in and you take our grid down, we’re taking your grid down. And then basically it’s like paralyzing the world. We’re both screwed over.

Unknown [01:05:57]:
So, yeah, if the American economy goes down, guess what? The entire world collapses.

Eric Cole [01:06:02]:
But then here’s the thing I’m so concerned about. You look at what they did a couple years ago, where they did the trillion dollar infrastructure bill for physical roads and bridges. Where is the trillion dollar cyber infrastructure bill? Right? Because here’s the scary part. The United States built the Internet, so the backbone of the Internet is the United States. We can’t isolate our systems. Russia twice a year disconnects from the Internet, and they run their country on a private network disconnected from the world.

Unknown [01:06:34]:
Wow.

Eric Cole [01:06:35]:
China does it, North Korea does it. The United States? Every president for the last five terms has asked me the same question, and we don’t have an answer: how many connection points does the US have to the Internet? We don’t know, because we are the Internet. So what we should be doing is spending a trillion dollars building out our own new Internet that can be isolated and segmented, just like Russia, China and North Korea. But as long as we’re the backbone of the Internet, we are so vulnerable and so exposed.

Unknown [01:07:06]:
There’s another one of those.

Mark Divine [01:07:07]:
I know. Yeah, there’s a few things you’ve said today.

Unknown [01:07:11]:
I was just telling Sandy that I’ve been sleeping pretty well since my accident and I was getting back on track. Now I need a freaking Xanax.

Mark Divine [01:07:20]:
Well, some of the things you’ve said today are really mind stretching. Right. Because they’re not common thought, most people aren’t there yet, or they’re not common knowledge, even like what you just said. I didn’t know that. You know, I’m not very involved in tech or AI or anything like that. So it’s just like, oh, wow. We don’t have a separate network.

Mark Divine [01:07:40]:
Every other country does.

Unknown [01:07:41]:
We could shut it off to the rest of the world, right? Since we are the Internet, can we just shut it off?

Mark Divine [01:07:46]:
But they have their own.

Unknown [01:07:47]:
Or shut other people out.

Eric Cole [01:07:48]:
So we could. But that would also paralyze us.

Unknown [01:07:51]:
Right.

Eric Cole [01:07:52]:
So we can take down the Internet.

Unknown [01:07:54]:
It’s like killing the patient.

Eric Cole [01:07:55]:
Exactly.

Unknown [01:07:55]:
Kill the cancer.

Eric Cole [01:07:56]:
Exactly. Yeah.

Unknown [01:07:57]:
Interesting.

Mark Divine [01:07:59]:
Wow. And do you talk about this in your book?

Eric Cole [01:08:02]:
Yes.

Mark Divine [01:08:02]:
You do? Okay.

Unknown [01:08:03]:
You know, I can see a scenario where AI shuts it down. Well, if it fences itself off and has the energy, right, to supply itself. Because this is a way to, like, purge the earth of all the bad actors.

Eric Cole [01:08:15]:
Yeah.

Unknown [01:08:15]:
Or actually it’d probably work the opposite, right? All the bad actors would survive because they’re the most resilient humans, and all the, like, peace loving people might perish.

Eric Cole [01:08:24]:
Yeah.

Unknown [01:08:25]:
I shouldn’t be laughing when I say that.

Mark Divine [01:08:27]:
Yeah. Doctor, what is that?

Eric Cole [01:08:30]:
It’s the old. What are those? The Darwin Awards?

Unknown [01:08:35]:
Yeah, the Darwin Awards.

Eric Cole [01:08:36]:
Basically. You hate to laugh at it, but it’s basically where, I don’t know if you follow the Darwin Awards, but it’s where stupid people eliminate themselves from the planet. It’s just like doing the most ridiculous things, and it’s like, okay, you sort of feel bad, but limiting the gene pool might be a good thing. Right. But you go back to the AI, and we’re already seeing this: what if AI basically comes up with its own language and own way to communicate, and they create their own segment of the Internet and basically isolate humans?

Unknown [01:09:08]:
Right. Yeah. All sorts of cool things could go on.

Mark Divine [01:09:15]:
What’s the title of your book?

Eric Cole [01:09:17]:
Cyber Crisis.

Mark Divine [01:09:18]:
Cyber Crisis. So what are like besides your book, it sounds like would be a good one or a couple other books that you would recommend people to read to understand where we are at right now and where we might be in five to 10 years from now of understanding.

Eric Cole [01:09:35]:
One is Singularity.

Unknown [01:09:37]:
I’ve read that.

Eric Cole [01:09:37]:
That one’s really good.

Unknown [01:09:38]:
Ray Kurzweil, yeah.

Eric Cole [01:09:40]:
Cyber genetics is really good. It’s basically showing the merging of humans and AI together. It’s an older book, but it’s a really good book that lays it out in a very easy to understand model. And then, sort of, not directly, but Principles by Ray Dalio, where he basically talks about core fundamental principles to basically allow you to keep control of your information so AI can’t take over or dominate it.

Unknown [01:10:09]:
Okay, I never thought of principles in that regard, but yeah, yeah, I mean.

Eric Cole [01:10:16]:
That’s how I, like, I told Ray, it’s amazing, I said, because it’s basically showing you how to safeguard and protect your data from AI. And he’s like, no one’s ever described my book that way. And then he went back and he actually looked at it. He goes, yeah, I never thought about it, but it can absolutely be used in that context.

Unknown [01:10:32]:
Yeah, he probably should go back and add a chapter.

Eric Cole [01:10:34]:
Exactly. Or I’ll write a chapter for it, but.

Unknown [01:10:37]:
Give it to us. Fascinating. So what’s next for you, Eric?

Eric Cole [01:10:44]:
So my big thing is, I’m sort of at a point in my career where I’ve built and sold companies and made enough money that I don’t have to work. But I joke, the only reason why I work is because if I didn’t, my kids and family members would be driven crazy. That’s a good reason. If I’m home for too long, they’re like, when are you traveling again? Because, Dad, you’re just crazy. You’re off the wall, nuts. So I’m at a point now where I’m basically shifting to contribution, and I basically decided my mission for the next 10 years is to secure cyberspace, to make cyberspace a safe place to live, work and raise a family. So I’m really focused now on working with Kristi, who’s the new Secretary of DHS, working with Congress, really educating them. Because to me, the first thing we have to do in five years is we’ve got to get unified laws. If the United States doesn’t have unified laws on cyber and AI, it’s going to get scary.

Eric Cole [01:11:42]:
And then 10 years out, once we have unified laws in the US, work with the UN to have global laws to protect it. So that’s really the big thing I’m on now, the legal side. I sort of call myself the Ralph Nader, because if you remember Ralph Nader, Unsafe at Any Speed, he was focused on seatbelts and making cars safe. To me, I’m now making driving on the Internet safe. So I’m sort of trying to be the Ralph Nader of 2025.

Unknown [01:12:08]:
Yeah. Pointing out all the, pointing out all.

Eric Cole [01:12:09]:
The exposures and vulnerabilities. So people now start putting seatbelts in software and applications to protect the individual.

Unknown [01:12:17]:
Right. Well, your current book, when did Cyber Risk come out? Recently.

Eric Cole [01:12:22]:
Yeah, Cyber Crisis came out two years ago. Yeah, two years.

Unknown [01:12:24]:
And you’re going to write, are you really going to write 10 books in 10 years? I mean with AI it’s gotten a lot easier.

Eric Cole [01:12:31]:
Well, that’s the plan. So the next one, to give a little bit of a hint, is basically talking about the current cyber war. Because I believe we’re in a war now, and I think in two or three years the US will actually declare that we’ve been at war with China and Russia. So I believe we’re going to have a third world war.

Unknown [01:12:50]:
But it might not be kinetic entirely.

Eric Cole [01:12:52]:
It’s going to be all cyber based and it’s all going to be AI. And basically that’s why they took soft.

Unknown [01:12:58]:
Power off the fence.

Eric Cole [01:12:59]:
Yeah, yeah. That’s interesting because I mean you even look at some of the stuff that’s going on now between Israel and Gaza, I mean it’s all cyber based.

Unknown [01:13:08]:
Right?

Eric Cole [01:13:09]:
I mean, you look at what Israel did, where they basically put explosives in pagers and cell phones. I mean, that’s, like, off the wall high tech. And then basically you just send code to basically activate or deactivate. So, I mean, you’re looking at what’s happening in these high end wars. It’s all cyber based attacks. It’s not physical weaponry anymore.

Unknown [01:13:27]:
Right. Fascinating. Time to be alive. So as a CEO listening or someone who really wants to get in touch with you, want to hire you or do a speaking engagement, how do they do that?

Eric Cole [01:13:41]:
So the best bet is to email me: ecole@secure-anchor.com. That’s e c o l e at secure-anchor.com. But also, what I recommend to folks, I’m very big at giving back. So if you look at my Instagram or YouTube channels, drericcole, D R E R I C C O L E, I give away a lot of free content. So I sort of urge, we talked before we started about sort of ethical sales, my whole thing is: look at what I give you for free and use that first. And then if you want to go to the next level, then let’s talk about hiring me. But I’d rather give you the stuff for free. You do the basics and then bring me in for the advanced work.

Unknown [01:14:20]:
I kind of said the same thing. I really like working for people who already understand what I do.

Eric Cole [01:14:24]:
Exactly. Yeah.

Unknown [01:14:25]:
And. And so it’s not like entertainment.

Eric Cole [01:14:27]:
Yeah.

Unknown [01:14:27]:
You know what I mean? I don’t want people to hire me because I’m a cool Navy seal. Although I am a cool Navy seal.

Eric Cole [01:14:32]:
You are cool. Yeah.

Unknown [01:14:34]:
But that’s not why I want to be hired.

Eric Cole [01:14:35]:
I’m the same way. Learn the basics and bring me in for the advanced work.

Unknown [01:14:39]:
Yeah. That’s awesome. Eric. Thanks so much for being here. It’s been a fascinating conversation.

Eric Cole [01:14:43]:
Right, yeah, always my pleasure. I love talking about this stuff. So thank you for having me, and I always appreciate coming out to the West Coast.

Unknown [01:14:49]:
Yeah. So enjoy your time out here and have fun over at Garrett’s thing in Utah.

Eric Cole [01:14:53]:
Oh, yeah, exactly.

Unknown [01:14:54]:
I’ll definitely do Garrett White’s Wake Up Warrior conference.

Eric Cole [01:14:59]:
I’ll definitely tell him. I just came from an interview with.

Unknown [01:15:01]:
Interview with Mark and Mark said to say hi.

Eric Cole [01:15:03]:
Okay.

Unknown [01:15:04]:
Yeah, I appreciate him very much.

Eric Cole [01:15:06]:
And they might add a few other things, but we will put that on the air. Awesome, buddy.

Unknown [01:15:12]:
Well, thanks so much for tuning into the Mark Divine Show, folks. I appreciate you walking this path with me, learning how to lead, grow, and make more positive impact in the world. If today’s episode with Dr. Eric Cole moved you, please share it with a fellow warrior and leader. Please leave a review wherever you listen and rate the show, because it’s very helpful. If you’d like to go deeper with our training, after you’ve studied the free stuff, go to unbeatablemind.com and explore how you can work with me directly in our personal leadership development and transformation program. You can follow us on Instagram at Mark Divine Leadership. And if you have any comments about this episode, leave them there.

Unknown [01:15:56]:
So until next time, please forge an unbeatable mind. Hoo ya. See you then. Divine out. Thanks, bro.

Eric Cole [01:16:06]:
Thank you. Hopefully that’s what you were looking for. I thought it was some fun conversation. Yeah, no, I thought it was awesome. Okay.

Unknown [01:16:11]:
Really awesome.

Eric Cole [01:16:11]:
Like.
