[00:00:15] Matt Clifford: Hello, and welcome to The Entrepreneur First Podcast, where we uncover the stories and goals of some of the world’s most ambitious founders. My name is Matt Clifford, co-founder of Entrepreneur First and your host for this episode. Today we’re talking about the challenges of moderating the oceans of content that are published on the internet every day. Our guest for this episode is Sasha Haco, CEO and co-founder of Unitary, a company that uses AI to protect people from harmful material online. Sasha has an amazing backstory, and she and the team are building something important and valuable at Unitary. So sit back, relax and tune in to this conversation with Sasha Haco.
At Entrepreneur First, we get to meet and work with ambitious individuals who have excelled across an extraordinary range of fields before they decided to become founders. Sasha was pursuing a PhD before she dived into entrepreneurship, so I asked her to tell us a little bit about what she studied.
[00:01:19] Sasha Haco: So I joined EF in 2019, pre-pandemic. Before that, I was finishing my PhD, so I was in the world of academia. I did a PhD looking at black holes, so completely different and sort of irrelevant, and then spent a year in America, loved it, and really enjoyed that side of thinking about really hard problems.
What I realised was that I enjoyed the problem solving, thinking about difficult problems, more than I was fascinated by black holes in their own right, and I started thinking, actually, I would do something a bit more relevant and impactful. There’s a sort of horrible realisation with black holes, maybe it’s just the fact that they’re so, you know, out of this world: I realised that no matter what I did, and however brilliant I could possibly be at black holes, I wouldn’t have that much impact at all. It would take, you know, maybe my whole lifetime or more before anyone realised the impact of anything that I did, even if I did amazingly. That felt a bit unsatisfying, so I started thinking that I wanted to do something more relevant and impactful for the world. And that’s what made me start thinking about EF, really.
[00:02:30] Matt Clifford: As a fun detour, for me at least, I asked Sasha to tell us what exactly a black hole is.
[00:02:35] Sasha Haco: So a black hole is basically an area of space where the matter is so dense that gravity is pulling everything in, and there’s so much matter that not even light can escape its gravity. Gravity causes everything to, you know, attract each other, and you can imagine something where the gravity is so strong that even light is pulled in. And because we see things by light reflecting off of them, we’re sort of destined to never be able to see a black hole, because any light that bounces off them will never come back; it will always fall in. And so that’s why black holes are called black, even though they’re definitely not a hole. It looks like empty space, because we could never see that part of the universe. And all the work I was trying to do was to figure out, you know, what happens to stuff when it crosses this boundary.
Actually, when I first started my PhD, I signed up to work with this brilliant PhD supervisor, and I thought I was going in researching string theory. So I spent the summer reading up about string theory, and then in September, just before I started, I remember there was this big u-turn. My supervisor said, “Actually, there’s a really exciting project that’s about black holes, let’s do that instead.” So it could all change. And it was just good luck, really.
[00:03:43] Matt Clifford: An interesting fact for everyone listening is that Sasha is also featured in a Netflix documentary called “Black Holes: The Edge of All We Know”. I asked her to tell us what it was like making this film.
[00:03:52] Sasha Haco: And this is so funny. So during my PhD, I was working on this problem called the Black Hole Information Paradox. This is a big problem in physics, and it’s been around for about 40 years or so. The paradox was discovered by Stephen Hawking, and it was something he’d been wanting to solve his whole life. The problem is essentially that when something falls into a black hole, it appears to disappear forever, and all information about it is lost. And that fundamentally contradicts what we know about how the world works. So it’s a sort of paradox.
Stephen Hawking discovered this paradox and was desperate to solve it, and I was very lucky to be part of his team of people trying to solve it. During that time, this producer and filmmaker in America, who’s actually also a physicist and philosopher, decided he was making a documentary about some of the work we were doing, and also about another physics group, which I didn’t even know about at the time. Over the course of several years, this camera crew would be like flies on the wall while we were just working. And I remember thinking, you know, I don’t know who’s going to want to watch this, because it’s just us working for thousands and thousands of hours. Obviously, they did a lot of editing and got rid of a lot of the boring stuff. But it was funny, because at the beginning I remember feeling a bit self-conscious, thinking, you know, I should look nicer; by the end it didn’t even occur to me. And it was amazing because they came with us: we went to Wales for this sort of physics retreat and they came with us, we went to America and they were there, and they came back to the UK. So they followed us around. I actually had no idea that it was going on Netflix, or that anything would come of it, until suddenly people started messaging me saying, “I think I’ve just seen you on Netflix,” and it was really bizarre. Hopefully it’s quite interesting. It’s called “Black Holes: The Edge of All We Know”.
[00:05:37] Matt Clifford: I have to say, that amusing aside, it’s clearly generated quite a lot of interest in your group. And, you know, we were joking internally the other day, because we were looking at our traffic logs. I think I already told you this, Sasha. We were looking at what the most common search terms are that lead people to the EF website. Unsurprisingly, the number one is people searching for Entrepreneur First. But people searching your name is something like the fourth most common way that people come to our website from a search engine. So I think it just shows either how niche we are or just how on-message the film is.
I feel I have to ask you as well, before we move on to the transition into becoming a founder, which is obviously what this podcast is mainly about. Stephen Hawking is this iconic figure, probably one of the very few physicists that people who are not in academia know about. What was it like to work with him?
[00:06:28] Sasha Haco: It was brilliant, it was amazing. I was very, very lucky to be part of that, and especially working on something that he felt really passionate about, because he was really engaged. I worked with him for the last few years of his life, and that was a really amazing experience, because this was his problem. And he had amazing intuition.
He was a man of few words, as you can probably imagine, because speaking was a very difficult thing. And so he would say something, and then you’d read into it for ages and ages, because these few words probably meant so much. I remember having a few conversations where I’d ask him questions, and he’d give me this very short answer, and I had to go away and try and work out what it meant.
And usually there was some very profound insight that often I didn’t realise or notice until after we’d solved the problem. Then I’d realise that’s what he’d already intuited ages ago, which was an amazing thing. And he was also up for a good time and had a lot of fun as well.
[00:07:27] Matt Clifford: One thing I’m always fascinated by is understanding what drives an individual to become a founder. So I asked Sasha what her transition to entrepreneurship felt like, coming from such a deep academic background.
[00:07:39] Sasha Haco: In some ways, I did get quite a good training for being a founder. Part of that is trying to take complex ideas and explain them to other people. That’s just part of academia: the things you’re working on are part of a collaboration. And actually, in some ways, I feel like I do that all the time now, especially with the CEO hat on.
I guess we’re a deep tech company, and we’re solving hard problems with complicated technology, and how can you take that and talk about it to, say, a sales team or a customer success team at a company whose background is not in computer vision, or whatever? So I think that’s been really helpful. Other than that, it’s probably the training in thinking about hard problems and abstract issues, which was interesting and has helped a lot.
[00:08:31] Matt Clifford: It does remind me of the sort of things that people say about Jeff Bezos, where, you know, his warehouse logistics team brought this incredibly complex problem to him and were explaining it, and they had all their experts. And he’s like, “No, that’s not it, I think it’s probably something like this,” and then they would go away. And annoyingly, he would be right.
[00:08:47] Sasha Haco: I guess, like a lot of the brilliant entrepreneurs of the world, they’re just amazingly smart people who can see things differently, get to the nub of an issue faster than anyone else, and understand how to solve it. So I feel like that’s probably true of a lot of brilliant entrepreneurs, and of Stephen Hawking as well. I don’t know if he would have had much interest in entrepreneurship. He was really interested in doing social good, and he was involved in quite a lot of outreach and charity things as well. So it wasn’t just purely black hole-focused.
I feel like as a founder, I’m in the deep end, learning something totally new every single day, which is something I actually love. Every day is a real challenge, and I love that my learning curve is really, really steep. It’s something I didn’t feel so much in academia: I almost felt my pace of learning was slower, even though, you know, I was researching something that was totally new and uncharted territory.
In some ways, I feel like my pace of learning now is much higher, and every day there’s a big problem to solve. I love the iterative way that you encounter something, you figure out how to solve it, it goes disastrously, so then you find a new way of solving it, and you iterate. I love that part of it. Something that’s quite surprising is the cognitive challenge I find in thinking about all parts of company life and company building. How can you, you know, recruit a team and make sure they love what they do and love working with you? That’s something I just never really thought about before.
[00:10:17] Matt Clifford: I think one thing that really resonates there is that the role of a founder is almost to have to learn a whole set of things very quickly, with sort of existential stakes: if you don’t learn them, it’s really possible to kill a company, especially a young one, by not being able to do something. I remember that when I first met Sasha at EF, I was interested in whether there was any connection between black holes and content moderation, or whether the idea had been a bolt from the blue, so I asked her.
[00:10:47] Sasha Haco: No, there’s no very direct link between black holes and online safety, that’s definitely true. But I came in thinking, I really want to do something really useful. I wanted to solve a hard problem that has a lot of impact, and I didn’t know exactly what I wanted that to be. So I was meeting lots of different potential co-founders, people I might work with, and exploring ideas.
And I met James there, who’s my co-founder. He’s a brilliant computer vision expert, and had done a lot of work in analysing and understanding images and videos with AI. We got on well and came together thinking about, you know, what we could do together. And one of the things that we came upon was this content moderation problem. Essentially, we felt that there are so many videos that people upload to the internet that are harmful in some way, and there are a lot of people involved in the process of removing them: either users on the platform who see something and report it because it’s harmful, or the manual moderators who then have to review it themselves. We felt that a lot could be done with technology to automate much of that process. So we started getting interested in this problem, and I started reading more about it. The more I read, the more I realised that this is an extraordinarily enormous problem, which I just hadn’t conceived of before I started looking into it, and the more passionately I felt that this is a problem that should really be solved.
James actually had done some work with Facebook before Entrepreneur First, so he’d been exposed to these sorts of problems. He was also a community moderator on Reddit, so he was quite aware of what it’s like to be a moderator, even though he wasn’t being faced with really harmful content all the time. He was, you know, very often taking stuff down and checking content, and knew that whole process. With hindsight, it seemed like the perfect problem.
[00:12:31] Matt Clifford: Some people, when they join EF, know exactly what role they think they’re gonna play in their startup. They know they’re going to be the CTO, they know they’re going to be the CEO. I wasn’t sure how Sasha found hers. So I asked her what her goals were before she walked into EF.
[00:12:44] Sasha Haco: Coming into EF, I really had no idea what I was getting myself into, in some ways. I didn’t know much about the startup world; I didn’t really know what a CEO did or what it meant to be CEO or CTO. So I came in quite open-minded, I think. I’ve really enjoyed speaking with people and communicating ideas, so I felt that maybe being a CEO was the right thing. I’m also not a coder in that sense, so I thought maybe I was more suited to the CEO role, but I had no real sense of what that meant.
[00:13:14] Matt Clifford: Sasha’s company Unitary uses AI to moderate content online. One estimate is that over a trillion megabytes of data is uploaded to the internet every day, which makes content moderation an extremely challenging task. Unitary detects harmful content using computer vision and graph-based techniques, in order to make the internet safer. I asked Sasha to tell us how her business has evolved over the last few years.
[00:13:40] Sasha Haco: I feel like it’s changing all the time. And I go through these phases of feeling like if we don’t solve this tomorrow, the whole company might die. It’s probably a massive overreaction, but there’s that feeling all the time: I have these sleepless nights thinking, if I don’t figure this out, then, you know, it’s really going to be a car crash. I guess that never goes away. But the reason for it changes all the time.
So right at the beginning, it was working out, and this actually stayed for a while: what is our product? What is it that people actually want? I knew that we were solving a really hard problem, but it wasn’t clear to me that we really understood what the first product would be that would really sell, that people would really buy. That was a really hard thing, and I just knew that if we didn’t figure it out sooner or later, there was nothing. I guess that’s something every company goes through: you have this idea and you have to work out, how do you hit product-market fit? What’s the thing that you need? So that was more of the challenge, I guess.
And now, as we’ve started to develop a bit, we’ve started to find out what it really is that’s exciting about what we can do and how we can deliver value. Now the stressful thing is execution, and making sure that we can go from this small company, which is building technology and figuring things out, to one that can really deliver and over-deliver. That’s a very different kind of stress.
[00:14:54] Matt Clifford: As companies evolve, so do their founders. To keep up with the demands of their customers, their team, their investors, and to grow their company, founders have to embrace the idea of consistently and continually learning and educating themselves. I asked Sasha, what her own learning process is like.
[00:15:13] Sasha Haco: Definitely trying to talk to people as much as possible, trying to find out what they do and how they do it. But also, if I can, I like to try and witness a brilliant person doing that thing in action, hoping that just by being around them I can pick up a few nuggets of wisdom, like trying to watch someone be a great salesperson. And then with selling, you know, there are these companies that sell to startups, so people sell to you all the time, and I reflect on it. Sometimes I think, wow, I really want to buy that product, even if it’s actually completely irrelevant and useless to what we’re doing and we don’t need it at all. And I think, “Wow, that was just a great salesperson. How did they convince me of that?” So with that kind of thing, I’m trying to collect examples where I’ve seen something that really works.
And also I’ve become a bit of a reader of startup books, and I love following other startup journeys. Something I’ve found so helpful is keeping in touch with founders who are just a couple of steps ahead of me: maybe they’ve been around for one or two more years, or are one funding round ahead. I try to ask, what happened to you when you did this for the first time? Their mistakes are still very raw, so they’re able to give a lot of advice, which has been really helpful.
[00:16:24] Matt Clifford: Videos are just one format of content that can be uploaded online, and may contain damaging, violent or offensive messages. Sasha explained the challenges of having to contend with text, pictures, audio, illustrations and more as she builds out the business.
[00:16:38] Sasha Haco: So today, people upload stuff to the internet all the time, and that’s a great way of communicating with everyone. And obviously, since the pandemic, everyone relies so heavily on the internet for everything. But there are also a lot of people who misuse and abuse the internet to upload stuff that’s designed to harm or offend people. Or sometimes it could even be inadvertent, but something ends up in the wrong place.
So for example, there are certain things that just shouldn’t be posted on children’s platforms. What that means is that in order to keep communities healthy and platforms thriving, you need to have some way of removing this kind of content. If a platform becomes overrun with harmful content, people will stop using it. So there’s a big incentive, from the platform’s point of view but also for users, to have an environment which is safe. And I remember reading in a UK government report that several terrorist attacks in the last few years had some link to online discussion and conversation. There are often cases where, had we been able to pick up on things online, it could potentially have prevented things from happening. So people use the internet for so many different reasons, and being able to keep on top of the good stuff, and also manage the stuff that is maybe less healthy, is super important.

Scale is, of course, the biggest issue. But something else interesting is that the internet isn’t localised. People can communicate from one place in the world to another; obviously, that’s amazing and brilliant. But it also means that people are posting stuff from all over the world, and it can then be accessed all over the world. So you have this challenge where there are so many different languages. And, you know, obviously that’s a brilliant thing, but at the same time, even from a manual moderation point of view, in certain parts of the world it’s hard to get enough moderators who speak a given language. So that’s just a real challenge.
So Unitary is trying to solve this problem, and we’re focusing particularly on videos: how can we automatically identify that content within a video is problematic or contains something unsafe in some way? What we’re trying to do is take a video, automatically say what’s inside it, and label the video with what it is. We don’t want to be the people who decide if something is safe or unsafe; we’re not the arbiter of what is right and wrong. We just want to give information back to the platforms about what’s inside something, to be able to say, “This video contains x, y and z,” or maybe give it some label, and then that platform can match that up against their terms and conditions and decide whether it’s acceptable or not.
[00:19:07] Matt Clifford: Oh, you know, this is a five-minute video, it has sort of a scene of graphic violence in it, or something like that.
[00:19:14] Sasha Haco: Exactly. So this snippet of video contains graphic violence, and we’ll give it a risk level; maybe it’s a very high-risk type of thing. And then, depending on the platform (although most platforms wouldn’t want extremely high-risk violence), they can decide whether that’s appropriate for them.
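The split Sasha describes, where Unitary only reports labels and risk levels, and each platform applies its own terms and conditions, can be sketched roughly like this. All names and thresholds here are illustrative, not Unitary's actual API:

```python
# Hypothetical sketch: a classifier reports what a video contains, and each
# platform supplies its own policy mapping those labels to a decision.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str   # e.g. "graphic_violence"
    risk: float  # severity / confidence score in [0, 1]

def flag_for_platform(detections: list[Detection],
                      policy: dict[str, float]) -> list[str]:
    """Return labels that breach this platform's own thresholds."""
    flagged = []
    for d in detections:
        threshold = policy.get(d.label)  # a platform may not restrict a label
        if threshold is not None and d.risk >= threshold:
            flagged.append(d.label)
    return flagged

# A children's platform would set far stricter thresholds than a news site.
kids_policy = {"graphic_violence": 0.2, "profanity": 0.3}
news_policy = {"graphic_violence": 0.9}

video = [Detection("graphic_violence", 0.75), Detection("profanity", 0.1)]
print(flag_for_platform(video, kids_policy))  # ['graphic_violence']
print(flag_for_platform(video, news_policy))  # []
```

The point of the design is that the same detections lead to different decisions per platform: the model never encodes what is "acceptable", only what is present.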
So it’s an analysis of this video and any other signals that we can get our hands on. We look at the video itself: we have computer vision models that understand or analyse the frames of the video. But we also look at things like the audio and any additional text, maybe titles, captions or comments that accompany a given post, and we analyse all of that together and try to understand: what is this post about? What does it really say?
A really big challenge is that different contexts can lead to totally different interpretations of the same thing. Something posted with one caption could be totally benign, and with another caption it could be really racist, for example. So it’s really important to understand how to interpret the title, or the audio, in conjunction with the visual signal. It’s a really hard problem from that point of view.
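The multimodal idea Sasha outlines can be sketched as a toy late-fusion model: per-modality scorers each produce a harm score, and a combiner weighs them together so that context, such as the caption, can change the interpretation of the same visual content. This is purely illustrative and not Unitary's actual architecture; the scoring functions are stand-ins for real models:

```python
# Toy late fusion: combine per-modality harm scores into one decision signal.

def visual_score(frames: list[str]) -> float:
    # stand-in for a computer vision model over sampled video frames
    return 0.6 if "fight_scene" in frames else 0.1

def text_score(caption: str) -> float:
    # stand-in for a text classifier over titles, captions and comments
    return 0.9 if "slur" in caption else 0.05

def fused_score(frames: list[str], caption: str,
                w_visual: float = 0.5, w_text: float = 0.5) -> float:
    # weighted combination: the caption can shift how the visuals are read
    return w_visual * visual_score(frames) + w_text * text_score(caption)

same_video = ["fight_scene"]
print(fused_score(same_video, "movie stunt rehearsal"))   # 0.325, likely benign
print(fused_score(same_video, "slur targeting a group"))  # 0.75, likely flagged
```

The same frames yield different fused scores depending on the accompanying text, which is exactly the context-dependence the interview describes; real systems would learn the fusion rather than hand-weight it.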
[00:20:23] Matt Clifford: I was also interested in whether there was any link between the technical methods that Sasha used during her PhD research in black holes, and the techniques that Unitary uses to moderate content online.
[00:20:35] Sasha Haco: So I think ultimately, yes, there is a connection: the fundamental way of solving these problems, I think, will involve a lot of graph-based approaches and understanding how things connect with each other, and that’s very relevant to the kind of math I was doing before. But at the level of an individual person or an individual video, we’re looking really at computer vision and natural language understanding, and how we can combine these different models. And that’s a pure machine learning problem.
[00:21:07] Matt Clifford: The internet has had some pretty dark corners ever since it existed. I was curious to find out why content moderation has gained so much prominence and assumed so much importance in recent years.
[00:21:18] Sasha Haco: I think it’s definitely the case that people have been trying stuff for ages, and some things have changed, but there’s still a long way to go; it’s not that we can do everything. One element of it is just that models have advanced: there are better ways of understanding video, there are better computer vision models. The state of the art has just moved on, and compared to five years ago it’s a different world, even in academia. I think that’s really exciting. Especially in the world of video, the state of the art is changing very rapidly, with a lot of progress all the time. That means every year we’ll be able to do things that we couldn’t before. So that’s one thing: the actual academic techniques that you might apply.
But also, the kinds of things people are worried about are constantly changing. And I guess this isn’t so much why we can do it now, but why it’s an ongoing problem. You know, people invent new ways of being racist or abusive; there are all sorts of different and new ways to be harmful. I guess with every new precedent there’ll be another way that people will be offensive. So things are constantly changing, and you have to really stay on top of them.
I think that’s one big issue that we have to grapple with: you can’t ever just stop. You can’t say, oh, you know, we’ve built this model and now we’re done. It requires somebody to constantly stay on top of it. But I think the other big thing is that, even though people have been talking about it for years, the incentives are changing, and the need and desire to do something about online safety is really starting to change.
And that’s partly because of legislation: at a government level, a lot of countries around the world have decided to implement changes to say that actually, it is the responsibility of the platforms to make sure they have the right measures in place. So that’s one big driver. The other is that people are starting to feel more strongly about online safety, and that’s creating a totally different incentive for platforms to do something about it.
I’ve just seen a real change in how willing people are to even engage in that discussion. It’s platforms who are trying to remove that content, whether that’s because they care about their users or because they’re worried about government legislation. Or, and this is another thing that’s really changed recently, advertisers have started saying that they’re not happy advertising on platforms which have unsafe content. This has been building for a while, I think, but advertisers have started actually boycotting major social networks. There was a big boycott in 2017, so it’s not super recent, but the feeling is really getting stronger that if platforms don’t have better moderation practices, then advertisers will stop advertising. And this really starts to affect their bottom line as well. Obviously user safety is really important, but so is revenue, and that changes the dynamic.
[00:24:05] Matt Clifford: Content formats are obviously constantly evolving online; in recent years TikTok videos and Instagram Reels have been hogging the limelight. I asked Sasha what she thinks about these new formats and what challenges they represent.
[00:24:17] Sasha Haco: I do think there’s this trend that’s gone from text to images to video. And now the next thing is live video; that’s what I think is almost the future. All the big social networks are starting to include live video, and there are all these other live video platforms where people can just interact, you know, in groups, in a live environment. I think that’s going to be the next thing. And that’s an enormous challenge in terms of moderating quickly and without spending a tonne of money. It’s going to be a real challenge, but that’s definitely where the world’s going, I think. So I definitely see live video as a big one. I also see audio in itself as something that’s taking off: obviously there’s Clubhouse and things like that, but podcasts are obviously a big thing too.
And things happen really quickly, with limited time to react. That’s definitely a challenge for live video: someone could put something online and go from totally safe one minute to totally unsafe in no time at all. And because people upload stuff all the time, and quickly, you have a real arms race; it’s a real challenge to get there quickly enough. As for the length of videos, it’s hard to tell if things are getting longer or shorter. TikTok is short-form video, but I know that they’re increasing, or might increase, the length of videos people can put up. So there’s a trend in both directions.
[00:25:29] Matt Clifford: It strikes me that one of the big challenges is that in the last decade, big tech has gone from being broadly, you know, the hero industry (it’s actually still quite popular with the general public, if you look at polls) to becoming a bit of a whipping boy for politicians and regulators around the world. There’s a lot more suspicion, sort of suspicion by default, of some of the big tech platforms in particular, both in the US and in Europe. If you’re a Facebook or a Google or Twitter, whatever, that changing political and regulatory context has got to make this a very charged issue and environment.
[00:26:12] Sasha Haco: For sure, it is continuing to change, and people are feeling more and more strongly about it. So there’s a lot of pressure, I think, for platforms to act, and to react when things go wrong, and not just sit back and, you know, put the blame on the user, for example. It’s becoming a bigger and bigger issue. And certainly regulation is coming in, which is going to force a lot of change. I think it’s going to be really hard for a lot of platforms to react to that regulation, but there are going to be very big changes enforced.
[00:26:45] Matt Clifford: The vision of a safer internet is a compelling one. So I asked Sasha to elaborate on her plans for the company.
[00:26:51] Sasha Haco: So we’re just working furiously hard on the tech side, trying to build our tech so that it’s really scalable. That’s our biggest challenge: how can we scale up from millions to tens of millions of videos processed every day and keep our models accurate? Another big thing is how we can increase the scope of what we’re able to understand, in terms of the different languages we can identify and the different types of content we can classify. So that’s a big thing on the tech side, and hopefully that will enable us to start to look at a huge amount more content. We’re also hopefully going to start to work with several different social networks, big and small, over the next year to really try and tag and flag up content that is inappropriate in some way.
[00:27:37] Matt Clifford: I also asked Sasha to share her views on what advice she would give to a new person entering the world of entrepreneurship for the first time.
[00:27:45] Sasha Haco: That’s a good question. There are so many things I know now that would have made things easier. Like a brilliant piece of advice I was given, so I can’t claim it as my own: Ian Hogarth told me this, and I don’t even know if he knows how much I internalised it. He said that one of your jobs as a CEO, when trying to sell into a company, is to make such good friends with somebody internally that they feel obliged to ring you up and let you know if the deal’s going wrong. And that really stuck with me. There have been so many times when you think something’s going the way you want it to in a sales situation, and then it doesn’t, and you don’t really understand why. Having somebody who just likes you as a person, and feels bad if they don’t end up doing what they said they would, so they feel they owe it to you to tell you what’s going on, helps so much. And that’s something I think is actually super important.
[00:28:42] Matt Clifford: That brings us to the end of this episode of the Entrepreneur First Podcast. Hope you enjoyed listening to Sasha’s story and her experience building Unitary so far. Join us next time when I’m going to be speaking to Andre Lorenceau, co-founder and CEO of Divigas and Leon Farrant, co-founder and CEO of Green Li-ion. They’ll both be telling us about how technology can help alleviate some of the impact of the climate crisis. If you enjoyed this episode, please do subscribe on Apple podcasts, Spotify or wherever you listen.
For more information about Entrepreneur First, visit joinef.com.
Big thanks to Cofruition for consulting on and producing the podcast and thank you for listening. Speak to you next time.