
Caleb Sima, CSO Robinhood, on dealing with humans

Guest advisor Caleb Sima is Chief Security Officer at Robinhood – the well-known stock brokerage app you may already have on your phone. He shared his thoughts on dealing with humans with our CyRise Elevate members.

Over a 20+ year career, Caleb has co-founded three companies, been a hacker, a CTO, a CEO, and is now a CISO and Board Member. An independent thinker with a curious mind, Caleb claims he went from a technical role to a CISO role because, “there’s always something new to learn”.

A widely recognised technical expert in identification of emerging security threats and penetration testing, he’s also a candid character who doesn’t mind a bit of a joke:

“I’m here to give investment advice and talk politics,” he laughs.

On success

We asked Caleb what qualities he thinks helped him succeed. Despite his modesty, he still delivered us some nuggets of wisdom on success:

“What has helped me to become successful? I don’t know! I don’t even know if I would say I am a successful security leader, to be honest. I tend to be able to build good teams, the people around you are key. I hire people I think are smarter and better, then move things out of their way! Unblock them. That’s what I do every day.”

Top three pieces of advice for CISOs

Given that CyRise Elevate is all about getting better, faster, we grilled Caleb on his lessons learned from a career in security. Interestingly, most of his advice was not about security tools, workflows, or technical matters, but instead centred on dealing with the humans around you.

Here are Caleb’s top three pieces of advice for security leaders, on working with humans:

1: Humans need stories – not security information

Don’t overload your board with facts and figures

When engaging with the Board, one of the mistakes CISOs make is overloading them with information. “A lot of people come to the board with metrics and numbers and statuses. In my view, the board is not a status update, the board is about telling a story,” says Caleb.

Tell a good story

“Tell a story about what’s most critical, what you need, and how the board can help you get that done. You should pick one, two, maybe three things that are top of mind, and tell this in a story – not in metrics.”

Need metrics? Try maturity models

“I lean towards maturity models because they’re impactful,” says Caleb. “For example, I spent my first 90 days at Robinhood identifying our crown jewels, and working out what protections they had around them. My metrics were, ‘Ok, how many jewels do we have, and what level of maturity are we at protecting them?’ Once you have that information, you can work out your overall security maturity, and tell that as a story to the Board.”
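As a rough illustration of the kind of metric Caleb describes – not his actual methodology – a crown-jewels maturity roll-up might look like this (the asset names, levels and function are all hypothetical):

```python
# Hypothetical sketch of a crown-jewels maturity metric: each jewel gets
# a maturity level (0 = no protection, 5 = optimised), and the "story"
# numbers are how many jewels exist and how many meet the target level.
from statistics import mean

def maturity_summary(jewels: dict[str, int], target: int = 3) -> dict:
    """jewels maps asset name -> maturity level (0-5)."""
    at_target = [name for name, level in jewels.items() if level >= target]
    return {
        "jewel_count": len(jewels),
        "average_maturity": round(mean(jewels.values()), 2),
        "at_or_above_target": len(at_target),
    }

# Illustrative data only, not Robinhood's actual assets or scores.
crown_jewels = {"customer PII": 4, "trading engine": 3,
                "crypto custody": 5, "internal wiki": 1}
print(maturity_summary(crown_jewels))
```

The point of a roll-up like this is that it compresses into two or three numbers that fit inside a story, rather than a page of raw statuses.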

2: Don’t educate developers on security

No-one likes security

“Security is a pain in the ass. No one wants to do it. It sucks!” laughs Caleb, with refreshing candour. “Personally, when I code, I can see what I’m doing wrong, but I just need to get the project done – then I go back and fix it. Building things is fun, security is painful.”

It’s not useful to teach developers security

Caleb points out that even security professionals don’t know everything about security. “I asked a room of security people to name the OWASP Top 10. No one could do it. This is a room full of security people! So how do we expect engineers to know cybersecurity, in addition to all their other priorities? Education is good, but that’s a long game. Don’t expect to fix engineers by educating them.”

Remove security problems from developers

“Your job isn’t to educate developers on security, but instead to abstract security problems away from them. How can they build a safe product without knowing security? My dream is that our intern at Robinhood could build a product and we could release it knowing it’s safe by default,” shares Caleb.

3: Can’t get your way? Escalate or contain

Disagreement on what poses a security risk is normal

When asked what to do when security priorities conflict with engineering priorities, Caleb had some cut-and-dried advice:

“Our job is to manage risk for the company. If there’s a major breach, then I haven’t done my job. So I think friction between security and the rest of the organisation is okay. If you’re in security and you’re liked by everybody, something is wrong! But it doesn’t mean you can’t get along.”

Don’t be afraid to escalate up the chain

“Obviously, you don’t want to screw your cross-functional partner by throwing them under the bus. But if I consider it a major risk, I’ll escalate, you bet,” Caleb shares.

Understand that security isn’t the top business priority

“When I escalate a critical problem, I don’t always get my way – and that’s okay. We might get together and agree we just don’t solve that risk. If the company decides this is just not the risk I’m going to spend my time on, that’s okay. I’ve done my job. At the end of the day, security should never be priority #1. If it’s in the top 5, that’s great,” Caleb says.

Contain risks with compensating controls

“If the business has decided we cannot solve this risk, then it’s your responsibility to contain the risk. If you can’t convince the company to change, it’s your job to put in place compensating controls,” shares Caleb.

If you can’t have control, you need the ability to monitor the risk.

“If you can’t have monitoring, then we assume the thing is always breached, and build around that.

“We figure out a lot of alternative ways to secure products and reduce risks, even if the cause of the risk itself can’t be changed in a way we might prefer,” shares Caleb.
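The fallback chain Caleb describes – fix the risk if you can, otherwise contain it with compensating controls, otherwise monitor it, otherwise design as if it’s already breached – can be sketched as a small decision function (the function and its labels are our own illustration, not Robinhood’s process):

```python
# Illustrative only: Caleb's escalation-or-containment fallback chain
# expressed as a function. The capability flags and return labels are
# hypothetical names chosen for this sketch.
def containment_strategy(can_fix: bool, can_control: bool, can_monitor: bool) -> str:
    if can_fix:
        return "fix"                    # remediate the risk directly
    if can_control:
        return "compensating_control"   # contain the risk with controls
    if can_monitor:
        return "monitor"                # watch the risk closely instead
    return "assume_breached"            # build around a standing breach
```

The value of writing it down this way is that every risk the business declines to fix still lands on one of the remaining rungs, rather than being silently dropped.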

For difficult decisions, use rings

To work out when to escalate a problem and when to go around it, Caleb suggests using an adapted version of the Intel Rings model to help make decisions.

“Ring 0 is most critical. If the risk falls in Ring 0, we don’t have compensating controls – I will die on a hill to make sure the vulnerability is secured. Ring 1 might be securing things like PII (personally identifiable information). Ring 2 might be business systems. Ring 3 is all the things that aren’t as high risk.

For Robinhood, one of those Ring 0 issues is crypto custody. There’s nothing we are lenient on – there’s strict engineering, immutability, integrity controls. At Ring 0, engineers must work within the controls we have in place.”
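One way to picture the adapted rings model is as data plus a single rule: Ring 0 is the only non-negotiable tier. A hypothetical encoding (the asset examples and function name are ours, purely illustrative):

```python
# Hypothetical encoding of the adapted Intel Rings model Caleb describes.
# Ring numbers and descriptions follow his examples; the mapping itself
# is illustrative, not Robinhood's actual classification.
RINGS = {
    0: "critical (e.g. crypto custody) - no compensating controls allowed",
    1: "sensitive data, such as PII",
    2: "business systems",
    3: "everything lower-risk",
}

def must_escalate(ring: int) -> bool:
    """Ring 0 risks are always escalated and fixed; outer rings may be
    contained with compensating controls instead."""
    return ring == 0
```

In use, the model turns a judgment call (“do I die on this hill?”) into a lookup: classify the asset first, and the ring decides whether escalation is mandatory.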


[Image: four concentric circles forming a bullseye, numbered from Ring 0 (critical, at the centre) out to Ring 3 at the outside.]

Follow Caleb Sima on Twitter and LinkedIn and CyRise Elevate on Twitter and LinkedIn.

CyRise Elevate is our membership and development program for ambitious cybersecurity leaders. We’re currently recruiting members for our new CyRise Elevate tribes for GRC and technical security leaders and have limited spots available in our CyRise Elevate tribes for senior security leaders in scale-up organisations.

If you know an ambitious security leader you think might be a good fit, we’d love to meet them. For our new tribes, the perfect candidate is someone who has strategic responsibilities and (probably) reports to the CISO or the Head of Information Security. Is that you or someone you know? Send us an email at [email protected] and we can send you some more information.

Scaling and Maturing the Security Function

[Image: illustration of Swathi Joshi smiling, titled “Scaling and Maturing the Security Function”.]

Swathi Joshi is VP, SaaS Cloud Security at Oracle. Based in San Francisco, she was previously Security Engineering Manager (Detection and Response) at Netflix. Swathi received her Master’s in Information Security and Assurance from George Mason University (US), and a Bachelor of Engineering, Computer Science from NMAM Institute of Technology (India). She is a board member of The Forte Group, Day of Shecurity and the Sahasra Deepika Foundation for Education.

This month, Swathi joined CyRise Elevate as a guest advisor. Below is an edited excerpt from our conversation, where she discussed how she approaches the scaling and maturing of companies’ security functions.

[Image: sketchnote of key points from the session, with an illustration of Swathi Joshi.]

What are your foundational blocks of a security program? 

The way I’d like to describe it, there are three sections: 

  1. One is doing the actual work: risk management, compliance, incident response. 
  2. Then ‘How can you increase leverage and unblock your team? What are some of the aspects that you can use to get better leverage?’
  3. Then, measuring effectiveness. ‘How are we actually doing?’ 

It was Simon Wardley who had this concept of pioneers, settlers and town planners. Each of these groups likes doing different types of work:

  • Pioneers. They are very comfortable working with poorly understood problems. You need these people to go build stuff. 
  • Then there are settlers, who like maturing a product.
  • Then there are town planners who love standardising, who love stability. 


So, I think in the security program – to increase your leverage – you need these three kinds of people. You need people with different mindsets and different interests to come together. 

There’s also efficiency-gain work we have to do. We’re often in this position of, ‘How do we balance all of the operations load that’s on us AND continue to improve?’ One way is through automation.

But a word of caution when it comes to automation: there is a certain point at which the return for the automation stabilises. Initially you’ll see huge gains. Let’s say you have a case management or SOAR system. Initially you’ll say, “Oh great, our mean time to assemble everyone is getting shorter. Our mean time to resolve is getting shorter. Our time to schedule a post-mortem or a post-incident interview is shorter.” 

And then after that it becomes the norm and then it stabilises a little bit.

How much time do you spend thinking about your strategy for maturing? 

The reality is, if you don’t say, “Okay, we are going to have an offsite and we are going to think about this,” if you don’t carve out that time, day-to-day work just takes over. And generally, for me, what has worked is keeping Wednesday and Friday mornings as my focus time. I try to really control my calendar and make it as productive as possible. 

I try to not schedule any meetings for Wednesdays and at least half a day Friday. So, I can get focus time to think about it. 

You’ve talked about ‘rightsizing’ a security investment. What does rightsizing practically mean for you? And when is it a ‘wrong size’, so to speak?

I think for rightsizing, there are multiple contributing factors: risk appetite, investment, revenue, which are driven by the business/company. Then there’s building a security strategy, hiring, resources allocation.

So, I think rightsizing, for me, means, ‘Okay, what are the company factors that impact you?’ That’s mainly informed by the appetite to invest in security, the risk threshold of the leadership team, the revenue of the company, and proportionality: the number of libraries in the stack, number of lines of code, number of applications, number of developers and engineers in the organisation. Those are some of the markers in my mind when you want to rightsize the security organisation.

What are the key focus areas for you in the next few years? Are you going to add in an extra team or is it just about expanding existing teams?

In the last couple of months or so, the economic climate has really shifted. I think we are at the inflection point, in terms of communicating the return on investment of security… And I’m intentionally using ‘number of people’, and not ‘maturity’, because I don’t think you can equate the two: more people doesn’t necessarily mean you’re mature. Last year, I focussed on: ‘How can each of the security functions within my organisation operate effectively? How can GRC, detection and response, vulnerability management and red team each act as an efficient unit?’ 

Now that we are close to that goal, next year I want to spend time increasing the interconnectivity of the team. How can red team inform detection engineering better? How can we leverage lessons learned from our post-incident reviews? And so on. 

A common conversation at CyRise Elevate is about managing stakeholders and communicating with non-security teams. At Oracle, how much time do you personally spend with the non-security teams and stakeholders?

My non-engineering, non-technical interactions are mostly legal, policy, HR, and executive management. A large portion of my time is spent talking with application development teams and infrastructure teams. About 20 percent of my time is probably spent with non-technical stakeholders.

What’s your favourite part of the job?

I think when I step back and take a look at the scope of the organisation. The number of things that we do on a daily basis and the scope of things that we cover, it’s amazing. It’s humbling and exciting.

And the other thing at Oracle SaaS is the customer obsession. What can we do for our customers? How can we make this easy? How can we make this better? How can we make it more secure? Those are things that make this job really fun.

What’s a book we should all read?

Surrounded by Idiots: The Four Types of Human Behavior and How to Effectively Communicate With Each in Business – and in Life, by Thomas Erikson.

What’s a food we should all try?

Chicken biryani (a one-pot meal from wartime).

What’s a podcast we should all learn from?

Hidden Brain.

What’s a band or artist we should all listen to?


Thanks, Swathi, for chatting with us at CyRise Elevate.

Connect with Swathi: LinkedIn | Twitter


How to Hire Great Cybersecurity Talent

Hugh Williams for CyRise Elevate

This month, CyRise Elevate was joined by guest advisor Hugh Williams. Hugh spent several years in the US working in technical executive roles at Google, eBay and Microsoft, and is currently Melbourne Business School’s first Melbourne Enterprise Professor.

Hugh is an advisor to many companies, including Domain (AUS) and Doordash (US), is on the Board of the State Library of Victoria, and is a Venture Partner at Rampersand. He has been a CEO of his own start-up, and also co-founded CS in Schools, a philanthropic venture that helps teachers teach coding in secondary schools.

In addition to his business and charitable achievements, Hugh has serious technical credentials, with a PhD in Computer Science from RMIT University and more than 120 published works, including over 33 US patents. 

Below is an edited conversation from our latest CyRise Elevate session, where we invited Hugh to share his wisdom on hiring incredible tech teams.

Hugh Williams sketchnote

What are the key characteristics that you look for in candidates when you’re interviewing?

I worked at Microsoft for quite a while, and they loved using the Lominger Competencies. It’s a long list of characteristics of people – things like humour, charisma, integrity, empathy. If we went to an interview or a training at Microsoft, feedback would be based around these competencies. They also used them to evaluate people as they moved up the job ladder – where you were on the scale, and so on. And now I’ve become a huge advocate of these competencies too – they are so useful.

The four that really matter if you’re hiring a software engineer are:

  1. Intellectual horsepower – is the person off-the-charts smart?
  2. Are they good problem solvers? Because that’s what computer science is: solving problems.
  3. You want people who are action-oriented. People who just want to start, go do stuff, build things. 
  4. People ought to be driven for results. They want to deliver, they want to change the world.

And of course – you don’t want people who are deficient in integrity, or collaboration, or the other competencies. All of the competencies matter. But, we want magic with those four: intellectual horsepower, good at problem solving, action-oriented and being results-driven. 

So, if you’ve got somebody who’s a computer scientist, and well above average on those four competencies, then that’s the kind of person that is likely to succeed at Microsoft. And that’s all I’ve ever done when I’ve worked for any company since. Anytime I get a chance to do advisory work, help hire a CTO, whatever it is that I’m doing… I’m like right, I’m just gonna figure these four things out. 

So, if you were hiring – developers, engineers, security people – what type of questions would you ask to test on problem-solving?

Get them to write some code. Ask them a coding question. Make them write real code, in a real language. I really don’t care what language…  we can all learn new languages – they come and go. I grew up writing Pascal, but things just change. So long as it’s a real programming language. Not in pseudocode – actually write code. So the problem solving, I think, is fairly straightforward. Solve a hard problem by writing code. 

How do you know if someone has intellectual horsepower?

I think intellectual horsepower is sort of related to problem-solving – it overlaps. My big test for this is: was it a fun and engaging experience, where I learned something as an interviewer? It’s like a spidey sense you get. 

For instance, the conversation is going fast, they grab thoughts, they run with them, they create interesting new knowledge, and I come out of the interview going, ‘that was really interesting, I’m glad I spent that hour with them. I feel like I’m a bit smarter, having hung out with that person, I like the way they think, they asked really good questions.’ 

So I think at some level, you have a sort of spidey sense about intellectual horsepower. You want to work with people smarter than you. I think that’s the essence of it. 

How do you know if someone is action-oriented? That can be a really hard one to pick up on, whether a person is inclined to really get in and do stuff?

It is the hardest, but again, I think through experience, you can feel it. So let’s imagine we’re doing some kind of college interview and I say to you:

‘There’s a deck of cards with 52 cards in it, and I want you to write a function called ‘Shuffle’ that takes, as an input, an array of 52 cards that are in some order, and I want you to write a function that shuffles the cards and returns that array that’s being shuffled.’ 

That’s one of my standard questions for someone who just finished uni. If you were the candidate, there are two ways you could respond: 

Approach A – You can grab the whiteboard marker and jump at the whiteboard. You can go, ‘oh, cool, an interesting question, I’m just gonna make a couple of assumptions here… are these reasonable?’. That sort of enthusiastic approach.

Approach B – Or, you can respond with, ‘I don’t know, I mean, there’s library functions for shuffling and stuff, right? Why would anybody want to write a shuffle thing? It’s a pretty old-fashioned question… you know, I haven’t thought about this sort of stuff for a while.’ 

Or perhaps they’ve got like, 16 clarifying questions: ‘What is a card? Is there always 52 cards? What if somebody wanted to shuffle four cards…’. 

Essentially, they’re not launching at the whiteboard going, ‘Oh, man, this is fun. Let’s go.’ They’re pushing back, implying my questions are stupid, not making any assumptions, wanting things over-explained. So that’d be the college-level version of it: do they launch at the whiteboard? 

And the more mature version of it, say for the senior engineering manager type, might be asking them something more complex, and maybe it’s not a coding question at all. You’re looking at: Do they lean forward and engage? Are they excited by the conversation? Do they want to solve this, and move the state of knowledge further forward? 

The metaphor would be: did they lean in or do they lean out? 

And occasionally, you’re going to ask a question where they’re just kind of stumped and stuck. And you might think that the person isn’t action-oriented, but they’re just stuck, you know, and so sometimes you’ll have to come at it from multiple angles… give them something else to try to figure out.

So those would be my clues. Are they a whiteboard-pen grabber? Or are they a lean-back? 
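(As an aside: the whiteboard-ready answer to the shuffle question above is the classic Fisher–Yates algorithm. A minimal Python sketch, assuming the cards are represented as plain integers – the representation is an assumption on our part, exactly the kind of thing an Approach A candidate would call out:)

```python
import random

def shuffle(cards: list) -> list:
    """In-place Fisher-Yates shuffle; returns the same list for convenience.
    Works for a 52-card deck or any other list."""
    # Walk backwards, swapping each position with a uniformly chosen
    # earlier (or same) position - every permutation is equally likely.
    for i in range(len(cards) - 1, 0, -1):
        j = random.randint(0, i)
        cards[i], cards[j] = cards[j], cards[i]
    return cards

deck = shuffle(list(range(52)))
```

Note the candidate's stated assumption (cards as integers 0–51) is exactly the “I’m just gonna make a couple of assumptions here… are these reasonable?” move Hugh describes.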

Your final characteristic – about being driven for results. This is a tricky one, because sometimes when you interview people, they attribute their success as being part of a team. How do you discern here, if they are actively driving results?

Yep. This is one where I always ask people what they’re most proud of. So if they just finished college, I’ll ask what was the assignment that they most enjoyed, and why. 

If they’re a bit further in their career, I say they’ve had an amazing career… and ask, ‘When you think back, what are you most proud of?’ 

And then what I’m listening for is: Did this person ship things that mattered out to customers? Did we do something where we were concerned about the outcome – or where we were concerned about the process? 

Say we’re talking to college students, and I ask ‘What’s your favorite assignment?’, and they respond with something like: 

“We worked as a group in the final year on an industry project… we were lucky enough to work at XYZ Company. And we actually solved this really important problem for them, that saved them an enormous amount of money and allowed them to make more widgets with a higher margin. And we got that done in six weeks. And I think it could be something that really changes how they work with their customers.” 

The opposite would be them doing something esoteric and complicated, and not really finishing it. Like, ‘I did an assignment where I implemented this thing, and it was interesting, and it worked, and we got it in on time.’

So I think you can just kind of tell: what do people talk about when they reflect on what they’ve done? Is it stuff that had an impact, or is it the process of doing things?

Another way to look for being driven for results is that you can read it in resumes. Have people listed bullet points under their job title that describe being in charge of a process? Or is it like, ‘we delivered XYZ – more money, more users, more efficient, more reliable, safer’, etc.?

What’s a book we should all read? 

I like Getting Things Done – it’s an oldie but a goodie. You don’t have to read the whole book… It gets a bit boring after the first five chapters. 

What’s a food or dish we should all try?

Bakso. It’s an Indonesian food, like a hot soup with meatballs. Do it with all the sauces – hot sauce, soy sauce, and sweet sauce (kecap manis).

What’s a podcast that you’re listening to?

I’ve been listening to this one a lot while I do the mowing: Rockonteurs. It’s interviews with famous musicians and producers – it has great stories and I even learn a few management lessons along the way.

What sort of music do you listen to?

I buy a lot of vinyl records, I’ve got about 1,000 records in my collection. What’s on my turntable now is ‘Under the Midnight Sun’ – The Cult’s new album. It’s fabulous.

Thanks Hugh for chatting with us at CyRise Elevate.

Connect with Hugh: LinkedIn | Twitter



Building Security Culture for High-Performing Teams with Marc Bown

[Image: CyRise Elevate title tile featuring Marc Bown.]

This month, CyRise Elevate was joined by guest advisor Marc Bown, CISO and Enterprise Technology Lead at Immutable, a web3 gaming scale-up in Australia.

Prior to Immutable, Marc helped found the security teams at Sportsbet, Fitbit and Afterpay. A leader in security, technology and engineering, Marc is passionate about building empowered, high-performing teams.

He believes that “good security is as much about culture as it is technology” – so we asked him to elaborate!

What do you consider to be an effective security culture?

“Good security requires a company to have three things – the knowledge, ability, and will to do the things that will protect it. The last one – the will – is often the problem.

Companies with a good security culture are willing to invest in security. That investment might come in the form of spending time on training staff, or being willing to make hard tradeoffs, like choosing security over a new feature that might grow market share.

You can also use company culture to move the needle on security. No-one – except us – gets up in the morning excited to deal with a security problem. It’s our job to persuade the company into doing the right thing – and our tool for doing that is culture.”

What does good security culture within an organisation look like?

“A really detailed security compliance training pack… that everyone has to do once a year!


Security people often laugh at this, because that’s what security culture looks like for a lot of organisations. In places like this, people only deal with the security team through annual training, or when they’re forced to engage with some process the security team has designed. But a good security culture is built through three key elements:

  1. Security team branding: That might sound fluffy, but make sure that people know you’re approachable and helpful. Educate them on how to engage with you and delight them each time they choose to do so. Make each interaction more likely to lead to another interaction. Initially those interactions might be human-to-human, but ultimately you can make those interactions scale through automation and self-service.
  2. Awareness: But not just for awareness’s sake. Yes, help people know what threats are out there – but also give them specific (and reasonable) calls to action on what to do with that information. Telling people “security is important to us” isn’t useful, because people don’t know how to act on it. But giving the prompt, “If you’re building a feature that touches money, we’d love to help you simulate how an adversary will interact with that feature… contact us in our Slack channel!” – that’s specific and actionable.
  3. Continuous, relatable and specific communication. People need to be hearing from you constantly, in the language they understand. Communications need to be specific to the audience. If you’re talking to engineers – talk like one. And if you’re talking to civilians – don’t make things inaccessible with jargon and technical concepts.”

You talk about the importance of a good security team. How do you build one?

“It’s important the company understands that good security people want to work at places that value their input and implement their suggestions. They want to make an impact.

So in that way, culture is a virtuous cycle: good security culture comes from a good team, which leads to good company-wide security practices, which leads to a happy security team, which leads to more people wanting to join that team.

We hear all the time about the shortage of security skills, but really, it’s a matter of attracting these skills.”

If there is a strong existing company culture, how do you make sure that the security culture you’re trying to implement doesn’t undermine this, or provide mixed messaging?

“It’s always easier to build on something that already exists. So if there’s already a really strong company culture – piggy-back off it! Use the good elements of the company culture and leverage that for security. But if the existing culture happens to be a problem from a security perspective, be transparent about that. Call it out. But bring receipts! You’ve got to be aware of the issues you’ll come up against and show why you need to change.

For example: if people are really used to something, they’re going to doubt that it’s a problem. Bring examples of the problem, and actionable examples about how it needs to change. Measure, and start to regularly reflect people’s performance back to them. If people buy into the need for change, and are regularly shown how they’re doing, they will try to do better.”

This sounds like change management. Any tips for approaching this?

“Empathy! Make a case to people about why they should change. People are more likely to deal with people being a pain in their butt – who bring them inconvenience – if that person is perceived as a friend. Be approachable, be someone they trust and want to work with.”

You’ve also mentioned looking for empathy in your security team hires. Why is empathy so important across the team?

“Empathetic people know how to balance the needs of others with their own.

The security team is always asking people to do something they didn’t plan to do. Almost all of these people have impossible demands in front of them. They are all being asked to do more, faster, and with less. Influencing the right outcomes requires you to first understand someone else’s priorities, how your request impacts them – and how they might feel about what you’re asking them to do.

I also think it’s important to focus on having a diverse team. You need people who don’t look like you. Who you don’t know. So ask around. Diversity fuels empathy.”

Lightning round questions! What are you reading ATM?

“Twitter. And I listen to a lot of podcasts. Risky Business is how I stay up-to-date with work news. When I do read books, it’s usually because I’m being forced to have time off work, and my wife tells me “this is what you’re reading!”. I just read The Premonition by Michael Lewis (author of The Big Short).”

What music do you listen to?

“I listen to Aussie hip hop and American hip hop – but I can’t listen to that when I’m working. I listen to classical music in the background for work.”

Anything else you recommend?

“The Security BSides – they’re a great group of people.”

Thanks Marc for chatting with us at CyRise Elevate!

Connect with Marc: LinkedIn




Reflections on the Optus hack with Tom Uren

This month, CyRise Elevate was joined by guest advisor Tom Uren. Tom is Editor of Srsly Risky Biz (created in collaboration with the Risky.Biz podcast) – a weekly Substack newsletter that features stories shaping cyber policy.

Based in Canberra, Tom is also a senior fellow at the Australian Strategic Policy Institute, and spent 15 years at the Department of Defence in various roles. He has diverse expertise across internet and cyber issues, and has researched and published on international and domestic topics including Australia’s Offensive Cyber Capability, the insecurity of the internet of things, and Chinese commercial espionage.

We invited Tom to talk about the threats and trends he is seeing in the industry. Below is an edited conversation from our Elevate session – where members of our community fired a bunch of questions at him… and, spoiler alert: they were mostly about the recent Optus hack!

You cover the cyber security gamut on the Srsly Risky Biz newsletter. What interests you most at the moment?

There’s a few! Espionage and intelligence – how nation states are using cyber to advance their interests. Offensive cybersecurity – activities that degrade, deny, disrupt… e.g., the stuff that breaks things. How nation states are trying to use cybersecurity capabilities… And most recently – the Optus cyber hack, and its implications for ID verification.

Let’s start with Optus. Would you agree there’s not much responsibility being placed on Australian companies to stop breaches currently?

I think that for a long time, governments have been reluctant to regulate, and so privacy and data protection are weak. One of the problems when companies get hacked is that most of the costs are borne by other people. So, while the company’s share price goes down for a bit, and there’s a bit of reputational loss… six months later, it’s like nothing’s happened. There was no financial cost. It’s the customers and stakeholders who bear the majority of the costs.

So because companies don’t bear the costs, they don’t invest as they should. I think regulation that imposes fines – so that the costs are borne by the companies – encourages them to invest at the right level. That way, if there’s a breach, it’s both a reputational and a financial loss.

For the longest time, when breaches like this happen, governments have been confused about the right thing to do. And the industry lobbies against strict regulation, because it’s expensive. It’s been hard to weigh the costs… like, ‘What’s the real economic cost of 10 million people losing their driver’s licence?’. So that’s delaying stricter legislation. But when stricter fines are levied, the business community pays attention.

Another thing that’s happening is that ransomware imposes much larger costs on companies, and that also makes them pay attention. Sometimes a company only improves its posture on ransomware if a stakeholder in the company has been affected, even though the company itself has not. Working on Srsly Risky Biz, I see how ransomware happens all the time – but often it has to have a direct impact on an internal stakeholder before change happens.

How pragmatic do you think authorities will be on implementing any new regulations?

I have mixed feelings about regulations. A lot of regulation attempts to do something but doesn’t succeed… and is just painful.

In the Optus case, I’m not certain they should be fined. But certainly, what happened was an absolute clanger. They left an API exposed, so anyone could query for customer details. But it could be that Optus has mature cybersecurity processes and this was a bizarre one-off accident. In that case, I’d argue that they don’t deserve to be fined.

Typically, when you read about breaches, it’s often ‘we knew we had a vulnerability, we didn’t patch it, and we ignored all the alerts’. It’s not yet clear whether that’s the case for Optus, so it’s possible they don’t deserve fines.

But going back to the question – the level of education is low, so when you don’t know what you’re doing… you look for things that are verifiable and that might solve the problem.

Which do you prefer – principles-based (risk-based) regulation or standards-based?

Principles-based. I think what would be good is if you could say: assess this risk and make the decision. My view is that regulators don’t do that, because they don’t understand the situation themselves or how to articulate what would be sensible.

In Australia, should we be looking at what the US is doing – in terms of enforcing the reporting of ransomware payments – and potentially hold CISOs responsible for failing to disclose breaches?

I think I like the idea of transparency. Take aviation regulation, for example – if you don’t know what is going wrong, how can you fix it? Aviation has a really robust way to go from accident to root cause to fix – and that, I think, is pretty absent in the cyber security space. That said, aviation, I believe, is protected from legal cases when errors are reported.

Do you think the Optus data has really been deleted?

I think it has – but I don’t know. If I were an Optus customer, I would work on the assumption that it could be breached. I wrote a story calling it Australia’s Equifax – in the Equifax breach, a Chinese organisation stole 140 million people’s data from a credit reporting company, including at least some licence data. And that, as far as I know, has never appeared on the internet anywhere.

This one was different in that the Optus hacker immediately threatened to leak the data, and did leak some of it. But in another sense it may well be the same. The Equifax data could appear on the web tomorrow. There’s a looming threat that could remain… but I suspect it’s not going to turn up.

What policy changes do you actually see coming in the near future, if any?

I think they will implement tougher fines, in terms of privacy legislation. The idea of informing banks was probably a good one. Banks may try to find that data anyway. I don’t know if they would buy it – but I think they would try to find out who is in that leak. It would be easier if there were a mechanism to tell them whose ID was compromised. In Australia we don’t have many banks – so they’re big and very capable.

I think the Optus breach has implications for the government’s identity matching services. One possibility is that these kinds of leaks or breaches become more common, which erodes the usefulness of ID numbers for identity verification. That pushes ID verification towards facial recognition technologies… but I don’t think that will happen anytime soon. It seems the government and Clare O’Neil want to be seen to be doing something, though, and I do agree that the privacy legislation is a bit weak.

If you’re looking to change regulation, it makes sense to start with critical infrastructure. At least the last time I looked, we hadn’t got it bedded down: a bill had passed that makes critical infrastructure more accountable, but the regulations were left up to each vertical. It is complex – electricity, water, gas, banks, fintech… there’s not one standard that can work across everything. Which makes sense – and so it will take time – but that’s why you want to bed down critical infrastructure first.

And then possibly apply that across other areas of the economy. That makes sense to me, but won’t necessarily happen. I guess I’d call that the supply side of regulation. And then if you make the demand side penalties higher… that gives companies more incentive just to look at the problems in the first place.

How would you describe how the threat landscape has changed from the perspective of senior management?

A few weeks ago I had a piece that listed all the things that happened in the last month. It’s quite a crazy list – a couple of governments hacked, critical infrastructure in other countries hit by ransomware, and various teenagers who had managed to compromise very big companies.

To me, what’s happened is there’s a kind of arms race going on between the standard techniques companies and enterprises use – and the standard techniques that hackers use to break into them. One of the big ones is the rise of techniques to bypass Multi-Factor Authentication (MFA) – prompt bombing. That has been known about for a long time, but it seems like every criminal now knows you can do it, and you don’t even have to do smart stuff.

I think MFA has been quite a good control – but now we’re realising it’s not that good if hackers are savvy and users aren’t smart with it. Perhaps organisations have relied too much on MFA being effective.

How do you manage the stress of your job?

My job isn’t that stressful. Sometimes I don’t know what I’m going to write about. Deadlines help.

How do you try to influence people?

I like talking to people and asking questions. Telling stories. Laying out a vision of what the world should look like. Like – the world is changing in XYZ way, and therefore we should do XYZ. Also, you can get good leverage from not just talking to the top person, but working with their stakeholders.

What’s a book everyone should read?

Not sure it’s for everyone – but I liked Hilary Mantel’s Wolf Hall, about Thomas Cromwell.

What’s a podcast we should all listen to? (Other than Risky Business!)

I like Bloomberg’s Odd Lots podcast – it’s about economics and the way the world works.

[Note from Tom: Oh, and I forgot at the time but my absolute favourite is 99% Invisible, which is about how the world is designed.]

What’s a band or artist you listen to?

I like 90s stuff. Red Hot Chili Peppers’ old stuff. Blood Sugar Sex Magik is a favourite.

Thanks Tom for chatting with us at CyRise Elevate.

Connect with Tom: LinkedIn | Twitter | SRSLY RISKY BIZ newsletter


Tom Uren Sketchnote


Reflections on securing Netflix, with Jason Chan

Jason Chan is the former VP of Security at Netflix, and while he describes himself as “currently retired”, he is also an angel investor and startup advisor.

Jason spent his whole 25-year career in security, with the final decade at Netflix. He witnessed the evolution of Netflix from a company that shipped DVDs (remember those days?) to a streaming service, and then a production studio. In his last three years, Jason managed not only cybersecurity at Netflix, but IT as well.

Constant security is more stressful than an emergency

At any one moment, more than 90% of cybersecurity professionals are stressed, around half are feeling burned out, and more than a third are considering quitting their jobs in the next six months due to burnout (Deep Instinct, 2022). Jason believes that’s due to the always-on demands of being a cybersecurity leader. He compares cybersecurity to jobs such as a paramedic or a firefighter — stressful, but for the latter two professions, that stress will ebb and flow. It will be quiet, then there’s a fire, then it’s quiet again, and that break in between emergencies is an important part of stress management.

But security professionals have a different kind of stress, as they always have an incomplete knowledge of how secure their domain may be; a security professional never knows what might be happening that they don’t see. This means security professionals must always be on guard, can never really enjoy a break, and can be more susceptible to burnout.

We’re still in the stone age

Jason recalled a quote from friend Alex Stamos, former security chief at Yahoo and Facebook: “Being a CISO today is like being a CEO before accounting was invented”.

Jason believes cybersecurity leaders are operating in the equivalent of “the stone age of security”. But working in such a young field can make the job both exciting and frustrating, as many problems security leaders face do not yet have a standard answer. Like an early explorer in the Wild West, security professionals must expect the unexpected, and adapt nimbly to never-before-seen threats.

Because there are so many challenges still to be solved in cybersecurity, leaders are forced to be creative problem solvers. But with so many security leaders overworked and under-resourced, finding the time and space to be creative can be your biggest challenge.

Security is not “everybody’s job”. It’s yours.

When asked “How do you build a culture of cybersecurity?”, Jason gave an unexpected response: don’t start with security at all. Instead, the culture of your organization should be the starting point for your security approach. Netflix already had a well-established culture and values encoded into its operations when Jason came on board. Rather than trying to change the culture, he changed his approach to fit – even if that meant sometimes having to “work around” the culture.

Case in point: Netflix is well-known for its flattened hierarchy and trust in its employees. Its hiring culture was to recruit experts in their fields, and then give them full autonomy. Following this culture, engineers were expected to focus on what they were hired to do, not worry about security. So rather than following the mantra “Security is everybody’s job”, Jason created a new mantra for the security team: “Let people do their job, and let us take care of security”.

In practice, that meant Netflix employees never had to do arduous cybersecurity training sessions or random phishing tests. Security was simply part of the fabric of the workplace — staff knew where to go for help but were free to focus on their job.

Jason was quick to point out that when he entered a different workplace, he would have to change his approach to fit that culture — there is no one-size-fits-all approach to cybersecurity. Which leads to the next point…

What you did there may not work here.

Working in consulting in the early part of his career exposed Jason to vast differences in risk tolerance across different customers. Witnessing such varying cultures and security risks proved there was no single approach to cybersecurity — what you did in one workplace likely won’t suit your next. While CISOs are hired for their experience, they’re not hired to simply replicate their past work.

“I hired people from different backgrounds — that’s how you solve difficult problems.”


People want to solve difficult problems

Jason believes a security leader’s job as people manager is to build a high-performance team — and that means bringing in the best. But how do you attract and retain the best talent in such a competitive industry?

At Netflix, Jason had the luxury of being able to pay people “decently”, but he knew that money alone wasn’t enough to retain the very best. Luckily, Netflix culture also allowed him to offer three additional workplace qualities:

1. Freedom and autonomy to solve problems their way, without micromanagement.

2. Surrounding the team with other world-class people.

3. New and challenging business problems to solve.

Without a decent salary, freedom, a world-class team, and “gnarly problems”, it’s difficult to retain the best talent.

“For a commercial company like Netflix, at the end of the day we were entertaining people, not solving world hunger. We’re not NASA. So, when you start recruiting you have to think, ‘What is interesting about working here? What gnarly problems do we solve?’.”

TL;DR Jason’s underlying message was that success as a security leader often comes down to your people and management skills rather than your security expertise. Security experience is expected, but management skills set you apart. A sensitivity to company culture and what motivates people, the ability to manage stress and burnout, and the gift of creative problem solving are the 1% skills that ultimately separate the average from the world-class.

More from Jason:

LinkedIn profile 

Follow Jason on Twitter 
