Season 2 Episode 3 Nov 12, 2020

Apple ARM Macs, COVID-19 Vaccine, Privacy ‘Nutrition Labels,’ and Controlling Computers With Our Brains

Pitch

In Captain Picard voice: "Computer. Read my mind."

Description

In this episode, we talk about the effect of Pfizer/BioNTech's COVID-19 vaccine news on certain tech companies, and we get into Apple's app privacy "nutrition labels." Then we chat with Ish ShaBazz, iOS engineer and creator of the Capsicum app, about Apple's ARM-based Macs and other noteworthy announcements from this week's Apple event. Finally, we speak with Peter Yoo, director of research at Synchron, Inc., about the first human trials of their Stentrode brain implant, which allows users to control computers with their brains.

Hosts

Saron Yitbarek

Disco - Founder

Saron Yitbarek is the founder of Disco, host of the CodeNewbie podcast, and co-host of the base.cs podcast.

Josh Puetz

Forem - Principal Engineer

Josh Puetz is Principal Software Engineer at Forem.

Guests

Ish ShaBazz

Ish ShaBazz is the founder of Illuminated Bits, and an iOS developer who supports other developers.

Peter Yoo

Peter Yoo is director of research at Synchron, Inc.

Show Notes

Audio file size

58759618 bytes (about 58.8 MB)

Duration

00:40:48

Transcript

[00:00:10] SY: Welcome to DevNews, the news show for developers by developers, where we cover the latest in the world of tech. I’m Saron Yitbarek, Founder of Disco.

 

[00:00:19] JP: And I’m Josh Puetz, Principal Engineer at Forem.

 

[00:00:23] SY: This week, we’re talking about Pfizer’s COVID-19 vaccine’s effect on certain tech companies and Apple’s privacy nutrition labels.

 

[00:00:31] JP: Then we’ll be speaking with Ish ShaBazz, iOS Engineer and Creator of Capsicum App, about Apple’s ARM-based Macs and how their systems will integrate with iPhone and iPad apps.

 

[00:00:40] IS: I also think that it kind of lowers the barrier of entry just a tad because now you can do more powerful development on like a MacBook Air.

 

[00:00:48] SY: And then we’ll chat with Peter Yoo, Director of Research at Synchron, Inc. about the first human trials of their brain implant, which allows users to control computers with their brains.

 

[00:00:59] PY: On one hand, the algorithm is learning based on the patient’s brain activity, and then the patient will have to react to that algorithm. So it’s kind of this symbiotic learning between the algorithm and the patient’s brain.

 

[00:01:11] SY: So this week, Pfizer and BioNTech announced that their COVID-19 vaccine was shown to be 90% effective, very exciting, in preventing infections of the virus. This of course is a huge positive for the world. However, one interesting thing about the news was how it negatively affected certain tech companies. Stocks for companies like Zoom, Shopify, and Teladoc Health fell pretty significantly. Zoom fell 17.4%, and both Shopify and Teladoc Health fell over 13.5%. Even Netflix and Amazon stocks fell a bit after the news. These companies are, of course, ones that grew even more popular during the pandemic for their stay-at-home ethos. But it’s interesting how now just the mention of a solution, just a little bit of hope, makes investors lose faith in some of their long-term popularity. I kind of think, if anything, the pandemic has caused a paradigm shift in the public’s perception of distributed and remote alternatives, and they aren’t going anywhere once we’re all vaccinated, at least that’s how I feel about it. And it just seems really premature on the investors’ part to almost pull out at this point. You know?

 

[00:02:22] JP: Yeah. I think it’s really premature on everyone’s part. This news hit and you saw the markets react. You saw the general public react as if a vaccine is going to be on street corners next week. And it’s just not the case. There’s still a really long road ahead before the vaccine could become available to the general public. I know Dr. Fauci had said that in the best case, everyone could potentially be vaccinated by April, which, as of this recording, is about six months away.

 

[00:02:50] SY: Right. Exactly. And to be fair, I guess if the vaccine is working and everyone gets one, the biggest thing that’ll be different will be school, like school will be going back to normal. And I guess to be fair, that’s a pretty significant thing, right? Like there’ll be a significant drop in Zoom usage if school goes back to normal. But all the changes that have happened with remote work, with distributed teams, I don’t think that’s going to go away. I don’t think we’re going to go back to the offices the way that we were.

 

[00:03:19] JP: I think for a lot of companies it’s going to be difficult to go back to offices. We’ve seen a lot of them, in Silicon Valley especially, sell off their office space and transition workers to permanent work from home, permanent remote work. I mean, I think the phrase is you can’t put the genie back in the bottle. Does that all just go away? Can you imagine working for a company and being offered remote work and then all of a sudden they say, no, we need you to come in? Maybe you’ve moved. Maybe you’ve structured your life differently.

 

[00:03:50] SY: Exactly. Yeah. It just seems really premature. And I guess that even with the 90% efficacy, which is very, very exciting, it’s still early. We still have a while to go. So I’m surprised that it took a hit so soon. But you know what? The market reacts to news very efficiently. So I guess that’s just how it goes.

 

[00:04:08] JP: The other thing we don’t know about the vaccine is how long it’s going to be effective. Is this going to be like a one-time vaccine you get and then it’s active for the rest of your life? Is it going to be like a flu shot you have to get every year?

 

[00:04:20] SY: Right.

 

[00:04:21] JP: Nobody really knows. There’s still so much unknown about this.

 

[00:04:23] SY: Right. Absolutely. And I guess the piece of news that surprised me the most is Shopify. Like Zoom makes sense why that went down. Teladoc Health, that makes sense. Netflix, even Amazon makes sense, but Shopify feels really random. Is it that people are going to stop selling their goods and go back to their main jobs? You know what I mean? I’m not really sure what the logic is behind that one.

 

[00:04:44] JP: Yeah. Maybe this is just me speaking as someone that lives in a rural area, but I think it’s kind of shortsighted to think, “Okay, all this remote work, all this online business and these connections that we’ve built during the pandemic, it can all just go away once the pandemic’s over. We can just close up telehealth. We can close up curbside pickup. We can close up shipping of products.” A lot of new markets and businesses have opened up to areas that are outside of metros. We’ve seen a lot of people move away from metros. Yeah. I agree with you. I don’t think a lot of this is going away.

 

[00:05:20] SY: Yeah, for sure.

 

[00:05:21] JP: Starting December 8th, Apple is requiring apps to carry what they call “nutrition labels”: summaries of privacy information that developers will have to submit in order to continue releasing and updating their apps. The concept is that before you download an app on either the iOS or the Mac App Store, a label of sorts will tell you what kind of information the app will collect on you. We’ll see if Apple is going to be as strict about policing these reporting guidelines as they are about other app rules, but this seems like a really good first step to me.
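To make that concrete, here is a minimal sketch, in Python, of the shape of the disclosure behind one of these labels. Apple actually gathers this through an App Store Connect questionnaire rather than code; the three section names below follow Apple’s label groupings, while the specific data types chosen are just illustrative.

```python
# Illustrative only: Apple collects this through an App Store Connect
# questionnaire, not an API. This sketch just shows the shape of the
# disclosure behind a privacy "nutrition label."
privacy_label = {
    "data_used_to_track_you": ["Identifiers", "Usage Data"],
    "data_linked_to_you": ["Contact Info", "Location"],
    "data_not_linked_to_you": ["Diagnostics"],
}

for section, data_types in privacy_label.items():
    print(section.replace("_", " ").title() + ":", ", ".join(data_types))
```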

 

[00:05:51] SY: Yeah, absolutely. I love this idea. First of all, I love that they’re calling them nutrition labels. That just makes them sound delicious, and it makes them sound fun and light and very human friendly. Right? Because terms of service and privacy policies technically probably have the same information. But the words, the terminology, feel protective of the company, not helpful to me.

 

[00:06:20] JP: Right. That’s a good point.

 

[00:06:21] SY: Does that make sense?

 

[00:06:22] JP: Yeah. Yeah. Nobody, I think, would confuse a terms of service agreement with something you as a consumer are actually supposed to read.

 

[00:06:30] SY: Exactly. Yeah. Yeah. Anytime there’s the word agreement, the word policy, the word contract, it’s not designed for you. A nutrition label feels like, “Ooh! It’s for me. I get to learn. I get to pick what I want to digest or who I want to consume my content.” It just feels so friendly. So just the name alone is something I love, but the idea is so good. When I was creating the privacy policy for my company, Disco, I had to list out all the different third-party apps that might come in contact with user data, and I had to go into detail, but no one’s going to read the privacy policy. It’s too long, it’s too many words. So just having a little snapshot that says, “Look, here are the key things that you should know about what data’s being used,” in very simple language, just a few icons, maybe a couple of sentences, I think is such a great move from Apple.

 

[00:07:20] JP: So do you think consumers will pay attention to these labels? One thing about nutrition labels is that they are somewhat easy to understand. The information is presented in a great way, but they’re completely optional. Some people will look at them. Some people don’t. Do you think people will actually pay attention to these when they download apps?

 

[00:07:35] SY: I think that they will pay attention. I think it’ll make a difference. So for me, for example, Google does something similar when you connect your Gmail account to a third-party app, and it’ll say this app is going to read your email, access your contact list, email on your behalf, et cetera. And I have not installed apps because of what they wanted to do with my information before.

 

[00:07:59] JP: Right.

 

[00:08:00] SY: I’ve looked at that and went, “Wait a minute. You shouldn’t be able to send emails on my behalf. That’s not at all relevant to what you need to do.” You know what I mean? It’s clearly collecting and doing more than it needs to do. I have not installed apps and found alternatives for that reason. So I can definitely see a situation where I would read the nutrition label and go, “Wait a minute. This is a note-taking app. Why do you need my location? I don’t need my notes to be geotagged.” And I might decide that, “You know what? I’m going to find a note app that doesn’t collect all that data.” I can totally see people making better decisions based on that information.

 

[00:08:35] JP: Something else that had occurred to me was, do you think this could be a result of the congressional hearings that happened a couple months ago?

 

[00:08:41] SY: Oh, interesting.

 

[00:08:43] JP: Apple, Google, Facebook, and Twitter, they were all really put on the stand and privacy policies came up, data collection came up, and Apple’s got a better track record than the other companies, but still, maybe they’re playing a little defensive here.

 

[00:08:55] SY: Oh, maybe. Yeah. That’s really interesting. That could totally be it, but I also want to give Apple credit because they’ve generally been very pro-privacy and pro-security, and they’re one of the few tech companies that don’t make money off of advertising. I feel more confident in their intentions in a way that I would be a little bit more suspicious of some of the other tech companies. I believe that they have our best interests at heart when it comes to privacy and security in this way.

 

[00:09:24] JP: Well, continuing to talk about Apple, this week Apple announced the first Mac models to utilize their new Apple Silicon chip, dubbed the M1. Apple’s first chip for Macs is an ARM processor built with a five-nanometer process, and it’s structurally very similar to the A14 chip that’s currently used in iPhones and iPads. It has eight cores split between high performance and efficiency, as well as cores dedicated to graphics and machine learning. Apple introduced new models of the MacBook Air as well as the 13-inch MacBook Pro and the Mac Mini using this M1 chip. These machines feature drastically better performance and battery life than the existing models with Intel processors. Since the M1 is an ARM-based processor and current Macs use Intel-based processors, developers will have to recompile their applications to take advantage of the increased performance of these chips. Apple has a couple of tricks to help. Rosetta 2 is a software layer that can run existing Intel applications without their being recompiled. And since macOS and iOS now run on the same type of processor, iPhone and iPad applications will be able to run natively on macOS.
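For developers curious how their own tools will run on these machines, here is a hedged, macOS-only sketch that checks the architecture the current process sees and the sysctl key macOS exposes for Rosetta 2 translation status. Treat it as illustrative; the key is absent on machines without Rosetta.

```python
import platform
import subprocess

# platform.machine() reports the architecture the current process sees:
# "arm64" when running natively on Apple Silicon, "x86_64" on Intel
# hardware or under Rosetta 2 translation.
print("Architecture:", platform.machine())

try:
    # The sysctl key "sysctl.proc_translated" is 1 for a process running
    # under Rosetta 2 and 0 for a native process.
    translated = subprocess.run(
        ["sysctl", "-n", "sysctl.proc_translated"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    print("Running under Rosetta 2:", translated == "1")
except (subprocess.CalledProcessError, FileNotFoundError):
    # The key (or the sysctl tool) is absent on systems without Rosetta.
    print("Rosetta translation status unavailable")
```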

 

[00:10:31] SY: So what’s so fundamentally different from the regular processors they’ve been using and why can’t they just make a better Intel processor?

 

[00:10:40] JP: Well, Intel has been trying to make a better Intel processor, and we’ve talked a little bit about their struggles doing so. They’ve been trying to reduce the size of their process. That’s the five-nanometer process that I mentioned. And Intel’s just not there yet. They don’t have the technology and what they call “chip yield” to make processors that small. The other problem is that Intel just makes one processor, the CPU, the Central Processing Unit. And it has to work with a combination of other processors to do things like control your graphics, control the data coming in and out of the ports on your machine. Those are all separate processors right now on Macs and they have to communicate over the motherboard and there’s a lot of overhead in communicating back and forth like that. The M1 takes all those different processor types and puts them into one chip physically so they can communicate really, really quickly and efficiently with each other.

 

[00:11:37] SY: Okay. So I know you’re a big Apple fan. So how do you feel about this announcement?

 

[00:11:44] JP: This is pretty incredible. I’ll also say I’m an old-time Apple fan, so this is not the first chip transition I’ve lived through. Macs started out on Motorola processors. They transitioned to PowerPC processors in the mid-nineties, and then they transitioned to Intel processors. Now we’re going to ARM. Apple’s really good at these chip transitions. And every time they’ve done it, the performance benefits have been incredible. So for example, the MacBook Air that’s running the M1 chip has up to 20 hours of battery life.

 

[00:12:17] SY: Oh, wow!

 

[00:12:19] JP: It doesn’t have a fan. It runs cool and silently. I know. This is like next-generation stuff. It’s absolutely incredible. I’m really interested to get my hands on one of these. I imagine a lot of developers are really interested as well.

 

[00:12:31] SY: Yeah. How do you think it’ll affect your coding or development?

 

[00:12:35] JP: I think in the short term, it might not be a huge difference. I don’t do a lot of iOS development, but my friends that do iOS and Mac development have said that Xcode, which is the development environment for writing Mac and iOS applications, is really, really resource intensive. So this could be a great benefit to those developers. I think for myself, I’m more excited about what we’ll see in laptop models going forward. Will we see more laptops without fans, with longer battery life, with better performance for cheaper or the same price? I’m really interested to see where Apple takes these hardware designs in the next two or three years.

 

[00:13:16] SY: I think what’s really interesting and I have no idea how this works is not having a fan. The fan is one of those things where it doesn’t feel like a big deal until the fan turns off.

 

[00:13:27] JP: Oh, it’s a big deal to me.

 

[00:13:28] SY: You know what I mean?

 

[00:13:29] JP: Oh, no. It’s one of those things when I hear the fan on my laptop spin up, I stop everything and I hunt around on my computer. I’m like, “What is running? What is causing that fan to go?” It is screwing up my zen.

 

[00:13:44] SY: Yeah. And so coming up next, we’re going to chat with Ish ShaBazz, iOS Engineer and Creator of Capsicum App, about these ARM-based Macs and other Apple announcements after this.

 

[MUSIC BREAK]

 

[AD]

 

[00:14:13] JL: Triplebyte is a job search platform that allows you to take a coding quiz for a variety of tracks to identify your strengths, improve your skills, and help you find your dream job. The service is free for engineers and Triplebyte will even cover your flights and hotels for final interviews.

 

[00:14:28] SY: Vonage is a cloud communications platform that allows developers to integrate voice, video, and messaging into their applications using their communication APIs. Whether you’re wanting to build video calls into your app, create a Facebook bot, or build applications on top of programmable phone numbers, you’ll have all the tools you need. Formerly known as Nexmo, Vonage has you covered for all API communications projects. Sign up for an account at nexmo.dev/devnews2 and use promo code DEVNEWS2 for 10 euros of free credit. That’s D-E-V-N-E-W-S, in all caps, and the number 2, for 10 euros of free credit.

 

[AD END]

 

[00:15:15] SY: Here with us is Ish ShaBazz, iOS Engineer and Creator of Capsicum App. Thank you so much for joining us.

 

[00:15:22] IS: Hello.

 

[00:15:23] SY: So tell us a little bit about your developer background.

 

[00:15:25] IS: So let’s see. I started developing apps back in iPhone OS 3. So yeah, it’s been a bit. Over the last 10 years, I think I’ve put about 25 apps in the store, mostly productivity apps, things like that.

 

[00:15:39] SY: Wow! That’s cool.

 

[00:15:42] IS: Yeah.

 

[00:15:42] JP: So your latest app is Capsicum Notebook and Habit Tracker. Can you tell us what went into building it?

 

[00:15:48] IS: For sure. So Capsicum was actually my very first app idea, back when the original iPad was announced. That day, I was trying to talk myself into buying Apple products, like I always do. I was like, “Well, I guess if it had like a daily planner, then I would for sure get it.” And it didn’t have one, but then I decided that maybe I could make one. So I started looking into app development. It turns out I did not have the skill set, and the tool set that Apple had at the time was very limited for something like that. But years later, I found out about bullet journaling and was really enthralled with it and decided to try it again, and then I worked with a developer in Australia who goes by the name Heidi Helen, and we started working on it. About three years later, we released the first version of Capsicum. We’re currently working on a new version of Capsicum. That one’s going a little bit slower because we decided to kind of rethink some things with the way that Apple is going with SwiftUI and things like that.

 

[00:16:44] SY: So you are an iOS developer and a Mac user. So I’m wondering, what are your thoughts about the Apple ARM-based Mac news?

 

[00:16:52] IS: I am super-duper excited and very worried about Intel.

 

[00:16:57] SY: As a company, right?

 

[00:16:58] IS: As a company, yeah. Because initially, I was like, “Wow! Intel is really kind of lagging here.” If you track the Geekbench scores, the iPhone SE, which is thought of as the cheap iPhone or the budget iPhone, is outperforming the 13-inch MacBook Pro from last year, which is kind of embarrassing in a way. Like, “Wow. How did this happen?” So I think it was kind of inevitable that Apple Silicon eventually took over, and I’m super proud of everyone who worked on it and got this done in such a smooth way, so that it’s easy to just kind of slide over to the new CPUs.

 

[00:17:34] JP: Do you think the transition of the Mac from Intel to ARM will affect your work or improve the way you develop apps?

 

[00:17:42] IS: I think it allows developers to easily kind of gain a platform, which is the Mac, right? So if you’re a mobile developer, most Apple developers are mobile developers. If you notice, there are way more apps on the iOS App Store than there are on the Mac App Store, because there’s kind of like different tooling and such, and everything is a little bit easier to work with, and I think nicer, on the iOS side. What the Apple Silicon Macs allow is the same app that’s on iOS to run directly on the Mac without any middle layer. Once upon a time, there was a lot of hype around what eventually became Catalyst, but Catalyst isn’t even a factor with the new Macs. You can run iOS apps directly on the new Macs without having to have the little middle piece to kind of translate it, which is kind of neat.

 

[00:18:34] JP: Just for the audience, Catalyst, that’s the framework that allows you to use iOS frameworks on Mac?

 

[00:18:39] IS: Right. It allows some level of iOS to work on Mac. It’s not really Mac native exactly, but you can kind of more easily bring iOS apps to the Mac.

 

[00:18:51] SY: So how big of a deal is this increased performance when it comes to iOS developers and users of Xcode?

 

[00:18:59] IS: So I’m really excited to see what that looks like in the real world, but from the slides, it’s pretty amazing. Performance increases, it seems, like 3x. That’s huge, right? Development time, also: due to the lower power utilization, if you’re programming on a laptop, which a lot of us do, that’s huge also, because it allows you to work longer without having to plug in and things like that. So I think overall just having better tools, like faster, sharper tools, allows you to work faster and have less lag and waiting for compile times and things like that. I also think that it kind of lowers the barrier of entry just a tad, because now you can do more powerful development on like a MacBook Air, for like students and folks who are kind of just getting started with programming. I think it allows them to have greater ability right out the gate, even with the lowest-end machine.

 

[00:19:49] JP: Well, speaking of the new hardware, have you had a chance to play with either the development kit yet or the new versions of iOS or macOS? And what are your impressions?

 

[00:20:00] IS: I have played with all of the OSs, and actually I love Big Sur. I’m really excited for everyone to be able to use Big Sur. It’s kind of like an iOS 7 style redesign in that everything is different. It looks very different, like there are no more square edges. Everything’s rounded, right? It’s been a lot of fun, and I just really enjoy having like the full capabilities of the iOS apps on macOS. For example, in Messages, one of my favorite things to do is send messages with Message Effects, like lasers or fireworks or any of that stuff. Right? I like sending fun messages, and that wasn’t available before Big Sur, and now it is. So everything you can do on the phone you can do on the Mac, and I think that’s pretty great.

 

[00:20:40] SY: So do you think that iOS apps running on macOS will become mainstream?

 

[00:20:45] IS: I do think it will become mainstream because it’s a default option. So if you don’t do anything as an iOS developer, your apps will be available on the Mac immediately. Because you don’t have to opt into it, you have to opt out of it, I think there’s going to be a ton of apps available on the Mac right away.

 

[00:21:03] JP: So speaking of apps available on the Mac, Capsicum is an iOS and iPad application. Are you planning on bringing it to the Mac in full, or what are your plans for the future?

 

[00:21:16] IS: So yeah, the plan for Capsicum is to bring it to the Mac. But instead of just leaving the checkbox checked and having it there by default, we really want to have like a dedicated experience that really takes advantage of the Mac and the much larger screen sizes that are available there, which actually ends up improving the design overall. It gives it a better iPad design, and then that also influences the phone. So in thinking about the Mac, I think it improved our overall implementation.

 

[00:21:46] SY: So developers can opt out of their apps running on Mac, but why would they do that? What would be the reasoning behind that?

 

[00:21:53] IS: Some things might not make sense on the Mac. For example, some games would just be an awkward or unusable experience. Other times, for apps that historically have had a separate Mac version, having the iOS version go directly to the Mac would potentially impact the revenue model. So if you had like an iOS version and then a more expensive, full-featured Mac version, having the iOS version come over now impacts how you were thinking about your revenue. So that could be a legitimate reason also, but I think most of it will end up being that the functionality wasn’t right. And I haven’t looked into this because I’m not heavy into ads, but I don’t know how ads and things like that work; you can have a dramatic difference in how things like that go. For example, we have another app called Boomerang, which is a language translator. On the free tier, it is ad supported. There are some ads, but the way Google charges for translation is per character, including white space. On a mobile device, there’s kind of a limit to how much a person can put in, because they’re kind of limited in that regard. But with a Mac, suddenly folks could do a lot of translating, which is a model that we just really aren’t prepared for. So we’d have to kind of rethink some things like that before we open it up to the Mac.
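As a rough sketch of the pricing concern Ish raises: per-character billing, whitespace included, scales with input size, so desktop-sized inputs change the math. The rate below is a hypothetical stand-in, not Google’s actual price.

```python
# Back-of-the-envelope sketch; the rate is a hypothetical stand-in,
# not Google's actual price. Cloud translation APIs typically bill per
# character sent, whitespace included.
ASSUMED_PRICE_PER_MILLION_CHARS = 20.00  # USD, assumption for illustration

def translation_cost(text: str) -> float:
    # len() counts every character, spaces and newlines included,
    # mirroring per-character metering.
    return len(text) * ASSUMED_PRICE_PER_MILLION_CHARS / 1_000_000

phone_snippet = "Where is the train station?"
desktop_paste = "word " * 20_000  # a long document pasted on a Mac

print(f"Phone-sized input: ${translation_cost(phone_snippet):.6f}")
print(f"Mac-sized input:   ${translation_cost(desktop_paste):.2f}")
```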

 

[00:23:09] JP: Was there any other news from the Apple event that caught your attention?

 

[00:23:13] IS: Honestly, I was surprised at the range of products that were announced. So I was expecting the Air, but I was not expecting the Pro and the Mac Mini. Some of the limitations I was a little bit surprised about. For example, all of the machines max out at 16 gigs of RAM.

 

[00:23:31] JP: Yeah.

 

[00:23:31] IS: And I was like, “Whoa!” For development, I really recommend 32. Maybe there’s something I’m not seeing. Maybe there are some fancy things going on with the controller between the SSD and the RAM. But at face value, just looking at it, I was kind of surprised that everything maxes out at 16. Also, I’ve kind of gone like dark mode on everything, and the Mac Mini is only offered in silver. There’s no space gray. And I thought, “That’s kind of weird.” Just kind of little things like that. Overall, I find it all very exciting. I thought the power consumption was pretty amazing. We’re looking at like 17 to 20 hours of battery life. That’s pretty cool.

 

[00:24:10] JP: Incredible.

 

[00:24:11] IS: Having no fans on the Air was something I was hoping for, but I wasn’t like holding my breath for it. I was like, “Eh, they’ll probably put a fan in it.” But they were able to do it without a fan, and that just makes it like pretty amazing. I was really curious about how the pricing would go, and they were able to keep the pricing consistent for the Air. So I think the student version of the Air starts at 899 with educational pricing. That’s pretty nice for a 17-hour machine. That actually beats like 98% of the laptops on the market. Pretty impressive.

 

[00:24:42] SY: Is there anything else that we didn’t talk about that you want to add?

 

[00:24:45] IS: So, interesting questions about the M1 and Apple’s new direction. A few years ago, I don’t know if you all remember, Meltdown and Spectre were these vulnerabilities in CPUs. They affected the x86 variety, but also ARM. And Apple put out some patches and software to fix it. But at the time, the way I understood it, and I’m not like a hardware guru, so I’m not sure about this, but the way I understood it was there were fundamental issues in the way that modern CPUs were architected that impacted basically a hundred percent of how CPUs run. It didn’t impact the Apple Watch for some reason, but every other product that Apple had, I believe, it impacted. So I’m wondering what happened with the M1. Is the M1 still based on the same architecture? Are there still vulnerabilities there? Or is it protected because of things like the T2 security chip being built inside the SoC? I don’t know, but I’m curious about things like that. Also, about some of the limitations, I’m curious about why we only got 16. Is it because 32 is difficult to do? Is it because they kind of saved something for next year? But I really love it.

 

[00:25:59] SY: Yeah. Thank you so much, Ish, for being here.

 

[00:26:02] IS: Absolutely.

 

[MUSIC BREAK]

 

[AD]

 

[00:26:21] JL: Join over 200,000 top engineers who have used Triplebyte to find their dream job. Triplebyte shows your potential based on proven technical skills by having you take a coding quiz from a variety of tracks and helping you identify high growth opportunities and getting your foot in the door with their recommendation. It’s also free for engineers, since companies pay Triplebyte to make their hiring process more efficient.

 

[00:26:42] SY: Vonage is a cloud communications platform that allows developers to integrate voice, video, and messaging into their applications using their communication APIs. Whether you’re wanting to build video calls into your app, create a Facebook bot, or build applications on top of programmable phone numbers, you’ll have all the tools you need. Formerly known as Nexmo, Vonage has you covered for all API communications projects. Sign up for an account at nexmo.dev/devnews2 and use promo code DEVNEWS2 for 10 euros of free credit. That’s D-E-V-N-E-W-S, in all caps, and the number 2, for 10 euros of free credit.

 

[AD END]

 

[00:27:30] SY: Joining us is Peter Yoo, Director of Research at Synchron, Inc. Thank you so much for being here.

 

[00:27:35] PY: Thanks for having me.

 

[00:27:37] SY: Tell us a bit about your research background.

 

[00:27:39] PY: So my research background is mainly in brain-computer interfaces. This is a technology that’s been around for a while, but it’s been popularized recently. It’s a technology that aims to establish direct communication between the brain and computers. Our immediate applications are trying to alleviate some of the symptoms of paralysis.

 

[00:28:04] JP: So let’s get into the Stentrode implant. Can you describe what it is and how it works?

 

[00:28:09] PY: So the Stentrode is an array of electrodes, electrical sensors, that are basically printed onto a stent. The stent is self-expanding, so this device can be implanted inside a blood vessel. Our brain has many, many blood vessels, which means that we can navigate this device up through a blood vessel into the brain without performing open brain surgery. The Stentrode sits inside a blood vessel that sits next to what we call the motor cortex. This is the part of the brain that controls a lot of your movements and the cognition that relates to movement. Which means that when a person thinks about moving, or plans a movement, or actually performs a movement, our device can sense these motor impulses, these brain signals, using the Stentrode. So we can record brain activity from inside the brain. Then, when we get the signal out of the brain, we can do many things to process it and try to use it to control computers or other devices.

 

[00:29:17] SY: Can you walk us through what went into the development of the device? What was that journey like?

 

[00:29:23] PY: Right. So the Stentrode has been in development for a long time, I think since 2012, maybe even earlier. At the start, it really involved many prototypes, just mounting electrodes onto stents, and we performed many, many benchtop tests and also some animal work to establish some of the safety features and how to perform the surgery, and also to validate the signals that we record from inside the blood vessel. Traditionally, recording intracranial brain activity, which means brain activity from inside the head, has been done through direct contact with the brain or what we call the meninges. Prior to the Stentrode, we didn’t know whether we could record a reliable signal from inside the blood vessels. So a lot of work went into that. And just this year, we finally launched human trials. It’s been building for a long time, but we got approval to perform the first in-human clinical trials in Australia. That’s currently underway. So it’s been a long journey.

 

[00:30:27] JP: What kind of software goes into making a brain implant like this?

 

[00:30:32] PY: So in terms of software, there’s a lot of low-level firmware that goes into the implantable that controls the device functions. But also, on the other side of it, we’re building usable UIs using pretty standard techniques, and because we’re fast prototyping in terms of algorithm development, we’re using Python to develop the decoding algorithms and also to perform some neural analysis to inspect the brain activity that we get out of the Stentrode. And in terms of the UI, at the moment we’re literally just using a WPF app that we custom built, which connects to our decoding API built in Python.

 

[00:31:09] SY: So the device was first developed as a brain machine interface to help people with spinal cord injuries. Can you walk us through how we get from that to controlling computers with our brains? It feels like a significant leap.

 

[00:31:23] PY: Sure. So it happens in several stages. First, we have to extract the motor impulses that I described before, meaning the brain activity that we can capture when you think about moving. So the Stentrode records a signal from inside the blood vessel. Then we can transmit that brain signal out of the body wirelessly. Our current implant has a transmission unit that sits in the chest, subcutaneously; it’s just under the skin of the chest. So we wirelessly pipe the neural data that we sensed out to a computer. Then we perform online processing of that neural data. The brain activity that we record happens at a variety of frequencies. So we can pick out certain frequencies where this activity goes up and down, and we can basically start to characterize what those frequencies look like when the person’s thinking about moving and what they look like when the person is not thinking about moving. Then we can formulate a mathematical algorithm or a model, and we can apply a set of logic to convert those into basically simple computer commands: a click or no click, or a button down or a button up. So it really depends on characterizing the neural signal when the patient thinks about moving and not moving, or making certain movements versus other movements.
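As a toy illustration of the kind of processing Peter describes, and not Synchron’s actual decoder, the sketch below estimates power in an assumed motor-related frequency band and compares it against a trained threshold to emit a click decision. The sampling rate, band, and threshold rule are all assumptions for illustration.

```python
import numpy as np
from scipy.signal import welch

FS = 1000        # assumed sampling rate in Hz
BAND = (15, 30)  # assumed motor-related (beta) frequency band

def band_power(signal: np.ndarray) -> float:
    # Estimate the power spectral density, then average it over the band.
    freqs, psd = welch(signal, fs=FS, nperseg=256)
    mask = (freqs >= BAND[0]) & (freqs <= BAND[1])
    return psd[mask].mean()

def decode(window: np.ndarray, threshold: float) -> str:
    # Motor intention often *suppresses* beta-band power, so a drop below
    # a patient-specific threshold is treated as a "click" here.
    return "click" if band_power(window) < threshold else "no_click"

rng = np.random.default_rng(0)
rest_window = rng.normal(size=FS)  # one second of simulated "rest" activity
print(decode(rest_window, threshold=band_power(rest_window) * 0.5))  # no_click
```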

 

[00:32:46] JP: So you mentioned that the device is in its first-ever human trials. What does that entail? Can you tell us the details of what a human trial is and what kind of work you do in it?

 

[00:32:56] PY: The first-in-human trial in Melbourne at the moment is a safety trial. But on top of that, we’re also investigating a lot of exploratory outcomes, which have to do with efficacy: how well can the patient utilize the system for daily activities? This involves a long screening for the patients to determine whether they are suitable for implantation or not. There are very strict and stringent inclusion and exclusion criteria. Then we implant the patient with our device. So we perform that procedure. After a few days, the patient’s discharged, and then we start what we call training. We start training the patient in how to utilize the system. During this period, the patient will perform a set of movements, and not perform them, across a bunch of different tasks, and then we can start to characterize what his or her brain activity looks like when they start thinking about these sorts of things. And then we can formulate an algorithm to detect those changes, to predict those different thoughts based on their brain activity. So basically, on one hand, the algorithm is learning based on the patient’s brain activity, and then the patient will have to react to that algorithm. So it’s kind of this symbiotic learning between the algorithm and the patient’s brain, and so on and so forth, until it hits a point where both of them can kind of understand each other. From that point on, when the patient thinks about doing something, the algorithm knows and can reliably and correctly predict what the patient’s thinking. Then we can go and convert those into a digital command.
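Here is a toy numpy sketch of that co-adaptive training loop, with every detail invented for illustration: each simulated block yields band-power features for “intend to move” versus “rest” trials, and the decision threshold is re-fit after every block, so the decoder adapts while the patient adapts to its feedback.

```python
import numpy as np

def refit_threshold(move_powers: np.ndarray, rest_powers: np.ndarray) -> float:
    # Midpoint between the two class means: the simplest possible classifier.
    return (move_powers.mean() + rest_powers.mean()) / 2

rng = np.random.default_rng(1)
for block in range(3):  # three simulated training blocks
    move = rng.normal(loc=2.0, scale=0.5, size=20)  # suppressed band power
    rest = rng.normal(loc=4.0, scale=0.5, size=20)
    threshold = refit_threshold(move, rest)
    accuracy = ((move < threshold).mean() + (rest > threshold).mean()) / 2
    print(f"block {block}: threshold={threshold:.2f}, accuracy={accuracy:.0%}")
```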

 

[00:34:29] SY: So what have you learned so far? What are some of the biggest takeaways from this human trial?

 

[00:34:34] PY: I know it’s early days, but so far there have been no serious adverse events, which means the implant has not caused any serious side effects for the patients. All of our patients so far have been able to utilize the system to reliably generate a few commands, discrete clicks, if you will, and they have been able to interface those clicks with existing computer control methods. We like to think about it as us giving the patients agnostic, abstract buttons, if you will. You and I have physical buttons, but we give them these digital buttons that they can interface with whatever digital device they wish. In our instance, we interfaced with an eye-tracking-plus-click system that the patients were already kind of used to, which means that they can navigate to a point on the screen just by looking at it, and they can decide to engage with it using our brain clicks. They can also kind of click and hold, so to speak, by thinking, which means they can zoom into small portions of the screen while they’re holding the button, thinking, of course, of holding the button, and then once they lock onto a target, they can release it. And what we really learned from this is that a lot of engineering goes into the hardware development, but a lot of development is also required in terms of the UI and this user logic: how we can utilize this new way of interfacing with computers, how we can best create software and a user experience that’s actually usable for people who are in need. (See the sketch below for one way the click-and-hold idea can be expressed in code.)
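The click-and-hold behavior implies that per-window decoder outputs have to be turned into discrete button-down and button-up events. Here is a minimal sketch of that conversion; the state names are assumptions, not Synchron’s API.

```python
def to_button_events(states):
    # Emit an event only on transitions, so a sustained "click" state
    # becomes one button_down ... button_up pair (i.e., click and hold).
    previous = "no_click"
    for state in states:
        if state == "click" and previous == "no_click":
            yield "button_down"
        elif state == "no_click" and previous == "click":
            yield "button_up"
        previous = state

stream = ["no_click", "click", "click", "click", "no_click"]
print(list(to_button_events(stream)))  # ['button_down', 'button_up']
```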

 

[00:36:13] JP: That’s a good point. That leads into my next question. I was going to ask, how exactly does this interface with a computer? Is it a custom program? Is it something that is generic per operating system? And how much customization goes into that?

 

[00:36:29] PY: So up to the point of generating these virtual clicks, it’s all custom code. But from that point on, from the computer’s perspective, it’s literally just receiving virtual key presses. It can’t tell the difference between me using my hand to click on a virtual keyboard versus the patient utilizing the brain clicks to generate that same click. That was on purpose, so that patients can interface with existing methods of computer control. Obviously, this is a relatively new technology, so the interaction mechanisms that are currently available are not necessarily purpose-built for these kinds of inputs, but I think there’s going to be a huge need for the developer community to basically create very user-friendly UIs for BCIs in the future. There are slight nuances between us clicking physical buttons versus someone generating these virtual clicks through thinking. There are slight differences, and obviously all the existing methods of interaction and UI are not catered toward these brain clicks, because they haven’t existed before.
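To illustrate the virtual-input point: once a click decision exists, injecting it as an ordinary OS-level input event is what lets existing software treat it like a physical click. The sketch below uses the third-party pyautogui library purely as a stand-in for whatever injection mechanism the real system uses.

```python
# Illustration only, not Synchron's code. pyautogui is a third-party
# library (pip install pyautogui) that synthesizes OS-level input events.
import pyautogui

def deliver_brain_click(decision: str) -> None:
    if decision == "click":
        # The OS, and any app, sees this as a normal mouse click at the
        # current cursor position (which an eye tracker could be steering).
        pyautogui.click()

deliver_brain_click("click")
```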

 

[00:37:36] SY: Was there anything about the trials so far that surprised you?

 

[00:37:41] PY: The system uptake, how quick it was in the patients we’ve had so far. With the first patient, obviously, we were learning a lot, so we were learning as we went. We had a brand-new signal to look at that we’d never had before. So it took a while in terms of deciphering the information out of that and creating a good pipeline to convert these brain thoughts into digital clicks. But once we’d established that, the next few patients were able to basically generate those virtual clicks in the first few minutes, I would say, of trying to use the system, and they were reliably controlling the computer with it by interfacing with the eye-tracking-plus-click system. So that was relatively surprising, in terms of the quick uptake and the level of control they maintained right from the start. That’s the most surprising thing for me.

 

[00:38:35] JP: How individualized is the hardware and the software right now? Is this a solution that multiple patients can use or do the models and the algorithm have to be recreated for each individual person’s brain signals?

 

[00:38:49] PY: So that’s actually an active area of research that we’re looking into. The overall pipeline is the same across everyone, except the part that converts the brain activity into clicks. We call that a decoder, and the decoder consists of a variety of layers. There’s a layer called the preprocessing layer, then there’s a layer called the classification layer, and the last layer is called the logic layer. The first two layers somewhat have to be tuned to each individual, but the tuning process is obviously automatic. So while each decoder is individualized, the pipeline is generic.
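Structurally, the layering Peter outlines might look like the sketch below. The layer implementations here are placeholder lambdas; only the preprocess, classify, logic shape follows his description.

```python
# A structural sketch of the layered decoder described above; the layer
# contents are invented placeholders, only the shape follows the interview.
class Decoder:
    def __init__(self, preprocess, classify, logic):
        self.preprocess = preprocess  # tuned per patient (e.g., filtering)
        self.classify = classify      # tuned per patient (e.g., threshold)
        self.logic = logic            # generic mapping to commands

    def __call__(self, raw_window):
        features = self.preprocess(raw_window)
        state = self.classify(features)
        return self.logic(state)

decoder = Decoder(
    preprocess=lambda w: sum(w) / len(w),  # placeholder feature extractor
    classify=lambda f: "click" if f > 0.5 else "no_click",
    logic=lambda s: {"click": "BUTTON_DOWN", "no_click": None}[s],
)
print(decoder([0.9, 0.8, 0.7]))  # BUTTON_DOWN
```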

 

[00:39:30] SY: It might be maybe too early to say this, but how far away are we from devices like this being widely used and widely available to the general public?

 

[00:39:40] PY: I think that’s going to be a little while in terms of the general public. We’re starting off with paralyzed individuals to see if we can improve their quality of life with this technology. But in terms of the general public, I think it’ll be a little while away.

 

[00:39:55] SY: Decades? Years?

 

[00:39:57] PY: I would say upper-end of years.

 

[00:39:59] SY: That’s not too bad, actually.

 

[00:40:02] JP: It’s actually good I think.

 

[00:40:03] PY: I’m being optimistic.

 

[00:40:05] SY: Okay. Thank you so much, Peter, for being on the show.

 

[00:40:08] PY: Thank you very much.

 

[00:40:20] SY: Thank you for listening to DevNews. This show is produced and mixed by Levi Sharpe. Editorial oversight by Peter Frank, Ben Halpern, and Jess Lee. Our theme music is by Dan Powell. If you have any questions or comments, dial into our Google Voice at +1 (929) 500-1513. Or email us at pod@dev.to. Please rate and subscribe to this show on Apple Podcasts.