June 6, 2023

Beyond Automation: The Future of Work With Autonomous Drones -- Transcript & Video

In case you missed our most recent webinar with Commercial UAV, Beyond Automation: The Future of Work With Autonomous Drones, you can catch the full recording and transcript below. 



Kelsey: Hello everyone. Welcome and thank you for joining us for today's webinar, Beyond Automation: The Future of Work With Autonomous Drones, brought to you by Commercial UAV News and sponsored by Exyn Technologies and Easy Aerial. We have just a few housekeeping notes to go over before we officially get started. If you have any technical questions or issues using the WebEx platform, please use the chat box and I will respond right away. If you have any issues with audio, please click on the phone icon above the chat window to receive the teleconference information. For those that do call in, to ensure call quality, everyone's lines have been muted today.

Kelsey: If you happen to get disconnected at any time, you can log on again using the instructions provided in your webinar confirmation email. If you continue to experience difficulty, please email info@expouav.com and we will respond as soon as we can. We encourage you to ask questions throughout the presentation. Please type your questions into the Q&A box and select all panelists when submitting your questions. Our presenters will answer questions at the end of the presentation today, but please do not hesitate to ask your questions during the presentation as they occur to you. This webinar will be recorded and you will receive a thank you email with the on-demand materials within two business days. And with all of that out of the way, I would like to introduce our moderator today. We have Lynn Jean-Louise, product marketing specialist at Exyn Technologies. And at this time, I would like to welcome our speakers and hand the webinar over to Lynn. Welcome Lynn.

Lynn Jean-Louise: Thank you Kelsey, and welcome everyone to the CUAV Webinar Beyond Automation: The Future of Work With Autonomous Drones. Thank you again to the CUAV team for having us, and thank you to all participants who are choosing to spend their afternoon with us today. We're thrilled to be here with you, and we have some great content coming your way. So yeah, like Kelsey said, my name is Lynn Jean-Louise. I'm a product marketing specialist with Exyn Technologies, and I'll be helping guide our session for today. So before we get started, if you have any questions along the way, I wanna echo what Kelsey said, I'd love to hear them. Feel free to drop them into the chat, specifically under the Q&A box. We'll try to address some along the way, but just as an FYI, there is some time specifically for Q&A at the end of our session. So, I wanted to start off today by introducing our speakers. You'll be hearing from Nader Elm, who is Exyn's co-founder and CEO; Raffi Jabrayan, who is Exyn's VP of Commercial Sales and Business Development; and we have a representative from one of our company partners here as well, Jason Fry, who is from the Easy Aerial team and leads their work in government as director of DOD and federal programs. Welcome to you all. I'm excited to get the ball rolling. So let's kick off. Could you tell us a bit more about your background and how you ended up in the drone business? I'll start off with you, Nader.

Nader Elm: I forgot to unmute. [laughter] Good afternoon, good morning everyone. So, my path to, my exposure to the drone industry was actually quite coincidental. I was gainfully employed in a variety of other deep technology companies, and my background is very much in commercialization of technologies and launching new products. So, I was introduced to a gentleman by the name of Dr. Vijay Kumar, who was leading some groundbreaking research at the University of Pennsylvania focused on autonomous robotics generally. And once I met him, I kinda saw some of his research. That's basically how I got introduced to autonomous drones. And Vijay and I co-founded Exyn. That was back in 2014.

LJ: Great. Over to you Raffi.

Raffi Jabrayan: Thank you Lynn. Hello everybody. I'd say my introduction into the drone industry was quite coincidental as well. I was in the mining industry for a number of years and about five years ago I was running digital innovation for a mining company and I was at a mining event in Toronto and a bunch of young, eager engineers and the like approached me and they said they had these phenomenal drones. And that team ended up being Exyn. And now a short time later I joined. And it's been a fantastic four years now. Pass over to Jason.

Jason Fry: Hi, good morning, good afternoon everybody, I'm Jason Fry. I'm the director of DOD and Federal programs here at Easy Aerial. My path to drones was more straightforward. I got an aerospace engineering degree and went and worked in guidance, navigation and control on suborbital and orbital launch platforms. I moved over into hypersonic missiles for a time period and then decided I wanted to try and find more of a quick, fast-paced startup atmosphere, and I found Easy Aerial and have been loving it ever since.

LJ: Awesome. Thanks for sharing that everyone. You all have such a wide array of backgrounds that landed you here in the same place. Happy to have you. Let's shift gears into what we're really excited to talk about today, the future of working with autonomous drones. So, this first question goes to Nader. With all this talk about autonomy and drones and AI, could you provide some insight into what autonomy actually means, specifically within the context of a drone?

NE: Yes. So, obviously we're gonna be talking about autonomy and its application in the drone world. But generally, when we talk about autonomous robotics, and that's the research that was going on at the University of Pennsylvania, they look at it more generally and basically try to emulate what you and I normally do without really much forethought in terms of navigating ourselves through different spaces. So, as we go through the presentation, again, I'm trying to be conscious of the fact that we're talking about this more generally, but we'll be using specific examples, obviously, of what Exyn and Easy Aerial have done. So, if you wanna go onto the next slide, Lynn. One of the things, really just to set the stage, the interesting thing to know is that the word robot actually just celebrated its 101st anniversary this year.

NE: It was first used in a play back in 1921. And every imagining of the future has included robots in it. And what you're seeing over here are actually some renditions of a variety of different kinds of robots doing a variety of different things. And this was really the inspiration for me when I met Vijay. I'm a big movie buff, and all of these futuristic science fiction movies have included these kinds of robots, and they're doing some really, really remarkable things. So, what you'll see at the top left, obviously, is WALL-E, you've got companion robots, assistant robots on the bottom right. And one of the interesting things to notice across all of these: they're all robots doing things, but at the core, common to all of them is the fact that they're all fully autonomous.

NE: So, it's a rather convenient assumption that the robots are autonomous and then they are doing something. So, this is science fiction, and really it is an inspiration for a lot of us technologists to actually start commercializing some of this. And what you'll notice on the bottom left, as an example, is a scene from Prometheus, which has an orb going through an alien ship in an alien world, mapping it comprehensively so that the astronauts can figure out, okay, what to do. And while that's science fiction, it is in large part what we are doing today. And we've commercialized that and it's been in market for over four years. So, we are at the cusp of turning science fiction into science fact. And that's the exciting thing about autonomous robotics generally, and its first instantiation is in the aerial world.

NE: Obviously, we wanna kind of see these get developed further, though not so much the one in the upper right, for those folks who recognize it; there will be some concerning applications as well. So, we're kind of very conscious about how these things get built and deployed. But essentially, I think we're at the beginnings of some very, very interesting developments in autonomous robotics. So, if you wanna go onto the next slide, Lynn. One of the interesting things when we talk about autonomy is, it's a big word and it covers a fairly large universe. And the unfortunate thing about the word autonomy not having much definition is that everyone claims to have it, and not disingenuously. So, I think what it really requires is some definition and some scoping of the different levels.

NE: So in the driverless car world there's an organization called SAE that's defined different levels of autonomy specifically for driverless cars. But we don't really have any definition outside of that. So, what we set about doing was, we wrote a white paper, this is over a year ago, trying to give it some definition and leveraging the good work that the folks at SAE had already done. So we took their framework as inspiration, and we published a white paper describing these different levels. And it's been recognized by the IEEE. So, we took a very academic approach to this, really just to put some clear definitions in place. So, I won't go into too much detail. We've got a link to the white paper over here, so you're more than welcome to dive into it, which will go into the details.

NE: But essentially the idea here is basically you have vehicles. So a drone is a vehicle, and in its initial early stages, it's very much about the pilot flying the system without really any assistance. And we've had drones in market for well over a decade, commercially available as well as for the consumer. But over these past few years, we've seen different functions and features added to the drones to make it easier for the pilots to operate the system. And that's making it a lot more accessible, a lot easier to use. And that's what we are seeing in level one to level three. The real autonomy begins at level four. And this is again similar to what's happening in the car world, where it's about removing the pilot entirely from the inputs to the vehicle in terms of how it flies, and effectively giving it a high-level mission to accomplish, and then leaving it to the robot to break that down into how to actually navigate through the space, accomplish its mission, know when it's done, and then come back.

NE: So, where we're at in the industry actually is level 4A, moving to level 4B. But we still have a bit of a journey to go to fully autonomous systems that are robust and reliable in every kind of environment, which is ultimately level five.
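
For readers who like to see ideas in code, here is a minimal, purely illustrative sketch of how the levels described above could be represented; the names and one-line descriptions are assumptions for illustration only, not Exyn's or SAE's official definitions, which live in the white paper mentioned above.

```python
from enum import Enum

class AutonomyLevel(Enum):
    """Illustrative autonomy levels, loosely paraphrasing the framework
    discussed in the webinar (inspired by SAE's levels for driverless cars).
    The labels below are an assumption for illustration only."""
    AL1 = "pilot flies the vehicle with no assistance"
    AL2 = "pilot assistance features (e.g., stabilization)"
    AL3 = "advanced assistance, but the pilot still provides flight inputs"
    AL4A = "autonomous: high-level mission, no pilot inputs, bounded settings"
    AL4B = "autonomous exploration of unknown, infrastructure-free environments"
    AL5 = "fully autonomous, robust and reliable in every kind of environment"

def pilot_in_the_loop(level: AutonomyLevel) -> bool:
    """Per the discussion above, real autonomy begins at level four:
    the operator gives a mission, not flight inputs."""
    return level in (AutonomyLevel.AL1, AutonomyLevel.AL2, AutonomyLevel.AL3)

print(pilot_in_the_loop(AutonomyLevel.AL4A))  # False: the robot flies itself
```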

NE: We'll show some videos a little later on in this presentation, so you actually see demonstrations of what we're doing in autonomous robotics, aerial robotics, in very, very difficult, dangerous environments. So, if we move on to the next slide, one of the interesting things to note is that any kind of vehicle typically has, well, tethers to various things that help it navigate, help it actually control its direction. So, in order to have autonomy, you systematically have to sever all of those tethers. And what we're describing over here is essentially partly that. So this is a Commercial UAV presentation, so it's an unmanned aerial vehicle. So obviously, you know, it's pilotless, there's no pilot onboard the aircraft, but there is a tether back to a controller that the pilot is feeding instructions through.

NE: And that's essentially the first thing, making sure that there is no pilot in the loop as the robot flies. But we go further than that, because I talked about the various features that are coming on board these drones, which make them more accessible, easier to use. And this is in large part infrastructure that's put into place to help the robot in its navigation. Things like, for example, GPS. This is critical in robotic systems because one of the first questions a robot has to ask and answer is, where am I? And GPS really simplifies that, because you ask the question, satellites bounce back and kind of tell you exactly what your X, Y, Z coordinate is. And that then establishes where you are, you know where you want to go, and then you use GPS to kind of feed back your progress along that journey.

NE: So, we have severed that tether and basically said, "Okay, what does it look like for the robot to answer that question like you and I do as we navigate through a building, kind of figuring out, okay, where are we, where do we need to go and how do we get there?" The other one is basically communications more generally, whether it's WiFi, LTE, cellular communications, or what have you. One of the other things typical robotic systems do, and again, driverless cars, obviously they rely on GPS, but the other thing is they offload certain compute, because to have everything on board is really, really hard. But if you're gonna be truly autonomous, you have to have all the intelligence on board with no reliance on anything off board. So again, you have to make the assumption that you may have communications in parts of the flight, but you may lose it.

NE: So, without any communications, the mission should still prevail. And that is another key component for full autonomy. And with that, we've systematically untethered the system. So, it basically means all the intelligence obviously is on board. You can set the mission, and the only time we need communications with the robot is when pushing the mission to the robot. And once you press play, all the intelligence on board has to be able to repeatedly, reliably execute the mission. And in order to do that, it basically means that you operate beyond line of sight. You enable a variety of different kinds of missions for the robot to go and execute without a pilot in the loop and without putting an operator in dangerous situations. But the other side effect of that is it basically makes it a lot more usable. So, you as the operator don't have to be the expert in driving the vehicle and robotics, because all the intelligence is on board.
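
As a rough sketch of the operating pattern described here: the only communication needed is pushing the mission, and everything after "play" runs on board, so a dropped link doesn't stop the mission. This is a hypothetical illustration, not Exyn's software; the class and method names are placeholders.

```python
import time
from dataclasses import dataclass

@dataclass
class Mission:
    """A high-level goal, e.g. 'map everything inside this bounding volume'."""
    bounding_volume: tuple  # (x_min, y_min, z_min, x_max, y_max, z_max)

class OnboardAutonomy:
    """Hypothetical onboard executor: all sensing, mapping and planning
    happen here, with no reliance on an off-board link after launch."""

    def __init__(self):
        self.mission = None

    def upload_mission(self, mission: Mission):
        # The only moment a comms link is required.
        self.mission = mission

    def run(self):
        # After 'play' is pressed, this loop never consults the ground station.
        while not self.mission_complete():
            pose = self.estimate_pose()       # "Where am I?" answered on board, no GPS
            self.update_map(pose)             # fuse the latest sensor data
            waypoint = self.plan_next_view()  # decide where to look next
            self.fly_to(waypoint)             # low-level control, also on board
            time.sleep(0.1)
        self.return_home()

    # Placeholders standing in for the onboard SLAM, planning and control
    # stack; they are assumptions for illustration only.
    def mission_complete(self) -> bool: ...
    def estimate_pose(self): ...
    def update_map(self, pose): ...
    def plan_next_view(self): ...
    def fly_to(self, waypoint): ...
    def return_home(self): ...
```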

NE: And so, that's basically it. What you see on the right hand side over there is a tablet, and the operator input kind of goes into that. Once you feed the mission to the robot, that is the extent of the operator interaction. Our operators are typically experts in something else and they use the robots as a tool for what they need to do, which is principally data collection. So, if you wanna move on to the next slide. So, this is what the robot looks like. Again, we're exhibiting our robot, but fully autonomous robotic systems in general will have pretty much the same components on board. Now, what we want to illustrate over here is basically all of those things. So, what you'll see over here is there is a drone component. So that is the frame, the motors, the rotors, the legs, the battery.

NE: So, the big cube that you see hanging down, that is the battery that feeds energy to the vehicle itself. But there's more incorporated into this. And by the way, this is the collaboration between Exyn and Easy Aerial, where we've done a tight integration of the hardware components, and then the integration of the software to have the fully autonomous system. So, the other components we need to put into this are obviously the communication system, the thing that communicates with the tablet so the operator can interact with the robot; and the compute on board. So, all the software algorithms for sensing the environment, building maps, navigating and establishing its path in those various different environments, feeding the control commands to the vehicle on where to go, and then kind of the iterative loops around all of that, all of that happens on board. So, the compute components are all on board, and there's a variety of different sensors. What you're seeing at the front with the Exyn logo is a LiDAR that effectively generates 600,000 points every second and is covering the entire environment. And that's how it builds its map.
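
To give a feel for what turning a LiDAR stream into a map can involve at the very simplest level, here is a toy voxel-grid accumulator. It is an assumption for illustration only; the real onboard pipeline is proprietary and far more sophisticated.

```python
from collections import defaultdict

VOXEL_SIZE = 0.1  # metres; purely illustrative resolution

def voxel_key(x, y, z, size=VOXEL_SIZE):
    """Map a 3D point to the integer index of the voxel that contains it."""
    return (int(x // size), int(y // size), int(z // size))

def accumulate(points, grid=None):
    """Insert a batch of LiDAR returns (already in the world frame)
    into a sparse occupancy grid. 'points' is an iterable of (x, y, z)."""
    grid = grid if grid is not None else defaultdict(int)
    for x, y, z in points:
        grid[voxel_key(x, y, z)] += 1  # hit count per voxel
    return grid

# Toy usage: one simulated sweep of a few returns.
sweep = [(1.02, 0.31, 2.11), (1.04, 0.29, 2.13), (5.60, 3.20, 1.90)]
occupancy = accumulate(sweep)
print(len(occupancy), "occupied voxels")
```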

NE: So, all of these components have to be deeply integrated, working in concert with each other to solve some very, very real problems and answer very difficult questions in real time, all on board. And this is basically what we have over here: a fully autonomous robot that goes in for the purpose of data collection, in terms of building digital three-dimensional models, digital twins, of different kinds of environments. So, if you go onto the next slide. Oh, yeah. So, at this point, that is essentially the overview of autonomy and the platform, the robot itself. So, the next natural thing is, okay, so what's its application? And the initial application for us is in underground mining, and at this point I'm gonna hand over to Raffi so we can describe the use case in mining. Over to you, Raffi.

RJ: Thank you Nader. Lynn, you can go ahead and start the video whenever you'd like. As Nader mentioned, one of the first places that we operated, and where we still continue to operate the most, is in underground mines. And this is where the AL4, the Autonomy Level 4, really comes into play. Whereas before a lot of the autonomous drones or robots were basically going point to point, we've gotten to a place where we're able to give it a general area to go to, and it'll map the entire area and return home once it's ready. What we see here is the view that the actual drone is seeing. So, the white voxels that you are seeing there are what the drone is seeing. And anywhere that is not mapped, it's gonna go and try to complete it before it comes in.

RJ: Now we're able to fly at quite a high speed, at two meters per second and faster. And the more interesting thing is trying to get to areas, or being able to get to areas, which were previously completely unknown to the operators, previously unknown to the actual mining operations or whatever operations that you may have. So, we're able to get out there, map enormously large areas very, very accurately in a very efficient manner, as Nader mentioned, without having any tethers whatsoever. This is vastly different from what was around in the industry many years ago, where you basically had to tell the robot where to go, what to do, how to come back. We don't need to do that. Our robot is able to go and map an area, and if it needs to change course on its way back because something is coming its way, or it has seen somewhere where it can go and come back and optimize its route, it's able to do that as well.
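
The "go to a general area, fill in whatever isn't mapped yet, then come home" behaviour is, at its heart, a frontier-exploration loop. Here's a toy 2D sketch of that idea; it's purely illustrative and not Exyn's planner, and the sensor model is a deliberate simplification.

```python
# Grid convention for this toy example: dict mapping (row, col) -> cell state.
UNKNOWN, FREE, OCCUPIED = -1, 0, 1

def find_frontiers(grid):
    """Frontier cells: known-free cells adjacent to at least one unknown cell."""
    frontiers = []
    for (r, c), state in grid.items():
        if state != FREE:
            continue
        neighbours = [(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)]
        if any(grid.get(n) == UNKNOWN for n in neighbours):
            frontiers.append((r, c))
    return frontiers

def sense_and_update(grid, position):
    """Toy sensor model: visiting a cell reveals its unknown neighbours as free."""
    r, c = position
    for n in [(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)]:
        if grid.get(n) == UNKNOWN:
            grid[n] = FREE

def explore(grid, start):
    """Fly to the nearest frontier, sense, repeat until nothing reachable is
    left unknown, then return to the launch point."""
    position = start
    while True:
        frontiers = find_frontiers(grid)
        if not frontiers:
            return start  # mission complete: come home
        position = min(frontiers,
                       key=lambda f: abs(f[0] - position[0]) + abs(f[1] - position[1]))
        sense_and_update(grid, position)

# Toy usage: a short corridor where only the launch cell is known.
corridor = {(0, 0): FREE, (0, 1): UNKNOWN, (0, 2): UNKNOWN, (0, 3): UNKNOWN}
home = explore(corridor, start=(0, 0))
print(corridor, "-> returned to", home)
```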

RJ: And where this comes into real life applications, next slide please, Lynn, this is where it really shows us what we can do. So, not only do we map stopes and so on, this is a real life case where we were called a couple years ago to a certain mine where unfortunately they had had an underground incident, and they needed to go and map an area where realistically it would be very difficult for people to get to. So you're seeing a completely autonomous flight here. The drone is going down a drift, it makes a sharp right, and at this point, it's beyond line of sight. So, the operators are not able to see where the drone is going. Now, you can see there's a multi-level collapse here. So, this is not something that happens in mines every day. However, when it does happen, generally it's the brave men and women from the mine rescue team who are forced to go into these areas to basically save their fellow miners, or go in there and see if it's safe to go back in.

RJ: What we were able to do is go in and collect all this data in a matter of mere minutes. This flight is approximately six or seven minutes. Now of course, we've sped it up here, but you see how it's able to traverse very, very difficult areas. There's ground support all over the place. There's mesh, there's protruding steel rods. It's able to go and completely get a map and return home. And here's where the interesting part is. As it's returning home, you're gonna be able to see in the far distance where the mine rescue and the survey folks are. So they're over 100 meters away under fully, fully supported ground where there is no risk to them. And before they go in and before they take the next steps, again whether it's a rescue operation or whether it's just basically seeing if it's safe to go back in, they not only have a live view or a video of what has happened, they now have a full 3D map, a 3D image, of what they're getting into. Next slide please, Lynn.

RJ: And these are some of the current applications that we have. We spoke about underground mining. Now again, of course we've talked about a couple of the things that we do, but the underground market is very, very large and there are many different things that you can do there. So not only, of course, are we going and doing mapping of the stopes, we map the actual drifts, we're able to go in there and do underground stockpile reconciliations, we're working in mine rescue, and there's plenty more coming down the pipeline that we're able to get to. But it's safe to say that Exyn has revolutionized the way that underground mine mapping has been done. And again, not only are we able to do it, as I mentioned earlier, much more accurately, but we're able to do it much more efficiently and much more safely, which is one of the key components that we've always talked about: ensuring that the people who are going to work and using our product are able to work in a safer environment.

RJ: There's also a lot of work that we do with the government. This is something that I won't touch on too much, but again, as far as gathering maps in very, very difficult areas, I would say this is something that we excel at. Something that we've also moved into recently is infrastructure inspection. And this is a lot of the work that's being done around bridges, silos, wind turbines and things of that sort. This is very, very important work where, again, we're able to do things in a matter of hours that used to take days, if not weeks. And of course the amount of effort that goes into this, from a resource perspective, from a personnel perspective, we're able to reduce greatly. This may bring up the question: is automation, is robotics, taking jobs? Definitely not. What we're doing is we're taking people who are doing work and we're retraining them to do different work. So, in the mining sphere, for example, a surveyor is still a surveyor, but instead of being underground for four or five hours, lugging heavy equipment, going into dangerous areas, they're underground for maybe 30 minutes, working with state-of-the-art technology, and spending the rest of the time on the surface analyzing data.

RJ: And of course there are different areas we work in as well. Energy, warehousing, so on and so forth. There's a vast array of things that we're able to do, and again, please feel free to check out our website and see some of the areas that we're working in as well. And of course, you can reach out to us anytime as well. I'll pass it back over to Lynn.

LJ: Thanks, Raffi. Just unmuting. All right. So, for folks in the audience, by now you're probably thinking, between what you've heard in this session and what you've heard from general buzz in the industry, that there are likely some rules and regulations around how drones and AI are to be used commercially and even within the government. So, this question goes to Jason. Could you shed some light on that for the audience?

JF: Yeah, thank you Lynn. So I'm gonna spend a few minutes talking about regulations within the sUAS sphere, and then also in relation to AI. A lot of what we were talking about was subterranean or indoor operations; once you move from those indoor, subterranean operations into outdoor operations, your regulator changes. So when you're outside in the US, you're under FAA airspace generally, most commonly. So with that, then you have to make sure that you're operating within those regulations safely and legally. The main overarching document at the FAA is 14 CFR Part 107, more commonly referred to as Part 107, or just 107 generally. That guide lays out a framework of how to operate UAS systems in public airspace.

JF: So, the FAA has actually set up a really nice website called Drone Zone. It's real easy to google. You can go on there and it has explanations and examples and all kinds of information for you to use so that you can understand and make sure that you're operating properly when you're going out and doing UAS operations. I'll walk through a little bit of just generally what the requirements are for activities with drones like Easy Aerial's and Exyn Technologies'. So generally speaking, you would need to have a Part 107 licensed pilot. That's a test that you can take to get that license, and that license holder would then register with the FAA on one of these sites. Then you would need to register your aircraft. And then a new thing that's come down over the last year is Remote ID, which is installed in Exyn Technologies drones as well as Easy Aerial platforms, and that would also need to be registered with the FAA.

JF: So, once all those initial registrations are taken care of, then you're ready to start looking at your airspace and go operate. There's lots of different types of airspace, from uncontrolled or unrestricted to heavily restricted or complete no-fly zones. A lot of this varies based on where you're located, your proximity to airports, big cities, all this different stuff. So there's a bunch of tools available from the FAA, and also some private tools, that help you identify a location and then what type of airspace you're trying to fly in. If you're unrestricted or uncontrolled, then it's pretty straightforward, you're pretty much good to go and fly. But as the restrictions increase with the type of airspace you're in, then there's a process that you can apply for that usually takes about 24 hours, to get what's called a LAANC, L-A-A-N-C. It's a Low Altitude Authorization and Notification Capability. I'm not sure why it's called that, but it is.

JF: But the permit that that gives you is that you can fly up to 400 feet, at 100 miles an hour, not over people, and within line of sight. So, that's kind of your baseline UAS operations. It gives you a pretty good envelope to operate in and it covers a lot of stuff. But as the drone industry and our customers and everybody are getting into more and more exciting applications and doing all sorts of different things, we're starting to wanna go outside of those initial applications. We wanna go beyond the visual line of sight. As Nader was saying earlier, we wanna fly above people. And so there are further processes, basically applications that you can do with the FAA, like BVLOS waivers or COAs, where you can basically apply to stretch those boundaries.
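
As a quick way to internalize that baseline envelope, here's a trivial, illustrative check using the numbers Jason quotes; it's only a sketch, so always verify against the current FAA rules rather than a code snippet.

```python
def within_baseline_envelope(altitude_ft: float,
                             speed_mph: float,
                             over_people: bool,
                             within_visual_line_of_sight: bool) -> bool:
    """Rough baseline as described above: up to 400 ft, up to 100 mph,
    not over people, and within visual line of sight.
    Illustrative only; waivers (e.g., BVLOS) change these limits."""
    return (altitude_ft <= 400
            and speed_mph <= 100
            and not over_people
            and within_visual_line_of_sight)

# Example: a 350 ft inspection flight at 30 mph over an empty site, kept in sight.
print(within_baseline_envelope(350, 30, over_people=False,
                               within_visual_line_of_sight=True))  # True
```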

JF: To hit on AI within this kind of structure: I think AI is going to be... It already is, and I think it's going to continue to be even more relevant, in that a lot of AI is providing extra safety. I don't know that it was originally intended to do that; it was originally kind of going out to collect data and do these other things. But as the AI has gotten smarter, it's basically building in a lot of safety features. And I think as the AI gets more robust, with object avoidance type of things and different kinds of scenarios to make sure that the UAS system is operating in a manner that's at a maximum safety level, what that allows is for the FAA to make easier approvals of your different waiver applications when you go through those processes. I think that that covers it. And feel free to ask questions. It's a big world, so there's a lot of different stuff.

LJ: Thanks for bringing some more information around those standards and regulations, Jason. While I still have you, I kind of have a follow-up question. We're gonna shift gears a little bit. So, I'll pose this question to you and Nader; after Jason, please feel free to jump in as well, Nader. I'm curious on your take about this. So, you've all provided a lot of great detail on how autonomous systems work, which environments they can be helpful in, and some information on mandates around system use. So, thinking more broadly, what are some ways that you see this technology being advanced and applied in the future? What's up next for autonomy, AI, and drone platforms in general?

JF: So, I think that... I go to quite a few conferences and various different things, and every time I go to a new event or a new place, or talk to a new person, there's somebody with a new payload and a new idea. It's amazing how fast this industry is moving. The number of use cases that people are coming up with now, it seems like it's just exponential, and different ways to use them, from security and inspection and search and rescue and all these different groups or areas that are applicable. So, I think the future of our platforms and systems and our partnership with Exyn Technologies is kind of expanding the fleets, the actual hardware itself, and expanding the payload capabilities of those hardware systems. Easy Aerial does a lot of drone-in-a-box solutions, so we provide hangar systems that can actually house the drone and be mounted on vehicles and things like that as well. And then I think the complexity of operations being accomplished is also going to increase. So, big swarms and stuff like that are very much in the news. And I think what we're going to start seeing is full facility solutions where there are drone airfields that have sets of drones or groups of drones that are all doing different operations at the same time.

JF: All those operations are being controlled remotely from a centralized control center that's doing BVLOS, Beyond Visual Line Of Sight, operations. And what this allows is you can have multiple specialists or a single specialist in a single location that are actually utilizing multiple different sensors at different times. So, it allows the work to be accomplished much more easily when you're able to do these things. So, kind of a facility example would be, you have some security drones, you have some drones performing inspections, you have other drones doing inventory management on that facility. And all those are operating in an autonomous manner, and then the operators are able to just focus on the payload. That's really the key. And Nader was talking about that too, it's taking the flight portion out of the drone system so that the operators can focus on what they're really trying to do, which is collect data or find a person that jumped over a fence or something like that. Over to the next person.

LJ: Yeah. And I'll pass to Nader as well, if you have any feedback on this?

NE: Yeah, so for us, as we say, there's the synergy of the intelligence that goes on board robots, aerial robots as well, and the underlying hardware, the vehicle itself. So, there's gonna be a continuous and intimate relationship between those two going forward. So, as we look at all of these applications, we see there are certain use cases that will require pilots in the loop, and other ones, I think, using autonomy as a multiplier, or as an effective way to kick off missions and keep people out of harm's way. So, we continue with that thinking. We're now looking at, okay, so where do we go from here, beyond the ExynAero, to autonomous robotics more generally. So, if you want to go on to the next slide. Where we are right now is in the middle. So we've got an aerial system going into the kind of most difficult of environments. We were very purposeful about going into mining and using aerial robotics, because we wanted to solve the three-dimensional problem first, and then put it into uncharted, no-communications, no-GPS, no-prior-information environments.

NE: And what better way to showcase that capability than in underground mining? But that is really the tip of the pyramid. As we start looking at applications and more general use of the systems, we start looking at different kinds of platforms, different environments, and even different transport modalities. So, what you'll see to the left of the center is we've taken our intelligence and put it onto a variety of different airframes. What you're seeing over here is a very, very different one. It's a coaxial, so it's two rotors spinning on top of each other, much like the robot that was flying on Mars. And so, now you've got this smaller format robot that can fit through doorways, which was one of the questions. So, we're porting the capability and putting it onto different kinds of aerial platforms, different configurations of platforms. But then going beyond that, so if we continue left, we've solved the three-dimensional problem. But what would it look like to actually take our intelligence and put it onto ground robotics, which lives in a two-dimensional plane? So, what you're seeing over here is the beginnings of our work in collaboration with Boston Dynamics and Trimble, where we're using our intelligence to guide Spot in its mission to go and collect information on various kinds of projects like construction, infrastructure, and so on.

NE: And again, Spot already has some rudimentary forms of autonomy on board, so that has its own applications. But if you need that high level, AL4, then there is the opportunity to put our intelligence on that. So, three dimensions moving to two dimensions, and we see other universes that we can... other domains that we can actually address as well. We've had a lot of interest, and over time, we will be moving into those kinds of environments, one of which is underwater. So, submersibles right now are, again, very much guided, but with full intelligence on board, you'd be able to deploy them in a variety of different situations like hull inspections or port inspections, inspecting critical infrastructure like power lines or underwater communications lines that need continuous monitoring and maintenance. And then taking it from underwater all the way into space, going back to the original slide where we were talking about robots in space, where right now we've got constellations of satellites that have been launched for communications.

NE: These are the CubeSats that SpaceX is putting into orbit, low earth orbit. For all of those right now, the focus has been very much on the launch systems, the deployment of all of these CubeSats. But they'll all need maintenance. They'll all need refueling and de-orbiting. And right now, that relationship, as Jason mentioned, the relationship between an operator and a vehicle, that one-to-one relationship really isn't scalable. So that's where having autonomous robots in space comes in: robots that can actually navigate themselves into close proximity of a target, and then hand over to an operator to do the maintenance, refueling and de-orbiting, or ultimately maybe even do it fully autonomously without a pilot in the loop.

NE: There are applications even in space: asteroid mining, moving onto the moon. There are all of these missions currently being contemplated by all these space agencies. So, we look at autonomous robotic systems right now and their applications in industries, and very linearly we're kind of saying, "Okay, here's what we're doing right now. We can replace it with a robot." But over time, I imagine that autonomous robotics will continue to develop in different areas, but will also inspire a whole bunch of things we haven't thought of yet. And the best way I can characterize that is basically, when, for example, we had cellular communications, we had telephones, and then BlackBerry came out with the internet-connected phone. Emails were coming through, but it was still very much linear thinking, and it led to the advent of the iPhone.

NE: And the breakthrough apps on the iPhone were email and a browser. But over time, GPS was integrated and maps appeared on these devices. And only then could an application like Uber exist, because it needed that underlying set of technologies to be integrated before breakthrough applications like Uber could be defined, designed, and launched, leveraging all of that underlying capability. So, I imagine the same will be happening over time here as well. I imagine the vehicles and the intelligence serve as a foundational element for some pretty wonderful and crazy new applications and use cases coming up in the future.

LJ: Awesome. Thanks Nader for providing some more information around that bigger picture. And from my perspective, robotic systems and drones in general used to be something that felt so far-fetched and decades away, and it's remarkable to see such advanced technology being created within our lifetime. So, at this point, we're gonna transition into our Q&A. I hope that you've enjoyed what you have heard so far. On the screen, you'll see a couple of QR codes for Exyn and Easy Aerial that you can scan to get to our websites and our contact pages. But I'd like to kick off today's Q&A session. I believe there are some questions that came through in the chat, so I will start with those. One second. Great. Okay, so it looks like this first question is going to you, Raffi. Raffi, are you seeing 5G and computer vision technologies further advancing the capabilities in the autonomous UAV domain?

RJ: Yeah. Thanks Lynn. That's a great question, Alex. So, most definitely. I mean, with everything that we're seeing with computer vision, other than the videos that are coming out right now, the deep fakes, realistically there is a whole new world that is gonna be opened up for companies who are able to capture data, both from a video perspective and from 3D imaging and so on and so forth. And of course, how does 5G fit into it? It's just being able to transmit that data in a much faster way. Now, we're seeing a lot of advances with these networks actually in the mining space. So, underground companies are installing connectivity networks. So, the future of what we're gonna be able to see is, obviously we go and capture the data, our computer vision's gonna be able to analyze this data and, with robotic process automation, have a quick analysis done of it and have that sent to the surface immediately.

RJ: What that does is get to the core of short interval control. So what the mining companies have been talking about for years, being able to make decisions on the fly, as opposed to having decisions made next shift, next day, a week from now, is gonna completely change. And of course, this isn't limited just to mining, that's just where we operate the most. But obviously when you're talking about some of the potential government applications, some of the potential infrastructure applications, probably the computer vision stuff and the 5G networks are gonna be even more relevant in those areas. So, it's a long-winded way of saying, most definitely, both of those things, computer vision and 5G, are gonna play a huge role in taking this technology forward.

LJ: Awesome. Thank you, Raffi. Looks like Alexander says thank you as well. We have another question from Michael Shepherd. Which software is being used for LiDAR processing?

NE: So, I can answer that one. Our software from top to bottom is actually custom and proprietary. There is off-the-shelf software that can be used, but because of the deep integration that we need to do, because everything is running real time on board and we've got a very limited CPU budget, we are just using a simple Intel quad-core, we really have to optimize, we have to have deep control over all the software. So for that reason, the full stack, including the processing of the LiDAR data stream, is actually proprietary.

LJ: Thank you. Sorry, advanced by accident. One second. All right. Okay, so next up in the queue from Ryan Pastor, this is also back to Nader. What minimum spatial requirements are needed to fly the system in a building? Is it accurate enough to fly through a standard doorway that is open? Great question, Ryan.

NE: Yeah, so, what we showed you with the Easy Aerial platform is really optimized for a particular kind of mission. So, we match the hardware to the mission, and in this case, it's mostly focused on underground mining, or mining in general, as well as infrastructure inspection, where we have a little more leeway in terms of the size of the platform. However, as I mentioned, our software has been implemented on a variety of different platforms, and we have put it onto a drone small enough to fly through doors. There is a trade-off, though, in managing that, because it has to carry the compute and sensing payload. So there is a payload that it has to carry, which reduces the mission life. So, the smaller the platform, the smaller the energy and payload it can carry, and the shorter the mission life. So, we've done it, and we are watching the continuing developments in the drone space, in the battery space and in the sensor space, so we can shrink things down even further as we look for applications in smaller environments. So the answer is yes, but we haven't commercialized it fully yet. And the other platform I mentioned will be able to fit through a double door. So, we have seen applications there.
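
To give a feel for the size-versus-endurance trade-off Nader describes, here's a back-of-the-envelope hover-time estimate using ideal momentum theory. The numbers and the efficiency factor are purely illustrative assumptions, not specifications of any Exyn or Easy Aerial platform.

```python
def hover_time_minutes(mass_kg: float,
                       battery_wh: float,
                       rotor_disk_area_m2: float,
                       air_density: float = 1.225,
                       efficiency: float = 0.6) -> float:
    """Very rough hover endurance from ideal momentum theory:
    P_ideal = sqrt((m*g)^3 / (2*rho*A)), divided by an overall efficiency.
    Illustrative only; real endurance depends on many other factors."""
    g = 9.81
    p_ideal = ((mass_kg * g) ** 3 / (2 * air_density * rotor_disk_area_m2)) ** 0.5
    p_actual = p_ideal / efficiency
    return 60.0 * battery_wh / p_actual

# Hypothetical numbers: adding a 1 kg sensing/compute payload to a 4 kg airframe
# with the same battery shortens the achievable mission time noticeably.
print(round(hover_time_minutes(4.0, 300, 0.5), 1), "min without payload")
print(round(hover_time_minutes(5.0, 300, 0.5), 1), "min with payload")
```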

LJ: Great. Let us know if that answers your question, Ryan. Jumping to the next one in the queue, this one's probably gonna go over to Jason, from Alex Danes. What is the FAA stance on fully autonomous UAS?

JF: I'm not sure what the specific stance is on fully autonomous UAS. The BVLOS waiver is a thing that a lot of people are going after now. As Nader was talking about, the state of AI is getting closer and closer to that full autonomy, and I think that the FAA is going to need to spend time on developing how they wanna use these systems and how they wanna regulate them. So, right now a lot of the waiver applications that we're working with and doing are sort of one step lower than the full autonomy stack. So it's not a full answer, but that's what I have.

NE: If I might chime in actually...

JF: Yeah, go ahead.

NE: So, yeah, exactly. So the first thing to really appreciate is the role of the FAA. Their mandate is to make sure the skies are safe. So, in that sense, everything has to be very thoughtful and purposeful in terms of the developments, and everything that we're seeing right now is incrementally increasing the scope of UAS and its integration into the airspace. But also, as they build further confidence, they can actually push out the envelope. So, my perspective is a rather naive one, I'm certainly no regulatory expert, but what I can say is basically, I imagine AI will be a core and fundamental part of the future of aviation anyway. Right now, most of the piloting that's done on commercial airlines is actually through AI and intelligence onboard the plane.

NE: But we haven't removed the pilot, and I see the same thing happening over time. We offload more and more of the load of piloting the vehicle to the aerial unit itself. And at some point, I imagine there will be a switchover. In order for that to happen, I imagine there has to be a new level of not only intelligence onboard the robot to sense and avoid other aircraft, but also integration, because not everyone's gonna be flying the same software, so they have to recognize each other and communicate with each other so they can actually safely navigate through a heterogeneous airspace. So, in that sense, I think there's still more work to be done. It's got to be done in partnership with the FAA and other regulators, so we can continue to build confidence to the point where we can actually let these robots fly without anyone in the loop and beyond visual line of sight.

LJ: Thanks for adding that extra info, Nader. Looks like we've got another question from Pula, excuse me if I pronounce your name incorrectly, who wants to know where they can find online training for these autonomous systems. I guess this goes to Raffi or Nader. [chuckle]

RJ: Yeah, I could take that. So we don't put anything online, to be honest with you. Having said that, we're more than capable, and we've done so before, where we do virtual training. So, we've actually trained people down in Africa virtually. We've done it in Latin America and various places. So, the training is actually quite simple. We can have people up and running generally within a day or so, and that's with people who have zero experience. Again, the way it generally works is, if somebody purchases products from us, we also offer onsite training where we can hand-carry the product and do white-glove service and actually go through it step by step. But if somebody would like it done virtually, as a refresher, we're capable of doing that as well.

LJ: Awesome. Thanks Raffi for jumping on that one. There's a question from Dan Ceroni, which I'm assuming is about making sure that the Q&A is also being recorded. I think that is the case. Kelsey, can you confirm? If I don't hear from you, I might take that as agreement. [laughter]

NE: I think Kelsey actually responded that it is...

LJ: Oh, she did?

NE: Yeah. There will be a recording.

LJ: Great. Perfect.

NE: I can see it's still recording because someone asked whether it includes the Q&A and the answer is yes, it's still being recorded.

LJ: Awesome. Perfect. Perfect. Thank you. Thank you. Next question is, this is a great one, how soon can we expect to see UAVs carrying human passengers?

NE: I think that's for you, Jason.

JF: So, that's a pretty cool space. You're starting to see videos online of companies that are doing that. Those are bigger platforms than the Easy Aerial platforms. We generally stay in Group 1, which is quite a bit lighter than those heavy-lift platforms. But the progress that looks to be being made with the air taxis and things like that, and the announcements from different companies, are increasing. I wouldn't be surprised if we have at least initial operations within the next year or two, and then maybe expanded operations within that same period of time. It looks like those companies and that side of the industry are really moving quickly.

LJ: Yeah. Awesome. Thanks Jason for covering that.

NE: It does beg the question, does the U in UAV apply when you're carrying a human?

[laughter]

LJ: Right. That may change the definition a bit. Time will tell what they end up calling it. [laughter] Awesome. Okay. I've got one question I'd like to ask myself, actually. So, to the team here, would you say that you have seen a reluctance from larger corporations in terms of their willingness to adopt this type of technology?

RJ: That's a great question and I'm happy to take that one on, given the fact that I was part of a large corporation that was apprehensive about taking on this technology. [chuckle] The short answer is yes. However, that was also five years ago, so we are seeing a change in the landscape. However, it is still new to people, so a lot of the people who've been in place or who have been in power for a number of years are not really up to date on this technology. So, they have to rely on this new generation of people who are coming in, who are able to be decision makers and say, we need to make these changes for the betterment of our people, for the betterment of the company and the long-term future of said corporation. And I think this is something that will keep moving in the right direction as the technology itself not only emerges but matures, and as these corporations also mature in a different way. As we get some younger blood in there who understand exactly how this emerging technology is able to have a day-to-day effect on operations, I think we'll see a lot more adoption.

LJ: Yeah, that makes a lot of sense. Nader, do you have anything to add here?

NE: Yeah, I've been around long enough to kind of see net new technologies come out where the case for them is a bit of a no-brainer, but their adoption initially is quite slow, especially, since the question is specifically around large corporations, because they're not ones to take big bets. They want to be the first followers, and your definition of first is obviously different depending on which industry you're in, which region you might be in. But generally, I think what we found, for example, with cloud computing, I came from the computer industry before and cloud computing was coming up, but you had a lot of people, as Raffi said, the incumbents, the senior leadership who were leading large IT departments with mainframes in vast, air-conditioned, football-field-sized computer rooms, who could not conceive of a future where there was no computer room within the company.

NE: But now, fast forward to today, everyone's on AWS and Azure. So, we are seeing that, and in the initial stages, yes, it needed someone like Raffi and these more nimble, forward-looking organizations to be the first to adopt it and then showcase it to everyone else. So, it became a lighthouse customer, a reference place. Historically for us, realistically every customer has required us to demonstrate, but we're getting to the point now where customers are kind of saying, "Well, if it's good enough for that massive mining company, it's good enough for me, and I'll buy sight unseen." So, we're at that tipping point. We've already started seeing that happening. And that's basically when you get general and wider adoption. And that's specifically within the mining sector, and then that'll have spillover effects in other industries and other applications as well.

LJ: Yeah, that is a great response. Sorry, I'm watching some other questions come in. We're just about out of time. I'm gonna take one more and then unfortunately I have to wrap. So, Alex Danse, your question came in first, so I'm gonna take it. All things taken into consideration, how does the cost of an operation done with an Exyn system compare to traditional methods, mapping an underground space, for example, is what they provided. I guess this goes to Nader.

NE: It really depends, again, on the specifics of the application, the use case, the industry. So, I'll talk to something that we obviously know a lot about, which is stope mapping. What I can tell you is basically the incumbent technology right now is a LiDAR that goes at the end of a very long stick that gets inserted into these cavities, these very large cavernous cavities. The setup is difficult and needs two people, and once they set it up, it takes anywhere between half an hour to an hour to collect. And then you've got to tear everything down, pack it up in the truck, take it up to the surface, and then analyze the data on the surface. So all of that, all in, is about four hours. And to give a number to it, you get 50,000 data points.

NE: What we do is, the robot, one person can carry it, put it in front of the stope, point it in the direction, say go. The mission typically takes anywhere between two to three minutes, it lands back, and you put the robot in the truck, but you've got instant access to the data. So, that transforms not just the four hours into 10 minutes. The critical impact is the fact that you are making decisions within the shift on what the next shift looks like. As Raffi mentioned, traditionally you collect the data, you go up to the surface at the end of your shift, and the next shift you're analyzing the data to inform what's happening in the third shift. So with autonomous robotics like this, where you have instant access to the data, you can actually make a decision and save eight hours, an entire shift. So the ROI on that is fairly significant in terms of productivity as well as in terms of safety. So, I know we're up on time.
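
For readers who want the arithmetic spelled out, here's a quick comparison using the figures Nader gives; the script itself is purely illustrative.

```python
# Traditional stope survey vs. autonomous flight, per Nader's description.
traditional_minutes = 4 * 60   # setup, 30-60 min scan, teardown, surface processing
autonomous_minutes = 10        # carry in, point, fly 2-3 min, instant data access

minutes_saved = traditional_minutes - autonomous_minutes
print(f"Time saved per stope: {minutes_saved} minutes "
      f"({minutes_saved / traditional_minutes:.0%} reduction)")

# The bigger win: instant data lets the current shift plan the next one,
# removing the roughly one-shift lag in decision-making described above.
shift_hours_saved = 8
print(f"Decision latency removed: about {shift_hours_saved} hours per cycle")
```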

LJ: Yeah, Nader, that's really helpful. Thank you. So this wraps today's Q&A, and also today's session. And so to close, I teased this earlier, so we're back on it again. If you're interested in hearing more about the modern day uses of autonomous robots, please check out our next webinar with CUAV on June 14th at 1:00 PM Eastern Time, titled Drones as Tools for Inspection: How, Why and ROI. We'll hear our COO, Ben Williams, and one of our other company partners discuss how organizations can rethink how routine inspections are done using autonomous drones. Scan the QR code right here to register for free. So, thank you all again. I will pass back to Kelsey to close us out.

Kelsey: Excellent. Thank you Lynn. I just have a few housekeeping notes to go over before we officially close out today. If you could please fill out the short evaluation form that will appear once you close the webinar. This will help us better serve your needs in future webinars. And as a reminder, I've had many people asking me, the on-demand materials will be emailed to everyone within two business days, and these materials will include the link to the recording of today's presentation as well as a PDF of the slides. And I apologize for not confirming this earlier, my mute button was not cooperating with me. But yes, all registrants will automatically be emailed the on-demand materials within two business days. And thank you again to all of our speakers for a great presentation today. And thank you to all of our attendees for taking the time out of your busy days to join us. Have a great day, everyone.
