
Depth Perception | Paul Powers from Physna

Image search technology has become an important tool for a wide range of industries and applications, including research, law enforcement, e-commerce, historical archiving, and more.

And yet, as sophisticated and useful as this technology has become, it has one severe limitation: it's flat. It can search only two-dimensional data. Yet we live in a three-dimensional world.

This is a problem Paul Powers understood personally as a patent attorney. "We couldn't find the 3D models that people were using to steal intellectual property," he says. "We could find anything else: plagiarism, 2D pictures, whatever. But as soon as it was three-dimensional, forget it."

When he discovered the emergence of 3D search technology, he became excited. And when he discovered how underdeveloped it was, he started his own company, Physna.

On this edition of UpTech Report, Paul tells us how his efforts to solve this one particular problem turned into a solution for seemingly endless use cases, and he also talks about the relationship to his other venture, Thangs.com, an open 3D modeling community.

More information: https://physna.com/


Paul Powers is the Co-Founder and CEO of Physna, the industry leader in geometric deep learning technology. An experienced technology founder, Powers founded Physna in 2015. In 2019, Powers was recognized by Forbes Magazine as one of the brightest young entrepreneurs in the United States in the Forbes 30 Under 30. Powers has presented on the TED Talk stage and at several tech startup conferences like StartUp Grind, ConX19, and Startup of the Year at SXSW. Powers is a regular contributor on national news networks including TD Ameritrade and Cheddar. 

Physna recently announced a $20 million Series B funding round led by Sequoia Capital. Physna’s technology bridges the gap between the physical world and digital code by codifying the 3D world through a deeper understanding of the physical properties of real-world objects and the relationships between them. Physna is building the 3D future of software, and has already improved engineering, industrial design and procurement by putting new, powerful search capabilities in the hands of innovators and creators – one 3D model at a time.

In addition to Physna's enterprise solution, the company powers a free search engine for consumers called Thangs.com. Thangs is an easy-to-use search tool for physical object and 3D model search, allowing users to search with models directly rather than relying on text. Thangs provides the world's first global geometric search and smart automated collaboration tool. Since launching in August of 2020, hundreds of thousands of people have used Thangs to improve their workflow in everything from product design to 3D printing.

DISCLAIMER: Below is an AI generated transcript. There could be a few typos but it should be at least 90% accurate. Watch video or listen to the podcast for the full experience!

Paul: We give people who work on hardware, physical things, physical devices, the ability to work with it like you could work with any other digital asset

Alexander: Welcome to UpTech report. UpTech report is sponsored by TeraLeap, learn how to leverage the power of video at TeraLeap.io. Today I’m very excited to be joined by my guest, Paul Powers, who’s based in Ohio, founder and CEO of Physna. Good to have you on Paul.

Paul: Thanks for having me.

Alexander: Now, you've got two things going on right now: you've got Physna, and you've also got Thangs, both interrelated. I'm excited to dig into both. But Physna is a geometric deep learning and 3D search company that searches, compares, and analyzes 3D models, and your focus with that organization is helping enterprise organizations that have a ton of 3D models and need to be able to understand what they have and how to access it correctly. Did I get that correct?

Paul: That's right, yes.

Alexander: And now you've also started Thangs, which basically anyone can jump on and start to utilize. It's, you said, the fastest-growing 3D community and the first 3D search engine. Help me understand, though, both Physna and Thangs, THANGS.COM. For both of these, why did you set them up? Like, what was the purpose? What did you see as the problem with 3D assets and models? And why do we need to search for them?

Paul: Sure. So actually, Thangs is part of Physna. We own Thangs.com, and we released it as the same technology, just a lighter version that everyone can use. So why did we set this up? The initial idea came from the fact that I needed to find models. I'll skip through a whole bunch of my own background, because it's a long, complicated one. But long story short, I was working as a patent attorney, and we couldn't find the 3D models that people were using to steal IP. We could find anything else: plagiarism, 2D pictures, whatever. But as soon as it was three-dimensional, forget it. Then I found these things on the market called geometric search, and I got all excited. So I moved back to America (I was living abroad at the time) and thought, okay, I'm going to start this company up. And I found out that all those geometric search tools that I was hearing about and reading about did not work. Not well enough, anyway.

Alexander: So you had the problem, you needed to search, and you found some stuff that just was not working the way you wanted it to, or the way you thought it should. So you figured, I'm gonna create my own.

Paul: Yeah, I wanted to create a company, but I wanted to base it on a partnership with some technology that already existed, some other company. And I tried really hard to do that. But as I tried to figure out how to make this technology work, I found out that, at the end of the day, the way they were viewing data was flat, essentially. So here's the problem we have in technology. Think about the evolution of technology. To communicate with you, I used to have to send someone on a pony with a letter. Then we invented the telegraph, and with Morse code I could communicate with you instantly. That's what you could call one-dimensional code. And then eventually we had binary code, and now everything that we use is possible because of that extra dimension: it's two-dimensional code. Well, the world is not flat. The world is three-dimensional, and the software that existed for geometric search was really trying to look at everything in a flat way, essentially. That's the best way I can describe it. And we realized, no, there's got to be a different approach, rooted in mathematics and machine learning. So we took a totally different approach. It took a long time and a lot of money, but eventually it worked. And that's what's behind Thangs, too. Exactly.

Alexander: Now, just timewise, to give some scope here: 2016, you started on this journey? What did that look like? And what are some data points for where you are today?

Paul: Sure. So we started in 2016, and it took a year or two to even get the technology to really work. I went out to a couple of trade shows and told people, hey, I've got this awesome thing, and it finally works now, and it helps solve this problem of your intellectual property being stolen, so you can, you know, prevent patent lawsuits, or prevent the damage of a patent infringement, essentially. And I found out that nobody cared. They really didn't care about that use case. They were like, I really just love the technology, what you do is super cool, but I don't really care about this whole, you know, patent thing. I mean, we've never really had that problem. But the tech is cool. And so I realized, okay, wrong audience. They've never dealt with this problem before, but they're interested in the tech, so I'll just, you know, geek out with them a bit. And I came back feeling a little bit deflated from that, thinking, okay, no one really cares about the use case that I thought was gonna be this huge thing, at least not that audience. But then I found out later that we did have a really good product. It wasn't a solution in search of a problem. It was actually a solution to many problems. Within a month of that, I had roughly 20 different companies call me back, and we had over 30 different use cases for our technology, and they were all over the map. I mean, it would take me this whole call to explain what they were, but everything from quality control to engineering productivity to predicting things in healthcare, like, you know, cancer at an earlier stage. You name it, it's all over the place: inspection automation, things in AR and VR, augmented reality, virtual reality. So there were a lot of use cases, and we thought, okay, we can't do all that. So it took a while to figure out what to focus on, because everyone said, yeah, you're an idiot if you don't focus on this or that.
And I said, well, eventually I'm going to annoy somebody in the community no matter what I do, because whether I focus on this one or that one, I'm going to...

Alexander: ...get into something, and then some people call you an idiot. But who cares? Who cares, right?

Paul: Right. So eventually we did just kind of focus. It might sound like a broad use case, but it's actually fairly narrow. For us, it's people who are creating physical products. So whether it's a plane, a smartphone, a car, it doesn't matter; it all works the same. Essentially, we give people who work on hardware, physical things, physical devices, the ability to work with it like you could work with any other digital asset. So physical items and their designs, we treat them like any other digital asset. What that means is you can search for geometry with geometry, just like you could search for text with text, and see how everything's related. It doesn't have to be a duplicate; it can be this little bit of something that's found in something else, right? It doesn't matter what kind of file format it's in, et cetera. So if you're an engineer, that means you'll design faster, because you're reusing existing parts more. If you're a purchaser or a supplier, it means you'll be able to find the part you're looking for, and alternatives to that part, so you don't order unnecessary parts, and you're also getting the maximum value out of what you have in stock. If you're in warehousing, inventory management, or maintenance, you can look up what something in front of you is. You can describe it, you can measure it, you can take a picture of it or scan it, however you want to search for it, and we'll tell you what it is, and alternatives to it. If there are multiple part numbers, we'll tell you all the part numbers, and we'll tell you how it can be used. We'll actually show you: look, this is how you can use it on this model. If you use that part, it'll look like this. It'll fit differently, but it will fit, and it will fill that gap, or whatever. And it's automated. There's no metadata there.
So everything I said sounds, to engineers anyway, to the people we work with in that field, because we're dealing with physical things, kind of like magic. And I know why, right? But that's actually stuff we've been doing in software for a long time. We just haven't been able to do it with physical things or their designs, because they're not flat, right? The world's not flat. So that's what we fix.
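Physna's actual algorithms are proprietary, so the following is not their method. It is only a minimal sketch of the general idea of "searching for geometry with geometry," using a classic, much simpler technique: the D2 shape distribution, which summarizes a shape as a histogram of distances between random sample points and compares shapes by comparing histograms. All names here are illustrative.

```python
import numpy as np

def d2_descriptor(points, n_pairs=10_000, bins=32, rng=None):
    """Histogram of distances between random point pairs of a shape.
    Invariant to rotation and translation; scale is normalized away
    by dividing by the mean pairwise distance."""
    rng = np.random.default_rng(rng)
    i = rng.integers(0, len(points), size=n_pairs)
    j = rng.integers(0, len(points), size=n_pairs)
    d = np.linalg.norm(points[i] - points[j], axis=1)
    d = d / d.mean()                                   # scale invariance
    hist, _ = np.histogram(d, bins=bins, range=(0, 4), density=True)
    return hist / hist.sum()                           # probability vector

def similarity(desc_a, desc_b):
    """1 - half the L1 distance between histograms (1.0 = identical)."""
    return 1.0 - 0.5 * np.abs(desc_a - desc_b).sum()
```

With descriptors like these, "search" is just ranking a library by similarity to the query's descriptor; the same shape in a different pose scores high, while a different shape scores low.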

Alexander: Just to dig a little bit further into the technology, and then we'll take a step back to the use cases and the business model and how you guys are helping other organizations: the ingest process. Obviously, as far as how Physna works, folks are applying it to their own library of assets, and then they can share access to that library, making it searchable. It's not one giant database that everyone suddenly shares; that's more Thangs, which we'll get to. But help me understand, what's the ingest? What does that process look like to get started? How does it work?

Paul: It's really easy. It's just like any ingest process would be, right? You can either upload something in bulk, or we have APIs that connect to whatever system you're already using. So it's...

Alexander: If you already have files in a library somewhere, with all your 3D assets, you import them, and then it'll understand what those three-dimensional sizes and shapes are, et cetera, so you can search based on all that.

Paul: It's a lot easier than it sounds. You know, if you have a system already, it'll integrate with it if you'd like. If you want to replace it, you don't need to; you can do a bulk ingestion. Our software will identify what kinds of files it can work with. So you can just point it at something and say, take all the files that you care about in there. Even if you have no organization and save everything next to each other, it's fine. It'll know what to look for, and it'll take all that and organize it.
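The "point it at a folder" ingestion Paul describes could be sketched, in a much-simplified form, as a recursive scan that picks out files in recognizable 3D formats. The extension list below is an assumption for illustration, not Physna's actual format support.

```python
from pathlib import Path

# Hypothetical set of 3D model formats an indexer might recognize.
MODEL_EXTENSIONS = {".stl", ".step", ".stp", ".obj", ".iges", ".igs", ".3mf", ".ply"}

def find_models(root):
    """Recursively collect candidate 3D model files, regardless of how
    (or whether) the directory tree is organized."""
    return sorted(
        p for p in Path(root).rglob("*")
        if p.is_file() and p.suffix.lower() in MODEL_EXTENSIONS
    )
```

A real pipeline would then parse each file and compute geometric descriptors for indexing; this sketch only covers the discovery step.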

Alexander: No manual metadata tagging or anything needed?

Paul: You don't need to do anything manually. So if you already have a system in place, you'll have metadata with the models, right? So you don't need to do anything then. If you don't have it, well, obviously we don't know what the metadata is if it isn't saved with the model. But typically, in those cases, the metadata is saved somewhere else. You'll have it in another system, or you might have it in an Excel sheet or something. And what we found is that the lowest common denominator is actually an Excel sheet; everything can seem to export to an Excel sheet. So we created this really simple little tool in the software, which is really easy to use: you just upload an Excel sheet, and it figures out what data goes where. So it'll say, oh, this part has all this stuff associated with it. Okay, this is how much it costs, how it was manufactured, how much material it uses, how it performed, whatever data you have, and it'll connect that to that model. And then the machine learning will use that and say, okay, this other model that has no tags on it looks very similar to these models, so maybe these attributes are true for it too. We use machine learning, essentially, and eventually it'll figure out, okay, this model is going to suck, because it has all the same attributes of all these other models that suck. Or this model is going to be amazing, because it looks just like these models that all have the same common denominator, which makes them amazing, right? So it can make those predictions. Obviously "amazing" and "suck" aren't really the most common terms you'd use, but, you know, I've seen people do it. Cost, material, performance, how to manufacture it, what settings to use: if it's connected to the geometry in some way, you can predict that, which is really cool.
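The inference Paul describes, "this untagged model looks like these tagged models, so maybe their attributes apply," is in spirit a nearest-neighbor prediction. Here is a toy sketch of that idea, not Physna's machine learning; the descriptors and similarity function are stand-ins for whatever a real system would compute from geometry.

```python
from collections import Counter

def predict_attribute(query_desc, tagged, attr, similarity, k=3):
    """Guess an attribute for an untagged model by majority vote among
    the k geometrically most similar tagged models.

    tagged: list of (descriptor, metadata_dict) pairs.
    similarity: callable(a, b) -> float, higher means more alike.
    """
    ranked = sorted(tagged, key=lambda item: similarity(query_desc, item[0]),
                    reverse=True)
    votes = Counter(meta[attr] for _, meta in ranked[:k] if attr in meta)
    return votes.most_common(1)[0][0] if votes else None
```

For numeric attributes (cost, weight), the same scheme would average the neighbors' values instead of voting.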

Alexander: And then use cases. You mentioned to me earlier that the Department of Defense is actually using this platform.

Paul: We have one of the rare distinctions of being a technology where we have users who are, like, ten years old, getting used to a 3D printer for the first time, or playing with animation, or doing something in 3D, and then you have the Department of Defense using the exact same thing. So, you know, everywhere in between. Like, what's the addressable market? It's kind of wide. But that doesn't mean that we, you know, proactively sell to that entire market; it just means that there are a lot of people who can use it. And obviously, the ten-year-old would be someone on Thangs. But you also have people from the Department of Defense using Thangs. We actually have the largest 3D model repository in the Department of Defense. The reason I mention them is because that's public, right? That was made public by them. But most of our customers are, you know, anywhere from small to very, very large. Actually, a surprisingly large percentage of our customers are very large, you know, OEMs: automotive car brands, major suppliers in automotive, or aerospace, or medical devices. It's actually pretty broad, but we typically focus on, you know, the higher-value-per-design use case when we sell, because, obviously, the ROI you'll see is very fast, and so it's really easy to, you know, sign them up when they're inbound, which most of our leads are. They're everything from consumer packaged goods to apparel to, you know, you name it. I mean, it's really, really broad. That's just because the physical world is broad, right? Yeah, it makes sense.

Alexander: And now, for Physna, your target market is enterprise: large organizations that just have some massive amount of 3D assets they need to organize. But Thangs is where you kind of opened it up and simplified a light version for anyone to be able to use. Let's dig into that a bit more. Like, why did you set that up? And why would someone want to go on there and use it?

Paul: We want to actually democratize some of the software. A lot of people think that we have some kind of weird evil plan. In, like, Reddit communities, some will be like, okay, well, what are they doing here? They're a VC-backed company. You know, Sequoia is behind them. What are they trying to do? Are they trying to steal my data? Why are they doing this for free? So it's actually a very altruistic reason, to be honest with you. We're letting people use Thangs for free for a couple of reasons. The business reason for it is twofold. One, we want people to get used to the idea of using this type of technology, so that, you know, maybe they're a student now, maybe they're a hobbyist, maybe they're a small company, but when they become a bigger company and need even more powerful tools, then, you know, they know the value of it. So then they're more likely to go over to Physna's enterprise product. It becomes like a lead generation tool of sorts, right? The other reason we do it is because of where the 3D market is right now. When you hear "3D" and "technology" together, you probably think of a very narrow niche. You probably think of AR and VR, or you might think of 3D printing or something like that. In reality, 3D is the majority of our economy, of the world. Very, very, very few things are actually flat. It's just our interpretation of them in tech that's flat, really. So the value in 3D is significant, but the exposure to 3D right now is niche. It won't be for very long, though, because you have augmented reality glasses coming out, you have Apple glasses coming out, right? I'll just put it this way: think about what the world looks like in 20 years. What does technology look like?
And I've asked a lot of people this. You know, everybody comes back with a slightly different answer, but they all have one thing in common, and that is that no one thinks we're still going to be on a flat rectangle pushing buttons 20 years from now. Everybody assumes, because it would be very valuable, that technology will be more immersive; instead of input for output, it's output based on your situational circumstance, so it just interacts with the world around you. Well, for technology to do that, it has to have spatial awareness. For you to live in an immersive environment, it has to understand space and 3D, not just data that is two-dimensional, and it can't do that right now very well. So Physna bridges that gap. And that's where I see us ultimately playing our big role in the future.

Alexander: I like the vision that you painted of this interaction and where it will be. For the more immediate future, what are some of the use cases? Is it effectively for people who just want to search for 3D assets, to be able to upload and track their own assets and search through them?

Paul: So it's a couple of different groups. Thangs is like a Google plus a GitHub for 3D, right? So let's say you're a 3D printing hobbyist, okay, we'll stick with a simple use case. You're a hobbyist trying to find a model for 3D printing, or whatever your use case is. Thangs crawls the web, so Thangs will find not just what's on Thangs, but what's on, you know, GrabCAD or Thingiverse, or McMaster-Carr if you're looking for a supplier, or whatever. Now, when we find it from a third-party site, we're not taking the data and downloading it into Thangs. We just find it, with geometry or text, whatever you're searching with, and then we help get you to that model directly. So you'll actually go to that third-party page, or if it's on Thangs, you'll stay on Thangs. So first, partially, it's discovery. The next thing is collaboration. So, believe it or not, only about 3% of people who use 3D design tools like CAD actually have 3D document management, like PLM or PDM. So about 3% use it. But the market for PDM is actually bigger than the CAD market, which is crazy. The reason only 3% use it is, one, because it's traditionally very expensive, and two, because it's kind of hard to use. We're not trying to compete with PLM or PDM. What we're trying to do is give people a different experience that works for everybody; we want everyone to be able to collaborate. Traditionally, it's been very hard to do things like revision control and version control in 3D, because you don't have tools like GitHub. The CTO of GitHub is actually on our board of directors and has worked with us very closely. And because we understand how models relate, we're about to release something now which will automate the version and revision control process.
So when you upload a new model, when you're collaborating with a friend or a colleague, that whole process is automated. You don't have to worry about, oh, wait a minute, I've got this version, you've got that version; oh, hold on, wait till you're done, and then I'll work on this. It's just automated, right? And you don't have to worry about, oh crap, did I just redesign something? Or, is this like that one thing we designed that one time? What was that called? It knows. It says, hey, this is like that one, and it makes it a new version, not a new model. And it shows you the differences, shows you how things have changed, even if you're working in different CAD systems. So your friend can use something different than you're using; it doesn't care. It just makes it more accessible to people, because very few things can really be designed 100% by one person. And that's also one of the reasons why not as many people are adopting 3D design technology as should be.
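The automated revision detection Paul hints at can be caricatured as a similarity-threshold decision: if an upload is geometrically close enough to an existing model, treat it as a new version of that model rather than a new model. The threshold, names, and descriptor format below are hypothetical, not how Physna actually does it.

```python
def classify_upload(new_desc, repo, similarity, version_threshold=0.95):
    """Decide whether an upload is a revision of an existing model
    or a brand-new model, based on geometric similarity alone.

    repo: dict mapping model_id -> descriptor.
    similarity: callable(a, b) -> float, higher means more alike.
    Returns ("revision", model_id) or ("new", None).
    """
    if repo:
        best_id = max(repo, key=lambda mid: similarity(new_desc, repo[mid]))
        if similarity(new_desc, repo[best_id]) >= version_threshold:
            return ("revision", best_id)
    return ("new", None)
```

A real system would also diff the matched pair to show what changed between versions; this sketch only covers the matching decision.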

Alexander: Gotcha. Interesting choice of words, Thangs. A couple of times you've actually referenced, you know, trying to find things. Is that how you just came across it? Like, let's just find a place to put things? Where did that come from?

Paul: Honestly, this is the total honest truth. We were trying to figure out what to call this. Physna didn't really have a name for the software, because Physna is the name of the company, and we only had the one product, which was Physna. So for this new product, we were like, what are we going to call it? And internally, for the longest time, we called it, like, Physna Social, the social version of Physna, or whatever. And that sounds weird; we need a real name for this thing, it's not going to be successful with such a weird name. So I put together a list of ideas, and as a joke, not meant seriously at all, just to bring a little bit of levity to the board meeting where we were going to go over the ideas, I wrote down "Thangs." And everybody loved it. They were like, oh my God, that's the perfect name, we should absolutely use Thangs. And I thought they were pulling my leg. I said, I don't really know about that. It's a little bit too... it's not very serious. But I looked up the URL, and it was for sale. And we kind of looked through the rest; we did a big pro-and-con evaluation on the different name ideas and other alternatives, and realized, you know what, actually, Thangs works. It makes sense. That's the only reason. It started as a joke, and then we were like, it works, though. It makes sense, and it's easy to remember and spell. So let's just roll with it. And we rolled with it. By now I can say it with a straight face. It took me a while to get there.

Alexander: For a little while, you were like, "Thangs?"

Paul: Yeah, exactly. It took me a while.

Alexander: Now, how long has Thangs been online and in use?

Paul: Six months-ish? So we started in, like, late August of last year, so six, seven months.

Alexander: Now do you see competition? What does that look like? And how are you planning on differentiating and moving forward?

Paul: I mean, there are other sites where you can find models, right? So if you're just looking to go to a site where you can download a 3D model, then yeah, there's competition. But to prove that we don't think of it that way: we will send you to that site. So when you go to Thangs and you search for something, a lot of people say, well, you've got to differentiate from Thingiverse or GrabCAD. It's like, no, we don't. We're not a Thingiverse or a GrabCAD. In fact, if you search for something, and Thingiverse or GrabCAD has a better result than we do, we'll say, go there. Here's the result; if you click on it, it will take you to that website. So we're not trying to be them. What we're trying to do is make it easier to work and collaborate in 3D. That's really it. And we want people to get used to what we've developed in terms of software. We've created a standard, so that when they become professionals, or when their company becomes larger, or just when it makes sense to do so, they say, hey, maybe I should get the enterprise version of this. And we're also trying to expand the market; we want more people to get into 3D. So we're not trying to become a Thingiverse or GrabCAD. There's a totally different angle in mind. At first look it might seem relevant, might look similar, but it's not.

Alexander: You have the backing of some good venture capital, so you have the resources to be able to think a little bit bigger and longer term. But there's the balancing of immediate needs with the longer vision. How do you balance those two things?

Paul: That's probably the hardest thing in the company, right? So I told you we got to 30 use cases after, like, that first month. After a year or so after that first, you know, interaction with the outside world, I guess you could say, we got to over 300 use cases. You can't build 300 different products, right? So the question becomes, how do you focus but not pigeonhole yourself? That's the balance. I hate the word "balance," but if there's ever a right time to use it, it's here. You have to find a way to make sure that you're focused enough in the short term that you can prove there's value, there's traction, there's momentum, there's product-market fit. But at the same time, you don't want to go so all-in on one very small sector that it becomes impossible, or very hard, to pull yourself back out and focus on other things, because the way that you built yourself up to go there is not reusable. And so what we decided to do is, okay, we're going to focus on this one area. It might sound like it's three, because I mentioned three different use cases, but they're all typically the same company. It's the same software; it's just people who click the same button for different reasons, right? So we're focusing on that use case and that user. But when it comes to the longer-term vision, you know, the way that we built this, it's built so that basically any service can be put behind an API. So if you want to use it for something completely different than what we have in mind, you can. And that's actually a growing thing with a lot of our customers now. In fact, the last two deals we did with Fortune 500 companies, which were both in the past two weeks (so we're moving; there's a lot of momentum here), were both API deals, you know, not regular SaaS deals.
Now, they want to use the SaaS too, but really just as a starting point, because they want to build their own thing, their own use case, out of this. And so they use the APIs to add it to their own software.

Alexander: Now, the API. A lot of companies I talk to, APIs are just growing like crazy, because it's the integration of all the best tools out there into what you're trying to build. Help me understand, okay, let's look at your root technology. Is it the machine learning items that you built, to be able to look at an image and pull out, okay, what are the 3D dimensions, et cetera? Like, what's your secret sauce that you can share? What truly is Physna, and what do you have?

Paul: So the secret sauce of Physna is a series of algorithms, supplemented by machine learning, that take any three-dimensional data in a very proprietary, very finely tuned way. That could be something from the physical world that you've scanned or imaged, or something that's natively digital and 3D. We normalize that data; in other words, we make it all the same as far as the computer is concerned. And then we relate it, and not just "this thing looks like that thing." I can probably explain it best with a demo we actually did, which was a lot of fun. Take a vase and shatter it into a million pieces, then take those pieces, however many there are, however they're scattered on the table. Physna will actually rebuild it. So let's say I take two vases: one I shatter into a million parts, the other I just leave on the shelf, and I have both of those as digital assets. I can take the intact vase and say, "What goes in here?" and it'll take every single shard of glass from the other, identical vase and say, "This is how that goes back together; fit these shards together like this and you'll get that vase." Or I can take one of those shards and say, "What are you? What do you do?" And it'll say, "Hey, I belong to this vase, and this is where I go, right here." And there's no parent-child relationship, no metadata. It's purely geometry, purely mathematics. But because we can do that, our machine learning is incredibly powerful. We use the latest and greatest of everything, especially in machine learning, but we don't have to have a secret sauce in how we build a neural network or whatever.
The way we build machine learning is best in class, but it's standard. We are working on some things that are pretty unique, but in general it's standard. What's different is the way we take that data and structure it to let us make predictions; we actually calculate it. If you were just to take raw 3D data and try to make predictions about it, you'd have only a subset of the predictions you could make with us. You could predict some basic things, like what something is, maybe. But even then, we get to the same confidence level about what something is around 10,000 times faster than if you used raw 3D data without the process we put everything through.
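Paul's point about "normalizing" 3D data so that geometrically identical models match, with no metadata and no parent-child relationships, can be loosely illustrated in code. This is a minimal sketch of one classic idea, a signature built from sorted pairwise distances that is invariant to rotation, translation, and uniform scale; it is not Physna's actual (proprietary) algorithm, and the point sets here are invented:

```python
import math
from itertools import combinations

def signature(points):
    """Pairwise-distance signature of a point set. Sorting makes it order-
    independent; dividing by the largest distance makes it scale-independent;
    distances themselves ignore rotation and translation."""
    dists = sorted(math.dist(a, b) for a, b in combinations(points, 2))
    longest = dists[-1]
    return tuple(d / longest for d in dists)

def same_shape(a, b, tol=1e-6):
    """True if two point sets have the same geometry up to rotation,
    translation, and uniform scale (within floating-point tolerance)."""
    sa, sb = signature(a), signature(b)
    return len(sa) == len(sb) and all(abs(x - y) <= tol for x, y in zip(sa, sb))

# An asymmetric tetrahedron...
model = [(0, 0, 0), (1, 0, 0), (0, 2, 0), (0, 0, 3)]

# ...and the same shape rotated 30 degrees about z, doubled in size,
# and translated: still recognized as the same geometry.
c, s = math.cos(math.pi / 6), math.sin(math.pi / 6)
copy = [(2 * (c * x - s * y) + 5, 2 * (s * x + c * y) - 1, 2 * z + 4)
        for x, y, z in model]

print(same_shape(model, copy))                      # True
print(same_shape(model, model[:3] + [(0, 0, 1)]))   # False: different geometry
```

Real systems work on meshes or point clouds with millions of points and need far more robust descriptors, but the principle is the same: reduce every model to a representation where identical geometry compares equal regardless of how it was captured.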

Alexander: The data that you're able to, then...

Paul: So it's far faster and much more accurate, and you can also make many more types of predictions than you ever could otherwise, especially compared to using 2D pictures, which is how almost everything in computer vision still works today. It's roughly four orders of magnitude faster versus other 3D approaches, and five orders of magnitude versus 2D. One of the things a lot of our customers are using right now is a tool we just built that lets you take a picture of something and know what it is. When I mention that, people say, "You didn't invent that, we've had that forever." Computer vision part identification isn't really new, and they're right, it's not. But try us against anything else; it's crazy how much more accurate it gets. If you take a picture of something, or even a group of things, and search it against Physna, then assuming your environment is trained correctly and you have the right data in there, it'll tell you that this part is this exact thing, not a category of things. Traditionally a vision API will say, "That's a bolt," and you're like, "Oh, thanks, I had no idea. Thank goodness I had this vision API." But we'll tell you, "That's this exact bolt. By the way, it's also sold under this other part number, and here's where it goes. I know it goes in your lawnmower, that's what the part number is associated with, but it'll also work on your car. So if your car breaks down, take the thing off your lawnmower and fix your car."

Alexander: Honestly, I had a similar situation. I had a screw and had to find another one just like it. You'd have the ability to scan it and tell me exactly which other type of screw would fit. Or you could even scan a spot where something is needed and say which screw or bolt would fit in there.

Paul: Yeah. You wouldn't even have to do a scan for the bolt; you'd just take a picture of it, and we'd say, "This is where it goes, this is what to do with it." And the way it shows how parts fit together is really cool, because it's shown in 3D, but it's very simple. You don't need a background in 3D to use it. In fact, I'd say only maybe one in five Physna users is an engineer. The rest are maintenance personnel, field workers, people working in procurement and supply chain. We save on average 40% of procurement costs, because of how much waste there is in procurement due to duplicate parts. So most users aren't engineers, but if you are an engineer, it'll make you several times more productive than you were before.
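The "exact part, not a category" distinction from the screw example can be sketched as a toy catalog lookup. Everything here is invented for illustration, the part numbers, dimensions, and tolerance alike; Physna's real matching works on full 3D geometry rather than a handful of measurements:

```python
# Hypothetical fastener catalog: part number -> key dimensions in mm.
CATALOG = {
    "M5-0.8x20": {"diameter": 5.0, "pitch": 0.8, "length": 20.0},
    "M6-1.0x20": {"diameter": 6.0, "pitch": 1.0, "length": 20.0},
    "M6-1.0x30": {"diameter": 6.0, "pitch": 1.0, "length": 30.0},
}

def find_exact_part(measured, tol=0.05):
    """Return every catalog part whose dimensions all fall within `tol` mm
    of the measured ones: an exact part match, not just 'a bolt'."""
    return [
        part for part, dims in CATALOG.items()
        if all(abs(dims[k] - measured[k]) <= tol for k in dims)
    ]

print(find_exact_part({"diameter": 6.0, "pitch": 1.0, "length": 30.0}))
# -> ['M6-1.0x30']
```

A generic vision classifier stops at the category ("bolt"); exact geometric matching narrows it to the one interchangeable part, which is what makes cross-referencing (lawnmower part, car part) possible.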

Alexander: Could you imagine applying this yourself, or maybe through the API for someone else who's building this, with AR goggles? A worker is doing something and says, "I need a part for this," and it can immediately bring up and show them, "This is the part you need."

Paul: We have something coming out very soon that will do exactly that. It'll take all 3D data globally and make it available in AR. So you'll be able to take whatever 3D data exists, or that you've put together or scanned, and it'll work in AR and match in that way. One of the things that's cool about having 3D matching technology, since we're talking about AR: traditionally, AR glasses had cameras on the front of them, and that's a big reason they're not very popular. Nobody likes having a creepy camera aimed at everybody all the time, so that's kind of a no-go. But that's a problem, because if you want, say, Apple glasses to be able to place things into your environment, they need to know what the environment looks like. So how do they do that? Well, you might have noticed that Apple released LIDAR on the latest iPhone, and that's going to become standard moving forward. Why did they do that? Maybe it's fun to scan things, but it's really for AR. There's a problem with that, though: LIDAR doesn't pick up color. LIDAR just picks up depth, essentially; it's a depth-sensing thing. So all you're going to see is a vague blob, like echolocation feedback. How does it know what anything is? Your glasses are literally as blind as a bat. They can tell you where to put something, but they can't recognize anything in your environment. Well, because we don't need 2D, we can use 3D; that depth information is plenty. So in a couple of years, when these become popular, that API will become a big deal, because you'll be able to wear these glasses and, without having a camera, they'll be able to tell you, "That's what this is, and this is this," purely because of the geometry.
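Paul's point that "that depth information is plenty" rests on a standard step: a LIDAR or depth sensor returns a depth map, which gets back-projected into a 3D point cloud that geometric matching can then work on. Here is a minimal sketch of that pinhole-camera back-projection; the tiny depth map and camera intrinsics are made up for illustration:

```python
def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth map (row-major grid of distances, 0 = no return)
    into camera-space 3D points using the pinhole model:
    X = (u - cx) * Z / fx,  Y = (v - cy) * Z / fy,  Z = depth."""
    points = []
    for v, row in enumerate(depth):
        for u, z in enumerate(row):
            if z > 0:  # skip pixels where the sensor got no return
                points.append(((u - cx) * z / fx, (v - cy) * z / fy, z))
    return points

# Toy 2x2 depth map from a hypothetical sensor with unit focal lengths
# and the principal point at the grid center.
cloud = depth_to_points([[1.0, 1.0], [0.0, 2.0]], fx=1, fy=1, cx=0.5, cy=0.5)
print(cloud)
# -> [(-0.5, -0.5, 1.0), (0.5, -0.5, 1.0), (1.0, 1.0, 2.0)]
```

No color is involved anywhere: the output is pure geometry, which is exactly the representation a 3D matching system needs and a 2D image classifier cannot use.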

Alexander: Math will rule all. It's amazing. I'm curious, this is an exciting time for you guys. You've been developing for the last four or five years, but it's just in this past year or two that the application has accelerated, with both Physna and Thangs open for use. What can you share about the roadmap? You already shared a little about AR, but where are you headed, and what exciting opportunities would you want others to know about?

Paul: There's a lot. In addition to that geometric GitHub thing I mentioned coming up pretty soon, we're going to have a lot more data and features on Thangs in the near future than we have now. We ship so fast; a lot of people, when they talk about development timelines, are shocked at how fast things actually change. If you go onto Thangs one week and come back the next, or even check daily, you'll see it; we have releases on roughly an every-other-day basis. It's constantly changing, constantly improving, and it's evolving very quickly, largely through a feedback loop driven by a really prominent feedback button on the site. We get tons of people clicking on that. Sometimes they just say, "Hey, I love this thing," but what's really valuable is when they click on it and say, "Hey, this sucks, I can't do this thing, I want to be able to do that." We learn from that and improve the site. But the ultimate goal for what we're doing with Thangs is to index the entire physical world, so that whether you're an enterprise customer or on Thangs, you'll be able to upload something and ask: How do I put this together? Where do I get the parts? What is this thing? Where does it go? What's the next phase in developing it? One thing we'll be able to do, because of the way our machine learning works: you've probably heard of generative design before; anyone who's into 3D, 3D printing, or design probably has.
But the problem with generative design is that it thinks about it from a physics perspective: what's the most economical way to build this thing? In reality, you probably care more about a realistic supplier perspective: where can I source this, how do I put this thing together, how should I put it together, what do I add next? Because of all that data and all the machine learning that goes into it, it'll be genuinely generative. So if you have an idea for something, you won't need a background in CAD or any training to build something, to invent something. And once you've invented the digital version, you can use it in your augmented reality environment, or you can manufacture it; and by the way, here's who can manufacture it for you, here are the parts you'd need, here's who puts it together. We just make it so that everybody can reach their full potential. That's the goal of Thangs. And our ultimate goal for Physna, which includes Thangs, is to gather all that data and all those capabilities so that we can really create what I would call trinary code. We talked a little bit about unary code and binary code; let's get to that trinary level. In the end, five years from now, my goal would be that no matter what you care about, even if you have zero interest in 3D, and a lot of people still think of it as a very niche thing, you'll benefit. We'll power so many applications that virtually everybody will be using them, maybe not Physna directly, maybe not Thangs directly, but they'll be using something derived from our technology through an API. The ultimate goal is to empower that next generation of software and overcome that last dimensional gap, from 2D software to 3D. That's the goal.

Alexander: It's an exciting future that you paint, and you guys are on the journey. How big is the team now?

Paul: A little over 30.

Alexander: Okay, so mighty work is happening with the growth you guys are seeing. Just a question looking even broader than what you're doing: any thoughts on future tech predictions for the near term, the next year or two? You've painted some very futuristic pictures, but what do you see in the more immediate term, not just for Physna, but in the space of 3D and 3D assets?

Paul: I think the future I'm painting is not as far away as it sounds. We already have most of the hardware and software technology needed for that next step. Physna, I think, is key to empowering the software side, and the hardware side has long existed and is now becoming mainstream. By the end of next year, probably the majority of people, certainly anyone who buys a new smartphone, will have LIDAR in it, and you'll probably be using AR apps a lot more than you can imagine right now. Right now they might feel a little gimmicky, but they won't for long. I'm not here to say that tomorrow we're all going to be living in the future, walking around with everything reacting to us. But the beginning of that is not far away. So the picture I'm painting, where people are using this to build applications for new reasons, where you have 3D code being used in a mainstream way, and where AR or something similar becomes part of your everyday life: I'd argue that's close. Not 20 years away, not even five; maybe one or two before you start to see it happen. I think that's the biggest area of development in technology. The reason I say that is because it's hard to see it right now, but if you look at all the different components, they're all there, and it's a perfect storm. It's a lot like right before the iPhone came out.
If someone had said back then, "I guarantee you, in a year or two, everyone's going to have a phone you can trade stocks on, do live video with anybody in the world, all this other fun stuff," you'd have said, "You're crazy. I've got the greatest BlackBerry, and it's far from being able to do that." But it happened. Right before the internet, you didn't think you'd be able to do anything like what we're doing now. All this technology happens, and it's not always gradual; sometimes it happens in leaps. And it's predicting that leap that I think is important, not so much extrapolating the current trajectory, because technology does not progress linearly at all. So I think the leap is coming very soon, within the next year or two.

Alexander: That echoes an interview I did with a fellow on the concept of the spatial web, the next version of the web, and being able to understand where 3D assets are in the physical world as well as the digital one. You paint the same picture: it's coming, it's here, and we'd better be ready for it. I love it. One last question for you, Paul, just as an individual: are there any books, podcasts, or sites you read a lot that you get insight from and would recommend?

Paul: A lot of them. But the one at the top of my head, because it's most current, is a book I actually just got yesterday from Elad Gil, who's very well known in the Valley. Totally to my surprise, he sent me over a hardcopy of his book, which is called the High Growth Handbook. It's fantastic. So if you're a startup entrepreneur, especially if you're already VC-funded and have a little bit of traction... I haven't read the whole thing, since I just got it, but I've read the first chapter, the first 60 or so pages, and it's a really, really good read with a lot of great interviews in it. If you're into that kind of thing, that's what I'd recommend checking out.

Alexander: Love it. Well, thank you so much, Paul, for sharing what you're doing at Physna as well as Thangs. For those who want to learn more, you can go to Thangs.com. And for Physna, what's the web address?

Paul: Physna.com

Alexander: And you can get a demo there. Thanks again for your time, Paul. Good to have you.

Paul: Thank you. Appreciate it.

Alexander: All right, everyone, enjoy the rest of the episodes at Uptechreport.com, and check that out. We'll see you guys next time. Bye-bye.

That concludes the audio version of this episode. To see the original and more, visit our UpTech Report YouTube channel. If you know a tech company we should interview, you can nominate them at Uptechreport.com. Or, if you just prefer to listen, make sure you're subscribed to this series on Apple Podcasts, Spotify, or your favorite podcasting app.
