3D Scanning of a Cottage with a Phone

118,884 views

matlabbe

7 years ago

For more info: github.com/introlab/rtabmap/w...
Sketchfab model: skfb.ly/6p7YJ (Updated Nov 2019: skfb.ly/6OyUy)
Erratum: the phone is "Phab2Pro", not "Phan2Pro"

Comments: 127
@tokyowarfare6729 6 years ago
5 laser scanner salesmen did not like the video :D
@GyLala 5 years ago
Does it support the Samsung Galaxy S9+? Sadly I can't install it :c Oh well, Google Tango...
@mute3189 4 years ago
A reassuring example; close to the quality I got of my basement floor with the iPhone X + Meshroom. You had a more complex area than me, with more input data. Very interesting, thank you for sharing. Any added tips you (or anyone reading) may have found to improve room-scan quality (with or without a better camera) are appreciated. Incredibly exciting stuff!
@yawnyawning 6 years ago
It really looks incredible... can't imagine Lenovo got this kind of technology...
@subramanyam2699 6 years ago
This is insane!!!
@gulhankaya5088 2 years ago
Hello, how do I use the map I created? I am using the application in an autonomous vehicle. How will it navigate the map I created when I put the vehicle at the starting point?
@moseschuka7572 2 years ago
Is it possible to implement object detection, recognition, pose estimation and tagging while doing this? If so, how?
@chemistry2023 2 years ago
A great full 3D map
@aussieraver7182 3 years ago
Maaaaad! Now add VR with PBR + HDRP + post-processing in Unity and you've got yourself realism in the digital.
@Nicoda1st 7 years ago
Thanks for sharing!
@davidmartin1628 5 years ago
It would be great to see some interpolation between adjacent vertices where there is missing data, to fill in the gaps and make a full 3D rendering without faces of objects missing when viewed from one side.
@matlabbe 5 years ago
In the first part of the video, this is the raw mesh that can be created from the data coming from the depth sensor. Holes are created because: 1) we didn't scan the area (e.g., we didn't walk around an object to see behind it), 2) the depth sensor cannot "see" the surface (black or metallic surfaces), or 3) there is a window or a mirror. However, at the end of the video, in the final optimized mesh, we can actually see interpolation in some areas filling most small holes, even windows. For example, look at how windows are "closed" and textured using images from outdoors. Cheers
@davidmartin1628 5 years ago
@matlabbe Thanks for the reply. I feel like I didn't give enough credit where it was due, as the project is fairly impressive. I fully understand that the 3D rendering is a result of the camera directly mapping the image onto the mesh created by the depth sensor. My idea was to form a gradient of the textures in regions of large objects that haven't been imaged, based on adjacent regions that have been scanned, like the underside of tables. For example, the mesh has been captured for the pool table upstairs and the top and side surfaces have been imaged, and the mesh for the underside has been interpolated. However, the underside remains untextured/transparent since it was never directly observed with the camera, which is to be expected. I think it would make a nice added bonus if the interpolated mesh surfaces were shaded or textured with textures from surrounding regions to make the model look more 'complete' without having actually scanned them. This way, when you look at the 3D rendering from any angle, objects will not have sides missing. I know the interpolation of textures would produce a lot of incorrectly shaded sides, but I think it would look neat.
@sergioandresbarragallardo2447 6 years ago
Dear friend, I downloaded the multi-session example (chalet 1, 2, 3, 4) and can run the files on my computer; they all work great. But when I generate my own database with my phone, the process fails. I made 3 models (1.db, 2.db, 3.db) but it is impossible for me to merge the scenes. Is there some special method for scanning with the phone application between scenes? Thanks!
@mathieulabbe4889 6 years ago
Make sure to start/stop from "anchor locations" (this is explained in the scanning tips section of the page linked in the description). If lighting has changed between the sessions, this could also influence the ability to merge the sessions.
@Insectula 7 years ago
Couldn't they, for indoor architectural scanning, create something that calculates "hey, that's a flat wall" and makes a flat plane out of it? "Oh look, that's an organic object, so I'll skip that"... "that's a curved surface, so I'll make my best attempt at a radius". I mean something that gives a basic structural layout, which you can fill with photogrammetry-scanned or modeled objects. If it recognizes a basic primitive, it replaces it with one to reduce the polycount and clean things up? Sort of like converting a bitmap to a vector in 2D apps?
@matlabbe 7 years ago
Indeed, detecting walls/floors as planes can save a lot of polygons and make sure the walls are actually straight. It is not an easy task for SLAM systems, as some information may be missing or there may be errors in the map (caused by drift) creating a double walls/floors effect. With a good scan it could be possible, though (see Matterport's dollhouse view for example; not sure if they detect planes, but they seem to greatly reduce the polygons they need to show). We will probably see mobile apps that can do that in the not-so-distant future.
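The plane detection discussed above is commonly done with a RANSAC-style loop: repeatedly fit a plane to three random points and keep the plane supported by the most inliers. This is a toy pure-Python sketch of that idea (not RTAB-Map's code; the function name, tolerances, and synthetic "wall" data are illustrative):

```python
# Toy RANSAC plane detection: find the dominant plane (e.g., a wall) in a
# point cloud. Illustrative only -- real systems use optimized libraries.
import random

def cross(u, v):
    return (u[1]*v[2] - u[2]*v[1], u[2]*v[0] - u[0]*v[2], u[0]*v[1] - u[1]*v[0])

def sub(a, b):
    return (a[0]-b[0], a[1]-b[1], a[2]-b[2])

def plane_inliers(points, tol=0.02, iters=200, seed=0):
    """Return the largest set of points within `tol` of a plane fit to 3 samples."""
    rng = random.Random(seed)
    best = []
    for _ in range(iters):
        p1, p2, p3 = rng.sample(points, 3)
        n = cross(sub(p2, p1), sub(p3, p1))        # plane normal from 3 points
        norm = (n[0]**2 + n[1]**2 + n[2]**2) ** 0.5
        if norm < 1e-9:
            continue                               # degenerate (collinear) sample
        n = (n[0]/norm, n[1]/norm, n[2]/norm)
        inliers = [p for p in points
                   if abs(sum(ni*(pi - qi) for ni, pi, qi in zip(n, p, p1))) < tol]
        if len(inliers) > len(best):
            best = inliers
    return best

# Synthetic scene: a flat wall at x=0 plus a few off-plane "object" points.
wall = [(0.0, y/10, z/10) for y in range(10) for z in range(10)]
clutter = [(0.5, 0.2, 0.3), (1.0, 0.1, 0.9), (0.7, 0.8, 0.4)]
plane = plane_inliers(wall + clutter)
assert len(plane) == len(wall)   # the wall is recovered, clutter is rejected
```

Once the inliers of a wall are isolated like this, they can be replaced by a single quad, which is the polygon-count reduction the answer describes.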
@attreyu65 7 years ago
Is it possible to use an iPad/iPhone with a Structure sensor attached? I mean, all these SLAM solutions are getting better and better at detecting loop closures and avoiding spikes or other defects in the geometry, but the texturing is still very, very low quality. I understand that Tango uses video clips, but even then those clips are split into their respective frames, and surely when you film in 1080p at 30 or 60 FPS, the extracted frames must be good enough with the quality of today's cameras. Still, I don't know of even one SLAM solution, except maybe itSeez3D with Structure, with results comparable to photogrammetry; and even itSeez3D or Structure+Skanect (via uplink) don't let us browse around the environment, they only let us scan objects, bodies or rooms, and the higher the volume scanned, the lower the resolution, so they are ultimately useless. I also understand that these options are being chosen by you, as developers, to increase the FPS, but surely the processing can be offloaded to a workstation via WiFi, and when you have a couple of Titans or 1080 Tis waiting to chunk through the data you throw at them, it shouldn't be an issue. Hell, I would be more than happy to skip the realtime processing if you can give me perfect-quality texturing based on 2-5 MB/frame JPEGs or PNGs. I don't know why everyone is thinking about scanning dolls and their girlfriend, when these very cheap RGB-D cameras could in theory do the job a Faro or Leica scanner does for 50,000€. Sorry for the rant :) I just know that you guys already have the algorithms you need, and the processing power is something we, as users, have - so why isn't it done yet? :)
@matlabbe 7 years ago
For the first question, yes, we could get similar results with a Structure sensor, though I am more concerned that the motion tracking accuracy would not be as good as Tango's (for a large space like this). About processing power, the main goal of the RTAB-Map app is to do processing onboard and to limit export time, so point clouds are downsampled and textures are downscaled. If you want to export in high texture resolution and high point cloud density, I suggest saving the database and opening it in RTAB-Map Desktop to use more processing power so that export is done faster (Section 7 of github.com/introlab/rtabmap/wiki/Multi-Session-Mapping-with-RTAB-Map-Tango ). Note that full-resolution images are always saved in the database, even if the online rendering texture quality and point cloud density are low.
@attreyu65 7 years ago
It would be nice to also use the Structure with RTAB. In Desktop you have all major sensors except this one. Any plans to compile the sources to use Structure as well? Why I'm insisting on Structure: it's relatively well known and it's completely mobile, so you can really explore the surroundings, something impossible with an R200 or a Kinect, which need to be attached to the desktop/laptop via cable. Also, if you've used Structure with Skanect, you know it can uplink the data to a computer via WiFi, which is perfect. Regarding the texture size, in the options we have a maximum of 8192x8192 - is this the size of the whole texture atlas, or per capture? Have you used BundleFusion?
@matlabbe 6 years ago
Yes, I agree. I don't have a Structure sensor, which is the main reason it is not supported in RTAB-Map. It is the size of the texture atlas. However, there is an option to output more than one texture atlas. For example, I just uploaded this video kzfaq.info/get/bejne/fr12Z92VvdCqpWg.html showing the model after being exported with 15 4096x4096 textures (camera images are not scaled). See the description for a link to compare with the model exported with default RTAB-Map settings (only one texture atlas). I use the Point Cloud Library.
@kylegreenberg8690 7 years ago
How do these sessions work, and how do you combine sessions?
@mathieulabbe4889 7 years ago
Images are matched between sessions. When there is a match, a constraint is added between the maps. In the video, when we see a previous map reappearing, it is because such a constraint has been found; the graphs of both sessions are then linked into a single global graph. See the paper referred to on this page for more details: github.com/introlab/rtabmap/wiki/Multi-session
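The mechanism described in that answer (two separate session graphs becoming one global graph when an inter-session constraint is added) can be sketched with a toy pose graph. The class and node names here are illustrative, not RTAB-Map's actual API:

```python
# Toy pose graph: nodes are keyframes, edges are constraints (odometry or
# loop closures). Two sessions are separate components until an image match
# between them adds a linking constraint.

class PoseGraph:
    def __init__(self):
        self.edges = {}  # node -> set of neighbouring nodes

    def add_edge(self, a, b):
        self.edges.setdefault(a, set()).add(b)
        self.edges.setdefault(b, set()).add(a)

    def connected(self, start, goal):
        # breadth-first search over constraints
        seen, queue = {start}, [start]
        while queue:
            n = queue.pop(0)
            if n == goal:
                return True
            for m in self.edges.get(n, ()):
                if m not in seen:
                    seen.add(m)
                    queue.append(m)
        return False

g = PoseGraph()
# Session 1 odometry chain
g.add_edge("s1_0", "s1_1"); g.add_edge("s1_1", "s1_2")
# Session 2 odometry chain
g.add_edge("s2_0", "s2_1"); g.add_edge("s2_1", "s2_2")
assert not g.connected("s1_0", "s2_2")   # still two separate maps
# An image match between sessions adds a loop-closure constraint...
g.add_edge("s1_2", "s2_0")
assert g.connected("s1_0", "s2_2")       # ...now one single global graph
```

In a real SLAM system the linked graph is then optimized so poses from both sessions end up in one consistent frame; this sketch only shows the topological merge.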
@kylegreenberg8690 7 years ago
Mathieu Labbé, thank you for sharing!
@Snyft 3 years ago
I think this is something I need. I want to make a 1:1 VR recreation of my apartment, but I don't know any of this. Is this tool a good way to start?
@matlabbe 3 years ago
If you have an iPhone/iPad with LiDAR, give it a try with the iOS version; it is free. There are also other apps on iOS that would give similar results. This is currently the easiest way to scan without any other specialized hardware.
@keyserswift5077 4 years ago
When I tried this it ended up looking like a big ball. lol
@etherialwell6959 3 years ago
Make sure you don't live in a completely spherical home! Results may differ based on what your apartment looks like ;P
@animowany111 6 years ago
Hi, does this project need a depth camera? Do you use any sensors like the gyroscope and accelerometer? Thanks
@matlabbe 6 years ago
A depth camera is required. For motion estimation, it is the approach developed by Google Tango, which is a fusion of gyroscope/accelerometer and a fish-eye camera (i.e., visual inertial odometry).
@animowany111 6 years ago
Too bad, I don't own a depth camera. I've experimented with ORB-SLAM2, but that gives extremely large scale drift using video from my phone (and even using one of the proper datasets, namely freiburg2_large_with_loop). I haven't managed to compile LSD-SLAM yet, as I am a beginner with ROS and it uses an extremely non-standard build system.
@minhajsixbyte 3 years ago
Can this be done with any phone camera, or is special hardware required? Is it possible with an iPhone XR or a Huawei Y7 Pro 2018?
@songqiaocui2950 3 years ago
Unfortunately no, it only works on Google Tango smartphones, of which there are only two, and the Lenovo Phab 2 Pro is one of them. To make it worse, Google killed the Tango project 4 years ago and switched to ARCore. Tango needs specific hardware like an IMU and ToF sensors, which normal phones are not equipped with.
@minhajsixbyte 3 years ago
@@songqiaocui2950 :(
@Aristocle 6 years ago
The Android app doesn't work on my Samsung S7. Why?
@matlabbe 6 years ago
The app doesn't work on ARCore, only on Google Tango compatible phones. It is because a depth camera is required.
@youbutstronger1453 7 years ago
How do I export the finished scan to my PC?
@matlabbe 7 years ago
If you did Export and saved the mesh/cloud on the device, you can find the zip file in the SD card's RTAB-Map/Export folder. You can use the Astro file manager to browse files on the SD card. From there you can send the file by email or copy it to your Google Drive.
@youbutstronger1453 7 years ago
matlabbe Thank you 😁😘
@alightimages7401 3 years ago
Would this work with any 360 cameras?
@matlabbe 3 years ago
no...
@curtiswilson8402 4 years ago
Is the "render" photo-grade clarity? ☺
@matlabbe 4 years ago
There is still room for improvements, but we try to improve the texture quality over time. For example, compare the old and new sketchfab models linked in the description.
@michaeld954 5 years ago
Will this do the outside easily?
@matlabbe 5 years ago
It would work outdoors as long as there is no direct sunlight on the house (it may work on a cloudy day). However, it would not be easy to scan more than the first floor. For outdoors, you may use structure from motion / photogrammetry with a flying drone. That will be a lot faster and easier.
@belikepanda. 2 years ago
I wish there was a tutorial on how to do this
@matlabbe 2 years ago
See github.com/introlab/rtabmap/wiki/Multi-Session-Mapping-with-RTAB-Map-Tango. If you want to try with your own data, a Google Tango phone is required, or an iPhone/iPad with LiDAR (using the RTAB-Map iOS app).
@Tom-fb6nz 5 years ago
Would this work on the S10?
@matlabbe 5 years ago
Unfortunately no. This technology is currently only available with Lenovo Phab2Pro and Asus Zenfone AR, which have a rear long-range depth sensor (a time-of-flight technology similar to LiDARs).
@mrbulp 3 years ago
@matlabbe Have you tried with a Nokia? The latest ones have depth-sensing tech in them.
@Jasonreninsh 7 years ago
Do you also use an IMU to assist?
@matlabbe 7 years ago
Yes, Tango's odometry is a Visual Inertial Odometry (VIO) approach.
@Jasonreninsh 7 years ago
Okay, I see. I tried with a Kinect and the camera position would get lost very often. Now I get the point. Thanks.
@ziweiliao4044 4 years ago
It would be really crazy if no IMU were used, with those kinds of shakes and that moving speed... I hope tech that achieves the same result without an IMU will come out in the next 5 years ( T.T )
@harshilsakadasariya7684 7 years ago
Hey, I want to make this type of application but without using Tango-enabled devices. Is that possible?
@matlabbe 7 years ago
RTAB-Map is available without Tango. However, in this kind of application (hand-held scanning with RGB-D sensor), the Visual Inertial Odometry approach of Tango helps a lot to get more accurate maps.
@harshilsakadasariya7684 7 years ago
So is it possible to do 3D scanning with an iPhone or any simple Android phone such as a Moto G5?
@matlabbe 7 years ago
No, a depth sensor is required. You may find photogrammetry-based apps that can do 3D scanning using only the phone's camera. Some of these apps are very good for small objects or scanning a single person, but scanning large indoor environments like in this video would not be as easy, fast or reliable.
@harshilsakadasariya7684 7 years ago
Thank you very much, sir. It is really helpful to me.
@jozatheman 4 years ago
I can't scan my room with the app I have :/
@sergiesaenz6735 6 years ago
Can this 3D-scanned map be used in a first-person game?
@matlabbe 6 years ago
Yes, this is a 3D model like everything else.
@simeonnedkov894 5 years ago
You need to remodel basically everything
@supersaiyajin1599 5 years ago
@@simeonnedkov894 why?
@pixelflex7297 7 years ago
How long did it take to map the building?
@matlabbe 7 years ago
~30 minutes of scanning
@pixelflex7297 7 years ago
Thanks for the update.
@venomman 7 years ago
How good is the image in 3D? I'm thinking of the new Asus Tango phone with the 23 MP camera, but I really want the resolution to be very high to create these worlds for VR.
@matlabbe 7 years ago
Tango API uses the camera in video mode, so you won't have 23 MP RGB images. For example on Phab2Pro, we can get 1080p max. Note that for an area as large as in this video, the textures should be downscaled a lot to be viewable on most devices.
@venomman 7 years ago
matlabbe Well, that's somewhat good news, as the Zenfone has a 4K camera...?
@myperspective5091 7 years ago
Cool. What was the actual time lapse?
@matlabbe 7 years ago
1 Hz frame rate and ~1860 frames, so about ~31 minutes of scanning.
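The duration quoted above follows directly from the capture numbers; a quick sanity check:

```python
# Scan duration from the figures in the answer: ~1860 keyframes at 1 Hz.
frames = 1860      # approximate keyframe count
rate_hz = 1.0      # capture rate
minutes = frames / rate_hz / 60.0
print(round(minutes))  # → 31
```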
@myperspective5091 7 years ago
1. Were you involved in the development of this software? 2. Did you test what the maximum range was? 3. Did you test what the minimum range was (inside a shoe box or a closet, the area between furniture and the wall)? 4. Can you port it to any graphics environments? If so, have you heard of anyone who has filled in the blanks to complete the image?
@matlabbe 7 years ago
1. Yes. 2. ~7 meters max in good conditions; the biggest issues are black and reflective materials. For example, if you pause at 2:32 (frame 1494), I've enabled on the left the visualization of the depth over RGB, so we can see that no depth data can be captured on the black couch (invisible couch!). 3. Min 30 cm, maybe 50 to avoid Tango drifting. 4. On the Sketchfab link in the description, this is a common OBJ model that has been exported. No for the last question :P Cheers
@myperspective5091 7 years ago
Thanks.👍 I always wanted to combine something like this with a robot's navigation system, using object recognition software to tag the location of objects in the environment. That way it can make a list to build its own task tree. It could even form its own expectations about where to find the people that use the environment, because it could identify different types of rooms by their contents. It could do it all from a lookup checklist.
@JetJockey87 7 years ago
You could try using a Simultaneous Localization And Mapping (SLAM) algorithm for an autonomous vehicle to navigate its environment. This is how robotic vacuum cleaners find their way back to their charging stations.
@GospodinJean 2 years ago
Which software is that?
@matlabbe 2 years ago
RTAB-Map for Google Tango, but now available on iOS (with LiDAR required)
@sergioandresbarragallardo2447 7 years ago
Export in pcv or xyz format?
@matlabbe 7 years ago
Currently, only PLY (binary) and OBJ (ASCII) export formats are available, though they should be readable in most software (e.g., MeshLab) for conversion to other formats.
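Since the question asked about XYZ output: an ASCII OBJ export already stores the point positions on its `v` lines, so extracting a plain XYZ point list is a few lines of code. A minimal sketch (the function name and sample data are illustrative, not part of RTAB-Map):

```python
# Minimal OBJ-to-XYZ conversion: keep only the vertex-position ("v") records
# of an ASCII OBJ file, emitting one "x y z" line per vertex.

def obj_to_xyz(obj_text):
    xyz_lines = []
    for line in obj_text.splitlines():
        parts = line.split()
        if parts and parts[0] == "v":          # vertex position record
            xyz_lines.append(" ".join(parts[1:4]))
    return "\n".join(xyz_lines)

sample = "v 0.0 1.0 2.0\nv 3.0 4.0 5.0\nf 1 2 1\n"
print(obj_to_xyz(sample))   # two lines: "0.0 1.0 2.0" and "3.0 4.0 5.0"
```

For real conversions (including normals, colors, or binary PLY), a tool like MeshLab is the safer route, as the answer suggests.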
@sergioandresbarragallardo2447 7 years ago
Thanks for the answer. The program is great; it would be wonderful to connect its potential with programs like ArchiCAD or Revit.
@moahammad1mohammad 4 years ago
How are the scans so goo- Oh, he's using a handheld scanner
@valeriavidanekalman3885 3 years ago
Freeman, Google made a phone that uses 3D sensors to capture reality; it's called Tango. It's pretty poor now, so they made Google AR(Core), which is better for 3D scanning.
@widgity 4 years ago
I assume this relies on Tango or ARCore? Such a shame Google killed Tango; I wish I had bought a device while they were available.
@mathieulabbe4889 4 years ago
Yeah, it works only on Tango phones (not ARCore). I use the Asus Zenfone AR, which can still be bought, I think (here, Best Buy in Canada still sells it at $800 CAD).
@widgity 4 years ago
@mathieulabbe4889 Huh, I thought ARCore was meant to be a direct replacement for Tango. I guess not. I played with the Zenfone and nearly bought one for about £200, but they are out of stock anywhere I can find them locally now. If I bought one now, I'd be worried it would come with ARCore instead of Tango.
@matlabbe 4 years ago
@widgity My Zenfone AR has both ARCore and Google Tango working
@oxpack 3 years ago
That's not a cottage; maybe a vacation home. Cottages are way smaller and have an old pair of skis on the wall.
@matlabbe 3 years ago
It seems there is a debate around cabin, cottage or chalet depending on where you live in Canada (www.narcity.com/life/canadians-cant-agree-on-whether-its-called-a-cottage-cabin-or-chalet). I am used to "chalet de ski" in French, but it seems the most common translation is "cottage". The rental site (www.chaletsalpins.ca/en/cottages-for-rent/) also calls them "cottages". They are all new luxurious constructions at the Stoneham ski resort just a little north of Québec City, but I agree with you: a "chalet" or "cottage" is generally a lot smaller and more rustic.
@kingarchnyc 3 years ago
What software/app is this?
@matlabbe 3 years ago
RTAB-Map: play.google.com/store/apps/details?id=com.introlab.rtabmap It is free and Open Source.
@kingarchnyc 3 years ago
@matlabbe Thank you 🙏🏿 Too bad I use an iPhone; this does not seem to have an iOS version... 🤦
@matlabbe 3 years ago
@@kingarchnyc An iOS version will be released soon, but it will work only on iPhones/iPads with LiDAR sensor.
@GirizdL 3 years ago
@matlabbe Dear Mathieu, I'm looking for a new phone, and the biggest dilemma for me is between the Asus ZenFone AR and the iPhone 12 Pro Max. I found some material about the object-scanning function of the LiDAR and it was a little disappointing, and CrossPoint's house scanning didn't convince me that the iPhone is worth it. What do you think about using the old Tango tablets for this purpose? Are they as good as the Zenfone?
@matlabbe 3 years ago
@GirizdL I prefer Tango phones with ToF (Zenfone AR or Phab2 Pro) over the original Tango dev kits. The advantage of the iPhone is that it will be supported and get new apps over time, while nobody develops for Tango anymore.
@Hermiel 4 years ago
Can this be made to work on the 2020 iPad?
@mathieulabbe4889 4 years ago
Yes, theoretically; it has the hardware. The problem is the software. I haven't had confirmation yet whether ARKit lets us get the raw point cloud or depth image from the LiDAR (if someone knows, tell me! I am waiting for that before buying one). From what I see in the latest Xcode 11.4, we can get a mesh but not the point cloud/depth image (registered with the color camera). Maybe they will unlock that possibility in the future.
@Hermiel 4 years ago
@@mathieulabbe4889 Can you use the facial scanner on the iPhone? www.fabbaloo.com/blog/2020/3/31/apples-new-lidar-ipad-disappoints-or-does-it
@mathieulabbe4889 4 years ago
@Hermiel Maybe, but it is not very user friendly if you cannot see the screen while you scan... Note that I don't see a problem with the actual resolution of the LiDAR; it seems similar to the Zenfone AR or Huawei P30 Pro.
@GirizdL 3 years ago
@mathieulabbe4889 Will you publish your scanning software on the AppGallery for the Huawei P50 Pro or the P40 Pro?
@matlabbe 3 years ago
@GirizdL I am currently having some depth distortion issues on Huawei AREngine with my latest build; I'll see what I can do.
@dior1992 4 years ago
What kind of accuracy did you manage to have on the measurements?
@matlabbe 4 years ago
For a single room, it is easy to get only 1 to 3 cm of error. For large areas, it can be worse if loop closures are not detected (VIO drift not corrected).
@shuixing85 3 years ago
Can't wait to use this program on the iPhone 12 Pro 😀😀😀
@matlabbe 3 years ago
2021 ;)
@bobpro583 4 years ago
Does it work on the iPhone 11?
@matlabbe 4 years ago
No, currently working only on Tango-enabled phones (or selected android phones with back TOF camera)
@matteo_petruz1435 3 years ago
@matlabbe Where can I download the app?
@matlabbe 3 years ago
@@matteo_petruz1435 play.google.com/store/apps/details?id=com.introlab.rtabmap or latest version: github.com/introlab/rtabmap/wiki/Installation#rtab-map-tango-apk
@jonixmotogp1423 5 years ago
holy shit