For more info: github.com/introlab/rtabmap/w... Sketchfab model: skfb.ly/6p7YJ (Updated Nov 2019: skfb.ly/6OyUy) Erratum: the phone is "Phab2Pro", not "Phan2Pro"
Comments: 127
@tokyowarfare6729 (6 years ago)
5 laser scanner salesmen did not like the video :D
@GyLala (5 years ago)
Does it support the Samsung Galaxy S9+? Sadly :c I can't install it. Well, Google Tango...
@mute3189 (4 years ago)
A reassuring example; close to the quality I got from my basement floor with the iPhone X + Meshroom. You had a more complex area than me, with more input data. Very interesting, thank you for sharing. Any added tips you (or anyone reading) may have found to heighten room-scan quality (in conditions without a better camera, and conditions with) are appreciated. Incredibly exciting stuff!
@yawnyawning (6 years ago)
It really looks incredible... can't imagine Lenovo got this kind of technology...
@subramanyam2699 (6 years ago)
This is insane!!!
@gulhankaya5088 (2 years ago)
Hello, how do I use the map I created? I am using the application in an autonomous vehicle. How will it follow the map I created when I place the vehicle at the starting point?
@moseschuka7572 (2 years ago)
Is it possible to implement object detection, recognition, pose estimation and tagging while doing this? If so, how?
@chemistry2023 (2 years ago)
A great full 3D map
@aussieraver7182 (3 years ago)
Maaaaad! Now add VR with PBR + HDRP + Post Processing in Unity and you've got yourself realism in the digital.
@Nicoda1st (7 years ago)
Thanks for sharing!
@davidmartin1628 (5 years ago)
It would be great to see some interpolation between adjacent vertices where there is missing data, to 'fill in' the gaps and make a full 3D rendering without faces of objects missing when viewed from one side.
@matlabbe (5 years ago)
In the first part of the video, this is the raw mesh that can be created from the data coming from the depth sensor. Holes appear because: 1) we didn't scan the area (e.g., we didn't walk around an object to see behind it), 2) the depth sensor cannot "see" the surface (black or metallic surfaces), or 3) there is a window or a mirror. However, at the end of the video, in the final optimized mesh, you can actually see interpolation in some areas filling most small holes, even windows. For example, look at how windows are "closed" and textured with images from outdoors. Cheers
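The small-hole interpolation described above can be illustrated with a toy sketch (hypothetical, not RTAB-Map's actual algorithm): missing depth pixels are filled with the average of their valid neighbours, repeated until no more holes can be filled.

```python
# Toy sketch: fill zero-valued holes in a depth grid by averaging
# valid 4-neighbours, repeating until stable. An illustration of the
# hole-filling idea only, not RTAB-Map's actual code.

def fill_holes(depth, hole=0.0):
    """Return a copy of `depth` with `hole` cells replaced by the
    mean of their valid (non-hole) 4-neighbours."""
    rows, cols = len(depth), len(depth[0])
    grid = [row[:] for row in depth]
    changed = True
    while changed:
        changed = False
        for r in range(rows):
            for c in range(cols):
                if grid[r][c] != hole:
                    continue
                neigh = [grid[nr][nc]
                         for nr, nc in ((r-1, c), (r+1, c), (r, c-1), (r, c+1))
                         if 0 <= nr < rows and 0 <= nc < cols
                         and grid[nr][nc] != hole]
                if neigh:
                    grid[r][c] = sum(neigh) / len(neigh)
                    changed = True
    return grid

depth = [[1.0, 1.0, 1.0],
         [1.0, 0.0, 1.0],   # centre pixel is a hole
         [1.0, 1.0, 1.0]]
filled = fill_holes(depth)
print(filled[1][1])  # 1.0
```

Note that this cannot fill a hole with no valid neighbours at all, which mirrors the real situation: an area never seen by the sensor stays empty.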
@davidmartin1628 (5 years ago)
@@matlabbe Thanks for the reply. I feel like I didn't give enough credit where it was due, as the project is fairly impressive. I fully understand that the 3D rendering is a result of the camera directly mapping the image onto the mesh created by the depth sensor. My idea was to form a gradient of textures in regions of large objects that haven't been imaged, based on adjacent regions that have, like the undersides of tables. For example, the mesh has been captured for the pool table upstairs and the top and side surfaces have been imaged; the mesh for the underside has been interpolated, but the underside remains untextured/transparent as it was never directly observed by the camera, which is to be expected. My idea is that it would make a nice added bonus if the interpolated mesh surfaces were shaded or textured with textures from surrounding regions, to make the model look more 'complete' without having actually scanned them. That way, when you look at the 3D rendering from any angle, the object will not have sides missing. I know the interpolation of textures would produce a lot of incorrectly shaded sides, but I think it would look neat.
@sergioandresbarragallardo2447 (6 years ago)
Dear friend, I downloaded the multi-session example (chalet 1, 2, 3, 4) and can run the files on my computer; they all work great. But when I generate my own databases with my phone, the process fails. I made 3 models (1.db, 2.db, 3.db), but it is impossible for me to merge the scenes. Is there some special method to scan with the application on the phone between scene and scene? Thanks!
@mathieulabbe4889 (6 years ago)
Make sure to start/stop from "anchor locations" (this is explained in the scanning tips section of the page linked in the description). If lighting has changed between the sessions, this could also influence the ability to merge the sessions.
@Insectula (7 years ago)
Couldn't they, for indoor architectural scanning, create something that calculates "hey, that's a flat wall" and makes a flat plane out of it? "Oh look, that's an organic object, so I'll skip that"... "That's a curved surface, so I'll try my best to fit a radius." I mean something that gives a basic structural layout, which you can fill with photogrammetry-scanned or modeled objects? If it recognizes a basic primitive, it replaces it with one to reduce the polycount and clean it up? Sort of like the conversion of a bitmap to a vector in 2D apps?
@matlabbe (7 years ago)
Indeed, detecting walls/floors as planes can save a lot of polygons and make sure the walls are actually straight. It is not an easy task for SLAM systems, as some information may be missing, or there are errors in the map (caused by drift) creating a double-wall/floor effect. With a good scan it could be possible, though (see Matterport's dollhouse view for example; not sure if they detect planes, but they seem to reduce quite a lot the polygons they need to show). We will probably see mobile apps that can do that in the not-so-far future.
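The plane-detection idea can be sketched with a minimal RANSAC loop (an illustration only; the function names and parameters here are made up, and RTAB-Map or Matterport may do this very differently): repeatedly pick 3 points, build the plane through them, and keep the plane with the most inliers.

```python
import random

# Toy RANSAC plane fit for detecting a dominant plane (e.g. a wall)
# in a point cloud. A sketch of the idea, not a production segmenter.

def plane_from_points(p, q, r):
    """Plane (unit normal n, offset d) through three points, or None
    if the points are collinear."""
    u = [q[i] - p[i] for i in range(3)]
    v = [r[i] - p[i] for i in range(3)]
    n = [u[1]*v[2] - u[2]*v[1],
         u[2]*v[0] - u[0]*v[2],
         u[0]*v[1] - u[1]*v[0]]
    norm = sum(c * c for c in n) ** 0.5
    if norm == 0:
        return None
    n = [c / norm for c in n]
    d = -sum(n[i] * p[i] for i in range(3))
    return n, d

def ransac_plane(points, iters=200, tol=0.05, seed=0):
    rng = random.Random(seed)          # fixed seed for reproducibility
    best, best_inliers = None, []
    for _ in range(iters):
        plane = plane_from_points(*rng.sample(points, 3))
        if plane is None:
            continue
        n, d = plane
        inliers = [p for p in points
                   if abs(sum(n[i] * p[i] for i in range(3)) + d) < tol]
        if len(inliers) > len(best_inliers):
            best, best_inliers = plane, inliers
    return best, best_inliers

# A "wall" at z = 0 plus two off-plane outliers.
wall = [(x * 0.1, y * 0.1, 0.0) for x in range(10) for y in range(10)]
cloud = wall + [(0.5, 0.5, 1.0), (0.2, 0.8, 2.0)]
plane, inliers = ransac_plane(cloud)
print(len(inliers))  # 100 wall points; the outliers are rejected
```

Once a wall is detected this way, its hundred mesh vertices can be replaced by a single rectangle, which is the polygon-count saving discussed above.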
@attreyu65 (7 years ago)
Is it possible to use an iPad/iPhone with a Structure sensor attached? I mean, all these SLAM solutions are getting better and better at detecting loop closures and avoiding spikes or other defects in the geometry, but the texturing is still very, very low quality. I understand that Tango uses video clips, but even then, those clips are split into their respective frames, and surely when you film in 1080p at 30 or 60 FPS, the extracted frames must be good enough, given the quality of cameras these days. Still, I don't know of even one SLAM solution, except maybe itSeez3D with Structure, whose results are comparable to photogrammetry; and even itSeez3D or Structure+Skanect (via uplink) don't let us browse around the environment. They only let us scan objects, bodies or rooms, and the larger the volume scanned, the lower the resolution, so they are ultimately useless. I also understand that these options are chosen by you, as developers, to increase the FPS, but surely the processing can be offloaded to a workstation via WiFi, and when you have a couple of Titans or 1080 Tis waiting to chunk through the data you throw at them, it shouldn't be an issue. Hell, I would be more than happy to skip the real-time processing if you can give me perfect-quality texturing based on 2-5 MB/frame JPEGs or PNGs. I don't know why everyone is thinking about scanning dolls and their girlfriend, when these very cheap RGB-D cameras could in theory do the job a Faro or Leica scanner does for 50,000€. Sorry for the rant :) I just know that you guys already have the algorithms you need, and the processing power is something we, as users, have; so why isn't it done yet? :)
@matlabbe (7 years ago)
For the first question, yes, we could get similar results with a Structure sensor, though I am more concerned that the motion tracking accuracy may not be as good as Tango's (for a large space like this). About processing power: the main goal of the RTAB-Map app is to do processing onboard and to limit export time, so point clouds are downsampled and textures are downscaled. If you want to export at high texture resolution and high point cloud density, I suggest saving the database and opening it in RTAB-Map Desktop to use more processing power so that exporting is done faster (Section 7 of github.com/introlab/rtabmap/wiki/Multi-Session-Mapping-with-RTAB-Map-Tango ). Note that full-resolution images are always saved in the database, even if the online rendering texture quality and point cloud density are low.
@attreyu65 (7 years ago)
It would be nice to also use the Structure with RTAB-Map. In Desktop you support all major sensors except this one. Any plans to compile the sources to use Structure as well? Why I'm insisting on Structure: it's relatively well known and it's completely mobile; you can really explore the surroundings, something impossible with an R200 or a Kinect, which need to be attached to the desktop/laptop via cable. Also, if you've used Structure with Skanect, you know it can uplink the data to a computer via WiFi, which is perfect. Regarding the texture size: in the options we have a maximum of 8192x8192. Is this the size of the whole texture atlas, or per capture? Have you used BundleFusion?
@matlabbe (6 years ago)
Yes, agreed. I don't have a Structure sensor, which is the main reason it is not supported in RTAB-Map. It is the size of the texture atlas. However, there is an option to output more than one texture atlas. For example, I just uploaded this video kzfaq.info/get/bejne/fr12Z92VvdCqpWg.html showing the model after being exported with 15 4096x4096 textures (camera images are not scaled). See the description for a link to compare with the model exported with default RTAB-Map settings (only one texture atlas). I use the Point Cloud Library.
@kylegreenberg8690 (7 years ago)
How do these sessions work? And how do you combine sessions?
@mathieulabbe4889 (7 years ago)
Images are matched between sessions. When there is a match, a constraint is added between the maps. In the video, when we see a previous map reappear, it is because such a constraint has been found; the graphs of both sessions are then linked into a single global graph. See the paper referenced on this page for more details: github.com/introlab/rtabmap/wiki/Multi-session
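A toy illustration of the constraint idea (translations only, with no rotation or graph optimization, and all names hypothetical): a single loop-closure match gives the transform that expresses the second session's poses in the first session's coordinate frame.

```python
# Toy sketch of merging two mapping sessions with one loop-closure
# constraint. Poses are 2-D translations only, a big simplification of
# the real pose-graph optimization used by SLAM systems like RTAB-Map.

def merge_sessions(graph1, graph2, closure):
    """closure = (node_in_1, node_in_2, offset): node_in_2 was observed
    at `offset` relative to node_in_1. Returns both graphs merged into
    graph1's coordinate frame."""
    n1, n2, (dx, dy) = closure
    x1, y1 = graph1[n1]
    x2, y2 = graph2[n2]
    # Translation mapping session-2 coordinates into session-1's frame.
    tx, ty = (x1 + dx) - x2, (y1 + dy) - y2
    merged = dict(graph1)
    for node, (x, y) in graph2.items():
        merged[node] = (x + tx, y + ty)
    return merged

session1 = {"a0": (0.0, 0.0), "a1": (1.0, 0.0)}
session2 = {"b0": (0.0, 0.0), "b1": (0.0, 1.0)}
# Loop closure: image at b0 was recognized 2 m ahead (in x) of a1.
merged = merge_sessions(session1, session2, ("a1", "b0", (2.0, 0.0)))
print(merged["b1"])  # (3.0, 1.0)
```

In the real system the constraint feeds a graph optimizer that also redistributes accumulated drift, rather than rigidly shifting the second map as here.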
@kylegreenberg8690 (7 years ago)
Mathieu Labbé, thank you for sharing!
@Snyft (3 years ago)
I think this is something I need. I want to make a 1:1 VR recreation of my apartment but I don't know any of this. Is this tool a good way to start?
@matlabbe (3 years ago)
If you have an iPhone/iPad with LiDAR, give it a try with the iOS version; it is free. There are also other apps on iOS that would give similar results. This is currently the easiest way to scan without any other specialized hardware.
@keyserswift5077 (4 years ago)
When I tried this it ended up looking like a big ball. lol
@etherialwell6959 (3 years ago)
Make sure you don't live in a completely spherical home! Results may differ based on what your apartment looks like ;P
@animowany111 (6 years ago)
Hi, does this project need a depth camera? Do you use any sensors like the gyroscope and accelerometer? Thanks
@matlabbe (6 years ago)
A depth camera is required. For motion estimation, it is the approach developed by Google Tango, which is a fusion of gyroscope/accelerometer and a fish-eye camera (i.e., visual inertial odometry).
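The gyro/accelerometer part of such a fusion can be illustrated with a classic complementary filter for a single tilt angle (a textbook sketch, far simpler than Tango's actual visual inertial odometry): the integrated gyro is smooth but drifts, while the accelerometer's gravity-based angle is noisy but drift-free, so the filter blends the two.

```python
# Complementary filter: one blended orientation estimate per IMU sample.
# Illustrates the sensor-fusion idea behind inertial odometry only.

def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """One filter step: trust the integrated gyro mostly (alpha), and
    slowly pull the estimate toward the accelerometer's angle."""
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle

# Gyro reports no rotation (it has drifted), accel says the true tilt
# is 10 degrees; the estimate is pulled back toward the truth.
angle = 0.0
for _ in range(200):                       # 2 s of updates at 100 Hz
    angle = complementary_filter(angle, gyro_rate=0.0,
                                 accel_angle=10.0, dt=0.01)
print(round(angle, 1))  # 9.8 (converging toward 10.0)
```

Real VIO additionally fuses camera feature tracks, which is what keeps the position (not just orientation) from drifting.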
@animowany111 (6 years ago)
Too bad, I don't own a depth camera. I've experimented with ORB-SLAM2, but that gives extremely large scale drift using video from my phone (and even using one of the proper datasets, namely freiburg2_large_with_loop). I haven't managed to compile LSD-SLAM yet, as I am a beginner with ROS and it uses an extremely non-standard build system.
@minhajsixbyte (3 years ago)
Can this be done with any phone camera, or is special hardware required? Is it possible with an iPhone XR or Huawei Y7 Pro 2018?
@songqiaocui2950 (3 years ago)
Unfortunately no, it only works on Google Tango smartphones, of which there are only two; the Lenovo Phab 2 Pro is one of them. To make it worse, Google killed the Tango project 4 years ago and switched to ARCore. Tango needs specific hardware like an IMU and ToF sensors, which normal phones are not equipped with.
@minhajsixbyte (3 years ago)
@@songqiaocui2950 :(
@Aristocle (6 years ago)
The Android app doesn't work on my Samsung S7. Why?
@matlabbe (6 years ago)
The app doesn't work with ARCore, only on Google Tango compatible phones, because a depth camera is required.
@youbutstronger1453 (7 years ago)
How do I export the finished scan to my PC?
@matlabbe (7 years ago)
If you did Export and saved the mesh/cloud on the device, you can find the zip file in the SD card's RTAB-Map/Export folder. You can use the Astro file manager to browse files on the SD card. From there you can send the file by email or copy it to your Google Drive.
@youbutstronger1453 (7 years ago)
matlabbe Thank you 😁😘
@alightimages7401 (3 years ago)
Would this work with any 360 cameras?
@matlabbe (3 years ago)
No...
@curtiswilson8402 (4 years ago)
Is the "render" photo-grade clarity? ☺
@matlabbe (4 years ago)
There is still room for improvement, but we try to improve the texture quality over time. For example, compare the old and new Sketchfab models linked in the description.
@michaeld954 (5 years ago)
Will this do the outside easily?
@matlabbe (5 years ago)
It would work outdoors as long as there is no direct sunlight on the house (it may work on a cloudy day). However, it would not be easy to scan more than the first floor. For outdoors, you may use structure from motion / photogrammetry with a flying drone; this will be a lot faster and easier.
@belikepanda. (2 years ago)
I wish there was a tutorial on how to do this
@matlabbe (2 years ago)
See github.com/introlab/rtabmap/wiki/Multi-Session-Mapping-with-RTAB-Map-Tango. If you want to try with your own data, a Google Tango phone is required, or an iPhone/iPad with LiDAR (using the RTAB-Map iOS app).
@Tom-fb6nz (5 years ago)
Would this work on the S10?
@matlabbe (5 years ago)
Unfortunately no. This technology is currently only available on the Lenovo Phab2Pro and Asus Zenfone AR, which have a rear long-range depth sensor (a time-of-flight technology similar to LiDAR).
@mrbulp (3 years ago)
@@matlabbe Have you tried with a Nokia? The latest one has depth-sensing tech in it.
@Jasonreninsh (7 years ago)
Do you also use an IMU to assist?
@matlabbe (7 years ago)
Yes, Tango's odometry is a Visual Inertial Odometry (VIO) approach.
@Jasonreninsh (7 years ago)
Okay, I see. I tried with a Kinect and the camera position would get lost very often. Now I get the point. Thanks.
@ziweiliao4044 (4 years ago)
It would be really crazy if no IMU were used... with that kind of shaking and moving speed... I hope tech achieving the same result without an IMU will come out in the next 5 years ( T.T )
@harshilsakadasariya7684 (7 years ago)
Hey, I want to make this type of application but without using Tango-enabled devices. Is it possible to make that?
@matlabbe (7 years ago)
RTAB-Map is available without Tango. However, in this kind of application (hand-held scanning with an RGB-D sensor), Tango's Visual Inertial Odometry approach helps a lot to get more accurate maps.
@harshilsakadasariya7684 (7 years ago)
So is it possible to do 3D scanning with an iPhone or any simple Android phone such as a Moto G5?
@matlabbe (7 years ago)
No, a depth sensor is required. You may find photogrammetry-based apps that can do 3D scanning using only the phone's camera. Some of these apps are very good for small objects or scanning a single person, but for scanning large indoor environments like in this video, it would not be as easy, fast or reliable.
@harshilsakadasariya7684 (7 years ago)
Thank you very much sir, it is really helpful to me.
@Martin-dx5zs (7 years ago)
Harshil Sakadasariya AKkit
@jozatheman (4 years ago)
I can't scan my room with the app I have :/
@sergiesaenz6735 (6 years ago)
Can this 3D-scanned map be used in a first-person game?
@matlabbe (6 years ago)
Yes, it is a 3D model like anything else.
@simeonnedkov894 (5 years ago)
You'd need to remodel basically everything
@supersaiyajin1599 (5 years ago)
@@simeonnedkov894 Why?
@pixelflex7297 (7 years ago)
How long did it take to map the building?
@matlabbe (7 years ago)
~30 minutes of scanning
@pixelflex7297 (7 years ago)
Thanks for the update.
@venomman (7 years ago)
How good is the image in 3D? I'm thinking of the new Asus Tango phone with the 23 MP camera, but I really want the resolution to be very high to create these worlds for VR.
@matlabbe (7 years ago)
The Tango API uses the camera in video mode, so you won't get 23 MP RGB images. For example, on the Phab2Pro we can get 1080p max. Note that for an area as large as in this video, the textures have to be downscaled a lot to be viewable on most devices.
@venomman (7 years ago)
matlabbe Well, that's somewhat good news, as the Zenfone has a 4K camera...?
@myperspective5091 (7 years ago)
Cool. What was the actual time lapse?
@matlabbe (7 years ago)
1 Hz frame rate and ~1860 frames, so about ~31 minutes of scanning.
@myperspective5091 (7 years ago)
1. Were you involved in the development of this software? 2. Did you test what the maximum range is? 3. Did you test what the minimum range is (inside a shoe box or a closet, the area between furniture and the wall)? 4. Can you port it to any graphics environment? If so, have you heard of anyone who has filled in the blanks to complete the image?
@matlabbe (7 years ago)
1. Yes. 2. ~7 meters max in good conditions; the biggest issues are black and reflective materials. For example, if you pause at 2:32 (frame 1494), I've enabled on the left the visualization of depth over RGB, so we can see that no depth data can be captured on the black couch (invisible couch!). 3. Min 30 cm, maybe 50 to avoid Tango drifting. 4. On the Sketchfab link in the description, this is a common OBJ model that has been exported. No for the last question :P Cheers
@myperspective5091 (7 years ago)
Thanks. 👍 I always wanted to combine something like this with a robot's navigation system: using object recognition software to tag the locations of objects in the environment. That way it can make lists to build its own task tree. It could even form its own expectations about where to find the people who use that environment, because it could identify different types of rooms by their contents, just from a lookup checklist.
@JetJockey87 (7 years ago)
You could try using a Simultaneous Localization And Mapping (SLAM) algorithm for an autonomous vehicle to navigate its environment. This is how robotic vacuum cleaners find their way back to their charging stations.
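The navigation idea mentioned here can be illustrated with a toy breadth-first search over an occupancy grid built from such a map (real planners, e.g. in ROS, use costmaps and smarter search; this only shows the principle):

```python
from collections import deque

# Toy path planner on an occupancy grid (0 = free, 1 = obstacle).
# Breadth-first search finds a shortest 4-connected path.

def bfs_path(grid, start, goal):
    rows, cols = len(grid), len(grid[0])
    queue = deque([start])
    came_from = {start: None}
    while queue:
        cur = queue.popleft()
        if cur == goal:
            path = []            # walk back through predecessors
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        r, c = cur
        for nr, nc in ((r-1, c), (r+1, c), (r, c-1), (r, c+1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cur
                queue.append((nr, nc))
    return None  # goal unreachable

grid = [[0, 0, 0],
        [1, 1, 0],   # a wall with a gap on the right
        [0, 0, 0]]
path = bfs_path(grid, (0, 0), (2, 0))
print(len(path))  # 7 cells: the route around the wall
```

A scanned mesh like the one in the video would first be projected down into such a grid before planning.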
@GospodinJean (2 years ago)
Which software is that?
@matlabbe (2 years ago)
RTAB-Map for Google Tango, but now available on iOS (LiDAR required)
@sergioandresbarragallardo2447 (7 years ago)
Export in pcv or xyz format?
@matlabbe (7 years ago)
Currently, only PLY (binary) and OBJ (ASCII) export formats are available, though they should be readable in most software (e.g., MeshLab) for conversion to other formats.
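Since the OBJ export is plain ASCII, pulling an .xyz point list out of it only requires keeping the vertex lines; a minimal sketch (MeshLab does this and much more through its export dialog):

```python
# Minimal OBJ-to-XYZ conversion: keep only "v x y z" vertex records,
# dropping faces, normals, texture coordinates and comments.

def obj_to_xyz(obj_text):
    """Extract 'v x y z' vertex lines from OBJ text as xyz lines."""
    points = []
    for line in obj_text.splitlines():
        parts = line.split()
        if len(parts) >= 4 and parts[0] == "v":
            points.append(" ".join(parts[1:4]))
    return "\n".join(points)

# A tiny hypothetical OBJ fragment for illustration.
obj = """\
# example exported mesh
v 0.0 0.0 0.0
v 1.0 0.0 0.0
v 0.0 1.0 0.0
f 1 2 3
"""
print(obj_to_xyz(obj))  # three "x y z" lines; the face record is dropped
```

Note this loses the mesh connectivity and any per-vertex color, which is why a proper converter is preferable when those matter.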
@sergioandresbarragallardo2447 (7 years ago)
Thanks for the answer, the program is great. It would be wonderful to combine this potential with programs like ArchiCAD or Revit.
@moahammad1mohammad (4 years ago)
How are the scans so goo... oh, he's using a handheld scanner
@valeriavidanekalman3885 (3 years ago)
Freeman Google made a phone that uses 3D sensors to capture reality; it's called Tango. It's abandoned now, so they made Google ARCore, which is better for 3D scanning.
@widgity (4 years ago)
I assume this relies on Tango or ARCore? Such a shame Google killed Tango; I wish I had bought a device while they were available.
@mathieulabbe4889 (4 years ago)
Yeah, it works only on Tango phones (not ARCore). I use the Asus Zenfone AR, which can still be bought, I think (Best Buy in Canada still sells it at $800 CAD).
@widgity (4 years ago)
@@mathieulabbe4889 Huh, I thought ARCore was meant to be a direct replacement for Tango. I guess not. I played with the Zenfone and nearly bought one for about £200, but they are out of stock anywhere I can find them locally now. If I bought one now, I'd be worried it would come with ARCore instead of Tango.
@matlabbe (4 years ago)
@@widgity My Zenfone AR has both ARCore and Google Tango working
@oxpack (3 years ago)
That's not a cottage. Maybe a vacation home, but cottages are way smaller and have an old pair of skis on the wall.
@matlabbe (3 years ago)
It seems there is a debate around "cabin", "cottage" or "chalet" depending on where you live in Canada (www.narcity.com/life/canadians-cant-agree-on-whether-its-called-a-cottage-cabin-or-chalet). I am used to "chalet de ski" in French, but it seems the most common translation is "cottage". The renting site (www.chaletsalpins.ca/en/cottages-for-rent/) also calls them "cottages". They are all new luxurious constructions at the Stoneham ski resort just a little north of Québec City, but I agree with you: a "chalet" or "cottage" is in general a lot smaller and more rustic.
@kingarchnyc (3 years ago)
What software/app is this?
@matlabbe (3 years ago)
RTAB-Map: play.google.com/store/apps/details?id=com.introlab.rtabmap It is free and open source.
@kingarchnyc (3 years ago)
@@matlabbe Thank you 🙏🏿 Too bad I use an iPhone; this does not seem to have an iOS version... 🤦
@matlabbe (3 years ago)
@@kingarchnyc An iOS version will be released soon, but it will work only on iPhones/iPads with a LiDAR sensor.
@GirizdL (3 years ago)
@@matlabbe Dear Mathieu, I'm looking for a new phone, and the biggest dilemma for me is between the Asus Zenfone AR and the iPhone 12 Pro Max. I found some material about the LiDAR's object-scanning capability and was a little disappointed, and CrossPoint's house scanning didn't convince me that the iPhone is worth it. What do you think about using the old Tango tablets for this purpose? Are they as good as the Zenfone?
@matlabbe (3 years ago)
@@GirizdL I prefer Tango phones with ToF (Zenfone AR or Phab2 Pro) over the original Tango dev kits. The advantage of the iPhone is that it will be supported and get new apps over time, while nobody develops for Tango anymore.
@Hermiel (4 years ago)
Can this be made to work on the 2020 iPad?
@mathieulabbe4889 (4 years ago)
Yes, theoretically; it has the hardware. The problem is the software. I don't have confirmation yet whether ARKit lets us get the raw point cloud or depth image from the LiDAR (if someone knows, tell me! I am waiting for that before buying one). From what I see in the latest Xcode 11.4, we can get a mesh but not the point cloud/depth image (registered with the color camera). Maybe they will unlock that possibility in the future.
@Hermiel (4 years ago)
@@mathieulabbe4889 Can you use the facial scanner on the iPhone? www.fabbaloo.com/blog/2020/3/31/apples-new-lidar-ipad-disappoints-or-does-it
@mathieulabbe4889 (4 years ago)
@@Hermiel Maybe, but it is not very user friendly if you cannot see the screen while you scan... Note that I don't see a problem with the actual resolution of the LiDAR; it seems similar to the Zenfone AR or Huawei P30 Pro.
@GirizdL (3 years ago)
@@mathieulabbe4889 Will you publish your scanning software on the AppGallery for the Huawei P50 Pro or the P40 Pro?
@matlabbe (3 years ago)
@@GirizdL I am currently having some depth distortion issues on Huawei AR Engine with my latest build; I'll see what I can do.
@dior1992 (4 years ago)
What kind of accuracy did you manage to get on the measurements?
@matlabbe (4 years ago)
For a single room, it is easy to get only 1 to 3 cm of error. For large areas, it can be worse if loop closures are not detected (VIO drift not corrected).
@shuixing85 (3 years ago)
Can't wait to use this program on the iPhone 12 Pro 😀😀😀
@matlabbe (3 years ago)
2021 ;)
@bobpro583 (4 years ago)
Does it work on iPhone 11???
@matlabbe (4 years ago)
No, currently it works only on Tango-enabled phones (or selected Android phones with a rear ToF camera)
@matteo_petruz1435 (3 years ago)
@@matlabbe Where can I download the app?
@matlabbe (3 years ago)
@@matteo_petruz1435 play.google.com/store/apps/details?id=com.introlab.rtabmap or latest version: github.com/introlab/rtabmap/wiki/Installation#rtab-map-tango-apk