MooreBot Scout Bedroom Patrol
3:21
2 years ago
KR01 Rotate in Place Test 2
0:43
2 years ago
KR01 Rotate in Place Test 1
0:36
2 years ago
KD01 Moth Behaviour Test
3:17
3 years ago
KD01 Rotate in Place Test
1:32
3 years ago
KD01 Rotate in Place
0:37
3 years ago
Tuning the PID Controller
2:47
3 years ago
KR01 Telerobotic Joyride
2:29
3 years ago
KR01 VL53L1X Scanner
0:16
4 years ago
KR01 Obstacle Avoidance Demo
0:37
4 years ago
KRZ01 Motor Control Demo
0:58
4 years ago
Comments
@jennibgmailcom 11 days ago
You have the right voice to narrate a film noir.
@VigneshBalasubramaniam 1 month ago
The downside of these sensors is that they have to be pretty far away from the surface to track a fast-moving robot. One way of solving this is simply flipping the sensor to aim at the ceiling instead of the floor, which should be much farther away in most scenarios.
@NewZealandPersonalRobotics 1 month ago
The working range of the PMW3901 is 80mm to infinity, as it was (to my understanding) designed for use with flying drones. So it's fine on a Mars rover or other robot that's at least 80mm from the ground, but it can't be used any closer than that. Flipping them to look at the ceiling won't work either, as they are effectively a low-resolution camera that relies on detail in order to determine movement, and most ceilings have little to no detail, just an expanse of white paint. I lobbied Pimoroni to consider implementing a sister sensor, the PixArt PAA5100JE, which is almost identical to the PMW3901 except that its operating range was designed for something like a robot vacuum, 15-35mm, and they happily went ahead and developed the product as a Breakout Garden board. I described this in a blog post at: robots.org.nz/2021/07/09/the-paa5100je-near-optical-flow-sensor/ Now, there's still the range between the two sensors, i.e., between 35mm and 80mm, that isn't covered by either, and while it's possible to mount a PAA5100JE below a tall robot, one would be sacrificing ground clearance. As with many things in robotics, a compromise...
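The ranges quoted in this thread can be captured in a tiny helper for sanity-checking a mounting height against the two sensors. This is an illustrative sketch only; the function name is hypothetical and the numbers follow the comment above, not a datasheet.

```python
# Working ranges (mm) as quoted in the comment above:
# PAA5100JE: 15-35mm; PMW3901: 80mm to infinity.
SENSOR_RANGES = {
    "PAA5100JE": (15, 35),
    "PMW3901": (80, float("inf")),
}

def sensors_for_height(height_mm):
    """Return the sensors whose working range covers a given mounting height."""
    return [name for name, (lo, hi) in SENSOR_RANGES.items() if lo <= height_mm <= hi]

print(sensors_for_height(20))   # -> ['PAA5100JE']
print(sensors_for_height(50))   # -> [] (the uncovered 35-80mm gap)
print(sensors_for_height(120))  # -> ['PMW3901']
```

The empty result for the 35-80mm band is exactly the compromise the comment describes: neither sensor covers it.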
@zark4711 1 month ago
Looking forward to the adjustments mentioned in the description.
@puntabachata 1 month ago
How will it handle Martian dust that gets into and sticks to everything?
@guillermom1708 1 month ago
Read the title of the video again.
@NewZealandPersonalRobotics 1 month ago
Every robot has a set of requirements and an expected working environment. Some are indoors only, some outdoors only on sunny days, some can handle inclement weather or even work underwater. Some are designed for other planets. This is called a Mars rover because it's based on a Mars rover design, not because it's going to Mars. That may not be so obvious now, as this is early days, but eventually it will sport a rocker arm suspension and (perhaps) an additional pair of wheels.
@nishanthpatil4531 1 month ago
How will it navigate uneven terrain?
@NewZealandPersonalRobotics 1 month ago
The MR01 is being built in stages, both to spread out the cost and so that everything doesn't have to be tackled at once. Phase 0 (the current state) is a test bed to get the robot up and moving, as none of my previous robots had four wheel independent steering and motor control. The robot doesn't currently have any suspension system at all; the steering servos are mounted directly to the chassis. It is only expected to run indoors and on a wooden deck, maybe on a grassy field. Phase 1 will add an aft rocker assembly, larger off-road wheels, and motor arm mounts so that the steering axes will line up with the centerline of each wheel, and the robot will gain significant ground clearance. The front two steering assemblies will still be mounted directly to the chassis. Phase 2 will add a port and starboard pair of rockers and steering/motor assemblies to make it a "proper" six wheel Mars rover with a triple-rocker suspension, similar to the triple-rocker suspension of the ExoMars rover, not the NASA/JPL rovers, which have a rocker-bogie suspension.
@Simon_Rafferty 3 months ago
Targeting a drone in the event of jamming is way easier than using AI. You can use an off-the-shelf optical flow sensor such as the PMW3901 tied to the flight controller to keep it flying towards the same object if it loses contact with the controller. So long as the target is kept in the centre of view during the approach, it will continue towards even a moving target. All for $10 worth of hardware and a bit of easy software.
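The "keep the target in the centre of view" idea above amounts to a proportional controller on the sensor's x reading. A minimal sketch, where the function name, gain, and limit are illustrative assumptions rather than tuned values:

```python
def steer_correction(flow_x, gain=0.01, limit=0.5):
    """Proportional steering command from an optical flow x reading.
    A non-zero flow_x means the tracked scene is drifting sideways in the
    sensor's view; the command steers to cancel that drift, clamped to
    +/- limit. gain and limit are placeholder values, not tuned ones."""
    return max(-limit, min(limit, gain * flow_x))

print(steer_correction(0))     # no drift -> 0 correction
print(steer_correction(1000))  # large drift -> clamped to 0.5
```

In a real flight controller this command would feed the yaw or roll setpoint each frame; here it only shows the shape of the loop.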
@memelab2007 11 months ago
Cool demo. Are you processing this with an Arduino, and if so what library are you using? I have a version of this chip on a breakout board and am not having any luck initializing it. Thanks.
@theredstormer8078 3 months ago
The robot has a Raspberry Pi onboard, so I would assume it uses that.
@shivam4499 1 year ago
Do you know what the angle of the sensed light cone is? I couldn't find that information in the datasheet. I plan to use this similarly, with a cutout near the lens, and want to make sure the opening is big enough (or close enough) to not interfere with the light.
@HanClinto 2 years ago
Have you found any optical flow sensors that include rotation as one of the values returned? It's annoying to me that we have to get rotation out of a gyro or other means -- it seems like optical flow algorithms should be able to return a rotation delta just as well as they do X and Y, but so far all of the ASICs I've run across only return X and Y.
@NewZealandPersonalRobotics 2 years ago
No, and there must be some good technical reason for that. I've viewed the tiny black and white movie that gets created, so I get your point, but even the significantly more expensive sensors don't return a rotation delta. It must be problematic to interpret image rotation in real time.
@ai-matt 13 days ago
You could have two of them and extrapolate the rotation based on the x/y change delta between them.
@HanClinto 13 days ago
@@ai-matt That's a really good idea -- I like that!
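The two-sensor idea in this thread can be sketched with a little rigid-body geometry: under a small rotation, two sensors mounted a known baseline apart see equal and opposite x motion, while common translation shows up in their average. The mounting layout and sign conventions below are assumptions for illustration, not a drop-in implementation.

```python
def pose_delta(dx_left, dx_right, dy_left, dy_right, baseline_mm):
    """Estimate per-frame heading change and translation from two optical
    flow sensors mounted baseline_mm apart, left and right of centre.
    Small-angle approximation: rotation is the difference of the x
    readings over the baseline; translation is their average."""
    dtheta = (dx_right - dx_left) / baseline_mm  # radians, small-angle
    dx = (dx_left + dx_right) / 2.0
    dy = (dy_left + dy_right) / 2.0
    return dtheta, dx, dy

# Pure translation: both sensors agree, so no rotation is inferred.
print(pose_delta(3.0, 3.0, 1.0, 1.0, 100.0))   # (0.0, 3.0, 1.0)
# Pure rotation: equal and opposite x readings across a 100mm baseline.
print(pose_delta(-1.0, 1.0, 0.0, 0.0, 100.0))  # (0.02, 0.0, 0.0)
```

A wider baseline improves the rotation resolution, since the same angular change produces a larger difference in the two readings.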
@akumal5819 2 years ago
Does this flow sensor work on a plain or less-textured floor? Are you still using this sensor for odometry?
@NewZealandPersonalRobotics 2 years ago
I've done some fairly extensive testing on the PixArt PAA5100JE, a sister to this sensor designed for close distances, more suited to ground-based robots. I'd suggested to Pimoroni that it would be better than the PMW3901 (which is really designed for flying drones) and they're now selling it, which is great. What I've found is that you can calibrate to fairly accurate distance tracking on a single floor type, but if your robot goes from, say, grass to concrete to a wood floor, the numbers sent back by the sensor will vary depending on the surface, basically on surface complexity. It does still work on a "plain" floor if it has some kind of surface irregularity. On a smooth, featureless tile its camera isn't able to discern detail, but you can help it out by adding additional white LEDs underneath to supplement the ones it has. But in short, it works better on surfaces that have *some* texture.
@akumal5819 2 years ago
@@NewZealandPersonalRobotics I already bought the PAA5100JE, which is only for close distances. Now I have the PMW3901, and I'm going to track the ceiling. I haven't installed it on the robot yet. Thank you very much for the reply.
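The surface dependence described above is commonly handled with a per-surface calibration: drive a known distance on each floor type, total the raw counts, and derive a scale factor. The function names and all numbers below are invented for illustration.

```python
def scale_factor(total_counts, known_distance_mm):
    """Counts-per-millimetre calibration for one surface type,
    derived from a test run over a known distance."""
    return total_counts / known_distance_mm

def counts_to_mm(counts, factor):
    """Convert raw optical flow counts to millimetres travelled."""
    return counts / factor

# Hypothetical calibration run: 2500 counts over a measured 1 m on wood.
wood = scale_factor(2500, 1000.0)
print(counts_to_mm(1250, wood))  # -> 500.0 mm
```

In practice you would keep one factor per surface and switch between them (or accept the error), which is exactly the limitation the comment describes.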
@CyberPilot360 3 years ago
Pretty sweet. Did you use the LED_N pin for syncing the LED with the flow update rate?
@NewZealandPersonalRobotics 3 years ago
Thanks! If you mean the RGB LED, no. The code for the PMW3901 sensor doesn't send any data at all when it doesn't sense movement (so there's no default data rate), then spits out x,y coordinate data when it does. I'm just listening for this data in a loop and updating the RGB LED as the data comes in.
@chrisn1237 3 years ago
Please share how you did the video streaming. Did you use MJPEG or H.264?
@NewZealandPersonalRobotics 3 years ago
I've reposted the actual Python code used at: github.com/ifurusato/ros/blob/master/lib/video.py The format is set on camera.start_recording() as "mjpeg".
@dmitrytyugin6261 3 years ago
Did you use the PMW3901 for odometry? How much drift did you get per meter? I'd very much appreciate any practical information about the PMW3901.
@NewZealandPersonalRobotics 3 years ago
The KR01 is using Hall effect encoders on the motor shafts (closed-loop) in order to measure and set the velocity of the motors. This allows the robot to drive in a straight line despite the fact that the port motors rotate opposite to the starboard motors, and DC motors are generally more efficient (and faster) in one direction than the other. So to answer your question: no, the PMW3901 is only used to determine motion in front of its sensor lens, and since it's really designed for flying drones, I'm unfortunately not using it on this robot at all. Upon request from myself and several others, I believe Pimoroni are going to release a new Optical Flow Sensor based on the PixArt PAA5100JE-Q Optical Tracking Chip, which is tuned for sensing floors and close-range surfaces with a range of 15-35mm (whereas the PMW3901 has a range of about 80mm to infinity). I'm looking into using it for a mecanum-wheeled robot, where Hall effect encoders on the motor shafts would be rather useless, or at least extremely difficult to use, since the wheels are slipping all the time.
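For comparison with the optical flow approach, the encoder-based odometry described above reduces to a tick-to-velocity conversion feeding the motor controller. A sketch with placeholder constants; the ticks-per-revolution and wheel-diameter values are assumptions, as the real numbers depend on the actual motor, gearbox, and wheels.

```python
import math

def wheel_velocity_mm_s(ticks, dt_s, ticks_per_rev=494.0, wheel_diameter_mm=100.0):
    """Wheel surface velocity in mm/s from Hall effect encoder ticks
    counted over dt_s seconds. The default constants are illustrative
    placeholders, not the KR01's real values."""
    revolutions = ticks / ticks_per_rev
    distance_mm = revolutions * math.pi * wheel_diameter_mm
    return distance_mm / dt_s

# One full revolution per second of a 100mm wheel -> pi * 100 mm/s.
print(round(wheel_velocity_mm_s(494, 1.0), 1))  # -> 314.2
```

In a closed loop, this measured velocity is compared against the setpoint each control cycle, which is what lets the port and starboard sides match speed despite rotating in opposite directions.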
@bobcookdev 3 years ago
Looks like a fantastic project and great explanation about what is going on with your display. Curious to know more about the LCD on the robot itself, what does it show you?
@NewZealandPersonalRobotics 3 years ago
Hi Bob, thanks very much for the reply. The display on the robot is an Adafruit 2" 320x240 pixel TFT LCD display, very small but still readable. When the robot's OS fires up it shows up as a terminal display, and I use a small Bluetooth keyboard that types directly into the terminal. In the video I'm running htop, a nicer version of top, the Linux system monitor. It shows me the status of each of the four cores on the Raspberry Pi, average system load, and memory usage, and the lower part shows a table of running processes.