Making an app was uncharted territory for us as a research group, and would not have been possible without my wonderful collaborators: Zeyu (Michael) Bian, @venkyp.bsky.social, @haritheja.bsky.social, @eneserciyes.bsky.social, @notmahi.bsky.social and @lerrelpinto.com.
26.02.2025 16:49
AnySense is built to empower researchers, engineers, and developers with better tools for sensor-based AI. Our code is fully open-source and can be found on
Github: github.com/NYU-robot-le...
Website: anysense.app
26.02.2025 16:49
Why does this matter? Here, we use AnySense to scale data and train visuo-tactile policies using Robot Utility Models (robotutilitymodels.com) for a whiteboard erasing task. With AnySense-enabled live streaming, you can just plug your iPhone into your robot and seamlessly deploy your policies!
26.02.2025 16:49
Need to connect external sensors? No problem! AnySense supports data streaming over Bluetooth at the press of a button! Here's a visualization of data collected by connecting the AnySkin (any-skin.github.io) tactile sensor with AnySense via Bluetooth.
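As a rough illustration of what handling such a Bluetooth stream can look like, here is a minimal Python sketch that decodes one notification payload into per-axis readings. The 15-value float32 layout, `decode_packet`, and `N_VALUES` are illustrative assumptions, not AnySense's or AnySkin's actual packet format.

```python
import struct

# Hypothetical payload layout: 15 little-endian float32 values
# (e.g., 5 magnetometers x 3 axes) per Bluetooth notification.
# The real packet format may differ.
N_VALUES = 15

def decode_packet(payload: bytes):
    """Unpack one notification payload into a flat list of readings."""
    if len(payload) != 4 * N_VALUES:
        raise ValueError(f"expected {4 * N_VALUES} bytes, got {len(payload)}")
    return list(struct.unpack("<" + "f" * N_VALUES, payload))

# Round-trip example with synthetic data
packet = struct.pack("<" + "f" * N_VALUES, *range(N_VALUES))
print(decode_packet(packet))  # floats 0.0 .. 14.0
```

In practice the decoded values would be timestamped on arrival and appended to the recording alongside the camera streams.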
26.02.2025 16:49
Try AnySense right now: apps.apple.com/us/app/anyse...
Even if you don't have external sensors, you can start using AnySense immediately to record and stream:
✅ RGB + Depth + Pose data
✅ Audio from the iPhone mic or custom contact microphones
✅ Seamless Bluetooth integration for external sensors
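Once streams like these are recorded, a common first processing step is aligning them on a shared timeline, since cameras and external sensors tick at different rates. Below is a minimal, hypothetical Python sketch of nearest-timestamp alignment between a slower camera stream and a faster tactile stream; `align_nearest` and the sample rates are illustrative assumptions, not part of AnySense's API.

```python
import bisect

def align_nearest(ref_ts, other_ts):
    """For each reference timestamp, return the index of the
    nearest timestamp in the sorted other stream."""
    idxs = []
    for t in ref_ts:
        i = bisect.bisect_left(other_ts, t)
        if i == 0:
            idxs.append(0)
        elif i == len(other_ts):
            idxs.append(len(other_ts) - 1)
        else:
            # pick whichever neighbor is closer in time
            idxs.append(i if other_ts[i] - t < t - other_ts[i - 1] else i - 1)
    return idxs

# Example: ~30 Hz camera timestamps vs. ~100 Hz tactile timestamps
camera_ts = [0.000, 0.033, 0.066]
tactile_ts = [0.000, 0.010, 0.020, 0.030, 0.040, 0.050, 0.060, 0.070]
print(align_nearest(camera_ts, tactile_ts))  # -> [0, 3, 7]
```

Indexing the faster stream with the returned indices yields one tactile reading per camera frame, the usual input shape for visuo-tactile policy learning.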
26.02.2025 16:49
Ever struggled with multi-sensor data from cameras, depth sensors, and other custom sensors? Meet AnySense: an iPhone app for effortless data acquisition and streaming. Working with multimodal sensor data will never be a chore again!
26.02.2025 16:49
P3-PO is a great example of how simple human priors can facilitate significantly better generalizability for robot policies.
10.12.2024 20:48
BAKU is fully open source and surprisingly effective. We found it easily adaptable to a host of visuotactile tasks; see visuoskin.github.io
10.12.2024 18:23
Robot utility models are not just among the first learned models that work zero-shot on a mobile manipulator, but also provide a nuanced discussion on what works and what doesn't in data-driven robot learning.
09.12.2024 16:54
I'll be presenting AnySkin at the Stanford Center for Design Research today at 2pm! Stop by for a chat and try the sensor out!
More info: any-skin.github.io
25.11.2024 18:15
I just joined bluesky, and would love to connect with folks interested in embodied AI and robotics. I am a postdoctoral researcher at NYU working at the intersection of sensing, machine learning and robotics. Hit me up if you'd like to chat!
More about my research: raunaqbhirangi.github.io
22.11.2024 17:50