# Overview

<figure><img src="https://2932036210-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FnY2oefBLnx298VvOh2aa%2Fuploads%2Fgit-blob-02ffb3fe6e8dedba20571cec6d0a65519af6b19a%2Fbanner.jpg?alt=media" alt=""><figcaption></figcaption></figure>

MagiClaw is a versatile two-finger end-effector that can be hand-held by human operators or mounted on a robotic arm.

It captures comprehensive action data (including kinematic motion, force, and contact deformation) from real-world demonstrations. The fingers use soft, parallel-mechanism-driven designs, each housing a miniature camera for vision-based perception of contact deformation.

An iPhone with integrated LiDAR provides additional environmental sensing, and a custom iOS application processes all of the sensor streams.

{% embed url="https://www.youtube.com/embed/QV0aQd1CYwY?si=EGqAmccfXgsnD4hm" %}

## Citation

Please cite MagiClaw if you use it in your projects or publications:

```tex
@misc{wu2025magiclaw,
  title={MagiClaw: A Dual-Use, Vision-Based Soft Gripper for Bridging the Human Demonstration to Robotic Deployment Gap},
  author={Tianyu Wu and Xudong Han and Haoran Sun and Zishang Zhang and Bangchao Huang and Chaoyang Song and Fang Wan},
  year={2025},
  eprint={2509.19169},
  archivePrefix={arXiv},
  primaryClass={cs.RO},
  url={https://arxiv.org/abs/2509.19169},
}
```
