InfraredTags: Invisible AR Markers & Barcodes Using Low-Cost, Infrared-Based 3D Printing & Imaging Tools

M. Doga Dogan, Ahmad Taka, Michael Lu, Yunyi Zhu, Akshat Kumar, Aakar Gupta, Stefanie Mueller
2022 ACM CHI Conference on Human Factors in Computing Systems
Best Demo Honorable Mention

Existing approaches for embedding unobtrusive tags inside 3D objects require either complex fabrication or high-cost imaging equipment. We present InfraredTags, which are 2D codes and markers imperceptible to the naked eye that can be 3D printed as part of objects, and detected rapidly by low-cost near-infrared cameras. InfraredTags achieve this by being printed from an infrared-transmitting filament which infrared cameras can see through, and by having air gaps inside for the tag’s bits which infrared cameras capture as darker pixels in the image. We built a user interface that facilitates the integration of common tags (QR codes, ArUco markers) with the object geometry to make them 3D printable as InfraredTags. We also developed a low-cost infrared imaging module that augments existing mobile devices and decodes tags using our image processing pipeline. We demonstrate how our method enables applications, such as object tracking and embedding metadata for augmented reality and tangible interactions.
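
For a sense of how the decoding side can be approached, below is a minimal sketch (not the paper's actual pipeline) that enhances a near-infrared frame and reads an embedded QR code with OpenCV; the file name and filter parameters are illustrative assumptions.

```python
# Minimal sketch (not the authors' pipeline): enhance a near-infrared frame
# and decode an embedded QR code with OpenCV. File name and CLAHE/blur
# parameters are illustrative assumptions.
import cv2

def decode_infrared_tag(path="nir_frame.png"):
    # Load the near-infrared capture as a single-channel image.
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    if gray is None:
        raise FileNotFoundError(path)

    # Air gaps behind the IR-transmitting shell appear as darker pixels;
    # boost local contrast so the code modules separate from the surface.
    clahe = cv2.createCLAHE(clipLimit=3.0, tileGridSize=(8, 8))
    enhanced = clahe.apply(gray)
    enhanced = cv2.medianBlur(enhanced, 3)  # suppress sensor noise

    # Try to read a QR code from the enhanced frame.
    data, points, _ = cv2.QRCodeDetector().detectAndDecode(enhanced)
    return data if data else None

if __name__ == "__main__":
    print(decode_infrared_tag() or "no tag found")
```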

[project page] [doi] [paper] [video] [talk]
Featured on Popular Science, New Scientist, and MIT News.

SensiCut: Material-Aware Laser Cutting Using Speckle Sensing and Deep Learning

M. Doga Dogan, Steven Vidal Acevedo Colon, Varnika Sinha, Kaan Akşit, Stefanie Mueller
2021 ACM User Interface Software and Technology Symposium (UIST)

Laser cutter users face difficulties distinguishing between visually similar materials. This can lead to problems, such as using the wrong power/speed settings or accidentally cutting hazardous materials. To support users in identifying the sheets, we present SensiCut, a material sensing platform for laser cutters. In contrast to approaches that detect the appearance of the material with a conventional camera, SensiCut identifies the material by its surface structure using speckle sensing and deep learning. SensiCut comes with a compact hardware add-on for the laser cutter and a user interface that integrates material sensing into the cutting workflow. In addition to improving the traditional workflow, SensiCut enables new applications, such as automatically partitioning the design when engraving on multi-material objects or adjusting the shape of the design based on the kerf of the identified material. We evaluate SensiCut’s accuracy for different types of materials under different conditions, such as varying illumination and sheet orientations.
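
To illustrate the speckle-classification idea, here is a minimal sketch that fine-tunes a stock pretrained CNN from torchvision on a folder of labeled speckle images; it is not the paper's trained model, and the dataset layout, architecture, and hyperparameters are placeholders.

```python
# Minimal sketch, not the paper's model: a speckle-image material classifier
# built by fine-tuning a pretrained ResNet. The dataset layout
# ("speckle_dataset/<material_name>/*.png") and hyperparameters are assumptions.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

def build_material_classifier(num_materials: int) -> nn.Module:
    model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    model.fc = nn.Linear(model.fc.in_features, num_materials)  # new class head
    return model

def train(data_dir="speckle_dataset", epochs=5):
    tfm = transforms.Compose([
        transforms.Grayscale(num_output_channels=3),  # speckle images are monochrome
        transforms.Resize((224, 224)),
        transforms.ToTensor(),
    ])
    ds = datasets.ImageFolder(data_dir, transform=tfm)
    loader = DataLoader(ds, batch_size=32, shuffle=True)
    model = build_material_classifier(num_materials=len(ds.classes))
    opt = torch.optim.Adam(model.parameters(), lr=1e-4)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for images, labels in loader:
            opt.zero_grad()
            loss = loss_fn(model(images), labels)
            loss.backward()
            opt.step()
    return model, ds.classes
```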

[project page] [doi] [paper] [video] [talk]
Featured on The Next Web, Photonics.com, and MIT News.

G-ID: Identifying 3D Prints Using Slicing Parameters

M. Doga Dogan, Faraz Faruqi, Andrew Day Churchill, Kenneth Friedman, Leon Cheng, Sriram Subramanian, Stefanie Mueller
2020 ACM CHI Conference on Human Factors in Computing Systems

G-ID is a method that utilizes the subtle patterns left by the 3D printing process to distinguish and identify objects that otherwise look similar to the human eye. The key idea is to mark different instances of a 3D model by varying slicing parameters that do not change the model geometry but can be detected as machine-readable differences in the print. As a result, G-ID does not add anything to the object but exploits the patterns appearing as a byproduct of slicing, an essential step of the 3D printing pipeline. We introduce the G-ID slicing & labeling interface that varies the settings for each instance, and the G-ID mobile app, which uses image processing techniques to retrieve the parameters and their associated labels from a photo of the 3D printed object. Finally, we evaluate our method’s accuracy under different lighting conditions, when objects were printed with different filaments and printers, and with pictures taken from various positions and angles.
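
The labeling idea can be illustrated with a short sketch that assigns each instance a unique combination of slicing parameters; the parameter values below are illustrative assumptions, not the tool's actual settings.

```python
# Minimal sketch of the labeling idea (not the authors' tool): give each
# instance of a model a distinct combination of slicing parameters that leaves
# the geometry unchanged but is detectable in the print. Values are assumptions.
from itertools import product

# Candidate settings that alter surface/infill texture, not the 3D shape.
INFILL_ANGLES_DEG = [0, 30, 60, 90, 120, 150]   # rotation of the infill pattern
BOTTOM_LINE_ANGLES = [0, 45, 90, 135]           # orientation of bottom-surface lines
LAYER_HEIGHTS_MM = [0.15, 0.20, 0.25]           # changes visible layer striation

def assign_labels(instance_names):
    """Map each instance name to a distinct slicing-parameter combination."""
    combos = product(INFILL_ANGLES_DEG, BOTTOM_LINE_ANGLES, LAYER_HEIGHTS_MM)
    labels = {}
    for name, (infill, bottom, layer) in zip(instance_names, combos):
        labels[name] = {
            "infill_angle_deg": infill,
            "bottom_line_angle_deg": bottom,
            "layer_height_mm": layer,
        }
    if len(labels) < len(instance_names):
        raise ValueError("not enough unique parameter combinations")
    return labels

if __name__ == "__main__":
    for name, params in assign_labels([f"key_{i}" for i in range(5)]).items():
        print(name, params)
```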

[project page] [doi] [paper] [video] [talk]
Featured on 3DPrint.com, Hackster.io, and ITmedia (Japanese).

DefeXtiles: 3D Printing Quasi-Woven Fabric via Under-Extrusion

Jack Forman, Mustafa Doga Dogan, Hamilton Forsythe, Hiroshi Ishii
2020 ACM User Interface Software and Technology Symposium (UIST)
Best Demo Honorable Mention

We present DefeXtiles, a rapid and low-cost technique to produce tulle-like fabrics on unmodified fused deposition modeling (FDM) printers. The under-extrusion of filament is a common cause of print failure, resulting in objects with periodic gap defects. In this paper, we demonstrate that these defects can be finely controlled to quickly print thinner, more flexible textiles than previous approaches allow. Our approach allows hierarchical control from micrometer structure to decameter form and is compatible with all common 3D printing materials. In this paper, we introduce the mechanism of DefeXtiles and establish the design space through a set of primitives with detailed workflows. We demonstrate the interactive features and new use cases of our approach through a variety of applications, such as fashion design prototyping, interactive objects, aesthetic patterning, and single-print actuators.
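
As a rough illustration of controlled under-extrusion, the sketch below emits G-code for a single pass of a thin wall with a deliberately low flow multiplier; the numbers are assumptions, not the paper's calibrated settings.

```python
# Minimal sketch (not the authors' implementation): emit G-code for one layer
# of a single-line wall while deliberately under-extruding, so the deposited
# filament forms a sparse, fabric-like mesh of strands and gaps. Flow
# multiplier, nozzle/filament diameters, and feed rate are assumptions.
import math

NOZZLE_DIAMETER = 0.4      # mm
FILAMENT_DIAMETER = 1.75   # mm
LAYER_HEIGHT = 0.2         # mm
FLOW_MULTIPLIER = 0.3      # << 1.0: deliberate under-extrusion

def extrusion_length(path_length_mm):
    """Filament length (mm) to feed for a move, scaled down to under-extrude."""
    bead_area = NOZZLE_DIAMETER * LAYER_HEIGHT
    filament_area = math.pi * (FILAMENT_DIAMETER / 2) ** 2
    return FLOW_MULTIPLIER * bead_area * path_length_mm / filament_area

def wall_layer_gcode(x0=10.0, y0=10.0, width=60.0, z=0.2, feed=1200):
    """One back-and-forth pass of a single-line wall; stacking such layers
    yields a thin, gap-riddled sheet."""
    lines = [f"G1 Z{z:.2f} F{feed}", f"G1 X{x0:.2f} Y{y0:.2f} F{feed}"]
    e = 0.0
    for x in (x0 + width, x0):
        e += extrusion_length(width)
        lines.append(f"G1 X{x:.2f} Y{y0:.2f} E{e:.4f} F{feed}")
    return "\n".join(lines)

if __name__ == "__main__":
    print(wall_layer_gcode())
```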

[project page] [doi] [paper] [video] [talk]
Featured on Gizmodo and MIT News.

FoldTronics: Creating 3D Objects with Integrated Electronics Using Foldable Honeycomb Structures

Junichi Yamaoka, Mustafa Doga Dogan, Katarina Bulovic, Kazuya Saito, Yoshihiro Kawahara, Yasuaki Kakehi, Stefanie Mueller
2019 ACM CHI Conference on Human Factors in Computing Systems

FoldTronics is a 2D-cutting-based fabrication technique to integrate electronics into 3D folded objects. The key idea is to cut and perforate a 2D sheet to make it foldable into a honeycomb structure using a cutting plotter; before folding the sheet into a 3D structure, users place the electronic components and circuitry onto the sheet. The fabrication process takes only a few minutes, enabling users to rapidly prototype functional interactive devices. The resulting objects are lightweight and rigid, thus allowing for weight-sensitive and force-sensitive applications. Finally, due to the nature of the honeycomb structure, the objects can be folded flat along one axis and thus can be efficiently transported in this compact form factor. We describe the structure of the foldable sheet, and present a design tool that enables users to quickly prototype the desired objects. We showcase a range of examples made with our design tool, including objects with integrated sensors and display elements.
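
As a toy illustration of generating plotter-ready patterns, the sketch below writes an SVG with alternating mountain/valley score lines for a honeycomb strip; the geometry and colour convention are assumptions and omit the slits and perforations a complete pattern would need.

```python
# Illustrative sketch only (not the paper's design tool): generate a 2D score
# pattern for a cutting plotter as an SVG, with alternating mountain/valley
# fold lines for collapsing a strip into honeycomb cells. Cell size, sheet
# size, and the colour convention are assumptions.
def honeycomb_pattern_svg(cell_width=10.0, rows=6, sheet_height=100.0):
    lines = []
    for i in range(1, rows * 2):
        x = i * cell_width / 2.0
        # Alternate mountain (red) and valley (blue) scores; many plotters
        # map stroke colours to different cut/score settings.
        colour = "red" if i % 2 else "blue"
        lines.append(
            f'<line x1="{x:.1f}" y1="0" x2="{x:.1f}" y2="{sheet_height:.1f}" '
            f'stroke="{colour}" stroke-dasharray="2,2"/>'
        )
    width = rows * cell_width
    return (
        f'<svg xmlns="http://www.w3.org/2000/svg" '
        f'width="{width:.1f}mm" height="{sheet_height:.1f}mm" '
        f'viewBox="0 0 {width:.1f} {sheet_height:.1f}">\n  '
        + "\n  ".join(lines)
        + "\n</svg>"
    )

if __name__ == "__main__":
    with open("honeycomb_scores.svg", "w") as f:
        f.write(honeycomb_pattern_svg())
```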

[project page] [doi] [paper] [video] [talk]
Featured on Hackster.io.

Magnetically Actuated Soft Capsule Endoscope for Fine-Needle Aspiration

Donghoon Son, Mustafa Doga Dogan, Metin Sitti
2017 IEEE International Conference on Robotics and Automation (ICRA)
Max Planck Institute for Intelligent Systems
Best Medical Robotics Paper Award Nomination

This paper presents a magnetically actuated soft capsule endoscope for fine-needle aspiration biopsy (B-MASCE) in the upper gastrointestinal tract. A thin and hollow needle is attached to the capsule, which can penetrate deeply into tissues to obtain a subsurface biopsy sample. The design utilizes a soft elastomer body as a compliant mechanism to guide the needle. An internal permanent magnet provides a means for both actuation and tracking. The capsule is designed to roll towards its target and then deploy the biopsy needle in a precise location selected as the target area. B-MASCE is controlled by multiple custom-designed electromagnets, while its position and orientation are tracked by a magnetic sensor array.
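
For context on how such magnetic actuation and tracking is typically modeled, here is a minimal sketch of the standard point-dipole field equation; it is not the paper's controller or sensor-fusion code, and the magnet's moment and sensor position are assumed values.

```python
# Minimal sketch of the point-dipole field model commonly used for magnetic
# actuation and localization (not the paper's specific controller or tracker).
# The magnetic moment and sensor position are illustrative assumptions.
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability, T*m/A

def dipole_field(moment: np.ndarray, r: np.ndarray) -> np.ndarray:
    """Magnetic flux density (Tesla) at displacement r from a point dipole."""
    r_norm = np.linalg.norm(r)
    r_hat = r / r_norm
    return MU0 / (4 * np.pi * r_norm**3) * (3 * np.dot(moment, r_hat) * r_hat - moment)

if __name__ == "__main__":
    m = np.array([0.0, 0.0, 0.05])           # capsule magnet's moment (A*m^2), assumed
    sensor_pos = np.array([0.0, 0.05, 0.1])  # sensor location relative to capsule (m)
    print(dipole_field(m, sensor_pos))       # field a sensor-array element would read
```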

[doi] [pdf] [video]
Featured on Engadget and IEEE Spectrum.