Art && Code 3D: A Festival for DIY and Arts-Oriented 3D Sensing and Visualization. You know... Kinect Hacking!

5B: Kinect-Based MoCap for Flash and After Effects

Animators — create your own 3D motion capture studio in your living room! This introductory 2-hour workshop by animator Nick Fox-Gieg will get you started using the Kinect for inexpensive, DIY, full-body motion capture. With little or no programming, you’ll be able to acquire OpenNI “skeleton” data and use it to rig characters and drive other movements in Adobe Flash and/or After Effects.

Recommended levels: Introductory & Intermediate.
Recommended hardware/software: Please bring a Mac laptop running OS X Snow Leopard, with Flash or After Effects CS4 or later. A limited number of Kinect sensors will be available to borrow, but please bring your own if possible.


5A: Using the Kinect with Pure Data

Dancing with Technology: The Kinect and the Temporal Lobe

This 2-hour workshop with Sofy Yuditskaya will get you started using the Kinect for gestural interaction in Pure Data, a free, open-source, cross-platform toolkit for “patching-based” arts engineering. Assisted by Dan Wilcox.

Pd (aka Pure Data) is a real-time graphical programming environment for high-quality audio, video, and graphics processing — considered by many to be a free alternative to Max/MSP. Prior experience with Pure Data is not required for this workshop, but some experience with “patching-based” programming environments (like Max/MSP, VVVV, or Grasshopper) is a plus. We will use the Kinect as an input mechanism to control objects and mechanisms in the physical world, with a focus on the spatial and the performative.

Recommended levels: Introductory & Intermediate.
Recommended hardware/software: Mac OS X Snow Leopard or Windows Vista. A limited number of Kinect sensors will be available to borrow, but please bring your own if possible.


Mini Maker Faire

We are providing complimentary shuttles to and from Pittsburgh’s first Mini Maker Faire, happening at the Children’s Museum of Pittsburgh on the North Side. You must have an Art && Code 3D badge in order to ride the shuttle.

Shuttles depart at 2pm and 4pm. Return shuttles leave the museum at 3:30pm and 5:30pm. Please reserve your shuttle seat in advance by signing up at Registration, Breakfast or Lunch.

Admission to Mini Maker Faire is included in the price of Museum admission:
$12 for adults / $11 for children 2-18 and seniors / Free for children under 2

The Faire is presented by HackPittsburgh. For more information, please see the Pittsburgh Mini Maker Faire website.


Lunch (Sunday)

Tasty lunches will be served (on Saturday and Sunday) to participants who have reserved a lunch ticket in advance. Vegetarian meals are available; please request this when you register. Please note: lunch tickets cost $10 and are not included in the General Registration fee.

Good food ends with good talk.
– Geoffrey Neighor, 1993


4B: Using the Kinect with Max/MSP/Jitter

In this workshop, presented by Joshua Goldberg, you’ll experiment with the Microsoft Kinect sensor in Max/MSP/Jitter, a powerful environment for arts engineering and interactive sound. The Kinect becomes available to the Max environment with the help of Jean-Marc Pelletier’s free jit.freenect.grab external. You’ll be able to:

  • Retrieve RGB images.
  • Retrieve depth maps.
  • Retrieve images from the infrared camera.
  • Retrieve accelerometer readings.
  • Control the Kinect’s tilt motor.
  • Use multiple Kinects simultaneously.
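A note on the depth maps: the Kinect reports raw 11-bit sensor values, not metric distances. A widely used conversion is an empirical approximation from the OpenKinect community; the sketch below is illustrative Python (not Max code), and its coefficients are community-derived approximations rather than calibrated values.

```python
def raw_depth_to_meters(raw):
    """Convert a raw 11-bit Kinect depth reading to meters.

    Empirical approximation popularized by the OpenKinect community;
    the coefficients are approximate, not per-device calibrated values.
    """
    if raw >= 2047:  # 2047 is the sensor's "no reading" sentinel
        return float("nan")
    return 1.0 / (raw * -0.0030711016 + 3.3309495161)

# Under this approximation, raw = 0 corresponds to roughly 0.3 m,
# rising to several meters as raw approaches the top of its range.
```

In Max itself, the equivalent per-pixel arithmetic can be performed on the depth matrix with objects like jit.op.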

Max is a visual programming language for music and multimedia developed and maintained by San Francisco-based software company Cycling ’74. During its more than 20-year history, it has been widely used by composers, performers, software designers, researchers, and artists for creating innovative recordings, performances, and installations.

Recommended levels: Introductory & Intermediate.
Recommended hardware/software: Please bring a Mac laptop running OS X 10.5 or later, with Max 5 or later. A limited number of Kinect sensors will be available to borrow, but please bring your own if possible.


4A: Calibrating Projectors and Cameras: Practical Tools

“Camera calibration” is the process of deriving the numeric parameters of the camera (such as focal length, principal point, and lens distortion) that produced a given photograph or video. Calibrating devices together is essential for augmented reality (AR) and other activities (like augmented projection) in which light must be accurately projected onto geometries observed by cameras.
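To make those parameters concrete, here is a minimal Python sketch of the standard pinhole model with two radial distortion terms, the model most calibration tools (OpenCV among them) estimate. It is an illustration under common conventions, not the workshop’s own code: fx and fy are focal lengths in pixels, (cx, cy) is the principal point, and k1, k2 are radial distortion coefficients.

```python
def project_point(X, Y, Z, fx, fy, cx, cy, k1=0.0, k2=0.0):
    """Project a camera-space 3D point to pixel coordinates using the
    pinhole model with two radial distortion terms."""
    x, y = X / Z, Y / Z                 # normalized image coordinates
    r2 = x * x + y * y                  # squared radius from the optical axis
    d = 1 + k1 * r2 + k2 * r2 * r2      # radial distortion factor
    return fx * x * d + cx, fy * y * d + cy

# With zero distortion, a point on the optical axis lands on the
# principal point:
# project_point(0, 0, 2, 500, 500, 320, 240)  ->  (320.0, 240.0)
```

Calibration is the inverse problem: given many observed (u, v) projections of known 3D points (checkerboard corners, say), solve for fx, fy, cx, cy, k1, k2.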

This 2-hour workshop, led by Elliot Woods with additional contributions from Kyle McDonald, will present the mathematics of calibration in simple terms, and provide a number of practical open-source tools for calibrating projectors and depth cameras like the Microsoft Kinect.

Recommended levels: Intermediate & Advanced. Comfort with algebra and trigonometry is helpful.
Recommended hardware/software: None required.


3B: Using the Kinect with OpenFrameworks

The Microsoft Kinect sensor — the first consumer depth camera — has radically altered the landscape of possibilities for the use of machine vision in interactive art and computational design. This workshop — presented by Zachary Lieberman with the assistance of Dan Wilcox — introduces libraries and techniques for Kinect programming in OpenFrameworks, a popular arts-engineering toolkit for creative coding in C++. You’ll learn how to access the depth buffer and export a 3D point cloud using libfreenect via ofxKinect; how to obtain the skeleton approximation of a person using OpenNI; and some helpful computational techniques for working with these data.
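The depth-buffer-to-point-cloud step is, at heart, a per-pixel back-projection through the depth camera’s intrinsics. A minimal Python sketch of the idea follows (ofxKinect offers comparable lookups in C++); the focal length and principal point in the usage comment are rough placeholder values, not calibrated Kinect intrinsics.

```python
def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth image (in meters) to a list of 3D points.

    depth: 2D sequence, depth[v][u] in meters, 0 meaning "no reading".
    fx, fy, cx, cy: pinhole intrinsics of the depth camera.
    """
    points = []
    for v, row in enumerate(depth):
        for u, z in enumerate(row):
            if z <= 0:
                continue  # skip pixels with no valid depth reading
            points.append(((u - cx) * z / fx, (v - cy) * z / fy, z))
    return points

# Hypothetical, uncalibrated example intrinsics for a 640x480 depth image:
# cloud = depth_to_point_cloud(depth_image, fx=580.0, fy=580.0,
#                              cx=320.0, cy=240.0)
```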

OpenFrameworks (OF) is a free, open-source C++ library designed to assist the creative process by providing a simple and intuitive framework for experimentation. It provides tools for accessing and playing with a diverse range of open-source libraries, such as OpenGL, OpenCV, and many others. Its aim is to lower the entry barrier for software development, so that anyone – especially artists, designers, tinkerers, and young people – can create high-quality software projects.

Recommended levels: Intermediate and Advanced. Participants should have some experience developing in OpenFrameworks and/or C++.
Recommended hardware/software: Bring a Mac laptop running OS X 10.6.8 or 10.7, with Xcode 3.x or 4.x installed. A limited number of Kinect sensors will be available to borrow, but please bring your own if possible.


3A: 3D Scanning with Structured Light Projectors

“Structured light” 3D scanning techniques begin with video imagery of special image patterns (like stripes and checkers) projected onto a subject. When observed by a camera and processed by software, these flickery patterns permit the creation of extremely high-resolution 3D models. With less noise and higher spatial resolution than Kinect scans, structured light is (under certain circumstances) an important low-budget alternative to depth cameras like the Kinect.
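A common choice for those stripe patterns is a binary Gray code: each projector column is identified by the sequence of black/white values a camera pixel sees across the projected frames, and adjacent columns differ in exactly one bit, which confines decoding errors to stripe boundaries. Here is a minimal Python sketch of the encoding side, as an illustration of the principle rather than the workshop’s software.

```python
def to_gray(n):
    """Binary column index -> Gray code (adjacent codes differ by one bit)."""
    return n ^ (n >> 1)

def from_gray(g):
    """Gray code -> binary column index, folding bits down from the top."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

def column_patterns(width, bits):
    """One 0/1 stripe pattern per projected frame: patterns[i][x] is
    bit i (most significant first) of the Gray code of column x."""
    return [[(to_gray(x) >> (bits - 1 - i)) & 1 for x in range(width)]
            for i in range(bits)]
```

Decoding runs the other way: threshold each camera frame, read off the per-pixel bit sequence, and map it back through from_gray to recover which projector column lit that pixel.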

In this 3-hour workshop, Kyle McDonald explains and demonstrates the basics of creating 3D scans with structured light. Open-source tools for structured light 3D scanning, such as Kyle’s DIY 3D Scanner, will be presented, discussed, and distributed. The relative merits of low-budget 3D scanning techniques will also be compared.

Recommended levels: Any; Intermediate through Advanced participants will benefit most.
Recommended hardware/software: Bring a laptop with some form of webcam, and


Breakfast

You can sleep on the plane ride home. Join your fellow Art && Coders for coffee, carbohydrates and camaraderie before the morning workshops.


Maréorama Resurrected [Audiovisual Performance]

The perfect finish to a long day. Experience a century-old immersive virtual reality — a moving panorama, the new media performance of its time — in this highly unusual “illustrated lecture” by Erkki Huhtamo.

[Video: “Maréorama Resurrected: An Illustrated Lecture by Erkki Huhtamo,” from the STUDIO for Creative Inquiry on Vimeo]

Performed throughout the 1800s, moving panoramas were among the most popular entertainments of the 19th century. In this poetic lecture-demonstration, scholar and media archaeologist Erkki Huhtamo draws on his research into moving panoramas and dioramas to discuss the historical apparatuses that laid the groundwork for 20th- and 21st-century immersive applications—including those created now by game designers and media artists. The particular focus of this presentation will be the Maréorama, a huge multi-sensory spectacle created by Hugo d’Alesi and his team for the Universal Exposition of 1900 in Paris. Drawing on high-resolution scans and the original piano music composed for the Maréorama by Henri Kowalski, Huhtamo reconstructs several sequences from this simulated sea voyage on the Mediterranean. The performance features live piano accompaniment by Stephen L. I. Murphy.

