Last night, I donned a Microsoft HoloLens for the second time. It was incredible. I could see objects made of light appear in the real world, and this time I could freely walk around them without a tether. I even built my own holographic app. It felt so easy.

Now let me tell you why I'm still a bit skeptical.

In January, the first time I tried HoloLens, it was a big, bulky contraption with exposed circuitry everywhere, plus a separate processor unit you literally had to hang from your neck. It was tethered to the ceiling. Just about the prototype-iest prototype ever. Oh, and you had to walk into some very specific, very small rooms in the basement of a Microsoft building to see it in action, which raised a few questions about whether the demos were staged.
Well, all of that has definitely changed. HoloLens is now a slick, futuristic headset that doesn't require a cord. It actually looks kind of like a consumer device, with a microUSB port and a headset jack instead of cable soup. It already lasts up to four hours on a charge. At the Intercontinental Hotel in San Francisco, they just handed me one and let me put it on my head by myself. I walked around a giant room. I think it's safe to say this demo wasn't staged.
But what you might not understand, at least not without trying HoloLens yourself, is how little the experience resembles Microsoft's demo videos.
Here, try this for me real quick. Pick up your smartphone. Hold it about a foot in front of your face. Now imagine that the phone is a window into a parallel world. Through that window, you can see holograms that appear to exist in the real world, but all around that phone, you're only seeing the real world. In other words, you have to be looking directly at a digital object to see it, because HoloLens currently has a ridiculously tiny field of view. As soon as you turn your head a little bit, the holograms disappear.
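The phone-as-window comparison can be made concrete with a little trigonometry. A minimal sketch, assuming a ~7 cm-wide phone held about 30 cm from the eye, and treating the widely reported figure of roughly 30 degrees for the early HoloLens horizontal field of view as an assumption rather than an official spec:

```python
import math

def angular_width_deg(object_width_m: float, distance_m: float) -> float:
    """Horizontal angle an object subtends at the eye, in degrees."""
    return math.degrees(2 * math.atan((object_width_m / 2) / distance_m))

# A typical smartphone (~7 cm wide) held a foot (~30 cm) from your face:
phone_angle = angular_width_deg(0.07, 0.30)

# A person (~0.5 m across the shoulders) standing 1 m away:
person_angle = angular_width_deg(0.5, 1.0)

print(f"phone window: {phone_angle:.1f} deg, person at 1 m: {person_angle:.1f} deg")
```

The phone subtends about 13 degrees and the person about 28, which is why a life-size hologram a meter away barely fits in a field of view of that order: you see a neck and a shoulder, not a body.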

And if those holograms are large enough, you'll only be able to see a little piece of them, too. See this human body? She can probably only see his neck, shoulder, and jaw from the distance she's standing:
And he can't actually appreciate this big screen on the wall:
Unlike the Oculus Rift and other virtual reality headsets, it's just not an immersive experience. It's a bit of a bummer.

Does that make the technology any less exciting? Perhaps a bit. It's certainly worrying that the field of view hasn't improved. (I also found the prototype pretty uncomfortable to wear, even though I really like the design of the folding, stretching band.) But it's still so amazing to me that this works at all, that a portable device can convincingly place CG objects into the real world. And the current HoloLens does feel good enough for a developer kit: one that gives game developers and app developers a glimpse of what they're building toward.
And to my complete and utter surprise, building an app for HoloLens (the first app I've ever built, mind you) was remarkably exciting.


Okay, okay, so I didn't actually write any code. Microsoft just sat us down with the Unity game engine, Visual Studio, and a whole bunch of premade 3D objects and scripts. All I had to do was check some boxes, drag and drop some objects, hit a few keys to compile, and give it a try on the headset itself.

But it wasn't like Microsoft hid any complexity, either. I could look through every script in Unity to see exactly how it worked, and how few lines of code holographic apps will require. To turn a normal Unity game into a HoloLens game, for instance, all you've got to do is add an object called a holographic camera. You can add new voice commands with a single line of code: I decided that "pew pew pew" would cause a crunched-up virtual paper ball to drop onto a virtual pad of graphing paper, and "by the beard of Zeus" would return it to the sky.
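The one-line-per-command idea boils down to binding a spoken phrase to a callback. The sketch below illustrates that pattern in Python; it is not Unity's actual speech API, and the recognizer and scene state here are stand-ins invented for illustration:

```python
# A minimal sketch of the phrase-to-action pattern described above.
# This is NOT Unity's speech API; the recognizer here is a hypothetical
# stand-in that matches exact phrases against registered handlers.

class KeywordRecognizer:
    def __init__(self):
        self._handlers = {}

    def register(self, phrase, handler):
        """Bind a spoken phrase to a callback -- the 'single line' per command."""
        self._handlers[phrase.lower()] = handler

    def hear(self, phrase):
        """Simulate the speech engine reporting a recognized phrase."""
        handler = self._handlers.get(phrase.lower())
        if handler:
            handler()

# Hypothetical scene state standing in for the paper-ball hologram.
ball = {"position": "sky"}

recognizer = KeywordRecognizer()
recognizer.register("pew pew pew", lambda: ball.update(position="pad"))
recognizer.register("by the beard of Zeus", lambda: ball.update(position="sky"))

recognizer.hear("pew pew pew")
print(ball["position"])  # pad
```

Each new command really is one `register(...)` line; all the speech-recognition machinery lives behind the interface.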

hologram tech

Sony has made it a point to come to SXSW, the annual Austin-based tech and culture meet-up, every year with a warehouse full of weird gadgets, demos, games, and other interactive experiences. This year was no different, as Sony opened the doors yesterday on the Wow Factory, its name for the wide-ranging exhibit that blends art and technology born from its experimental, Japan-based Future Lab program. The experiences in the Wow Factory tend to center on Sony's display tech, specifically its advances in projectors, which ultimately seem to have manifested as a pricey consumer product called the Xperia Touch.


But Sony hasn't stopped pushing the limits of the tech. The core premise is that by combining smart sensors that perform depth detection and motion tracking with a high-quality light source, you can create the closest thing we have today to interactive holograms. The projectors create objects out of light that typically exist on a flat plane, either in front of the projector or below it on a tabletop. You can interact with these virtual objects using your hands because the projector's software is able to recognize and track your movements. Effectively, Sony has figured out a way to make augmented reality without requiring you to wear bulky goggles or goofy smart glasses.
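The interaction loop the article describes, where a sensor reports hand positions and software decides which projected object you touched, amounts to a hit test in table coordinates. A minimal sketch of that idea; the coordinates, names, and touch radii below are hypothetical illustrations, not Sony's actual system:

```python
# A sketch of how projector-based AR can make flat light "touchable":
# a depth/motion sensor reports fingertip positions in table coordinates,
# and the software hit-tests them against projected objects. All numbers
# and names here are hypothetical, invented for illustration.

from dataclasses import dataclass

@dataclass
class ProjectedObject:
    name: str
    x: float       # table coordinates, metres
    y: float
    radius: float  # touch radius around the object's centre

def hit_test(objects, finger_x, finger_y):
    """Return the first projected object the fingertip falls on, if any."""
    for obj in objects:
        if (finger_x - obj.x) ** 2 + (finger_y - obj.y) ** 2 <= obj.radius ** 2:
            return obj
    return None

# Two projected instruments, like the miniature piano and saxophone demo.
scene = [ProjectedObject("piano", 0.20, 0.10, 0.05),
         ProjectedObject("saxophone", 0.40, 0.30, 0.05)]

touched = hit_test(scene, 0.21, 0.11)
print(touched.name if touched else "nothing")  # piano
```

Real systems add depth thresholds (is the finger actually on the surface?) and temporal smoothing, but the core mapping from sensed position to projected object is this simple.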


Going one step further, Sony has designed custom demos that make use of real-world objects. Sony returned to Austin this year with a collaborative music game that combines four of its projectors into a single cohesive system. With small 3D models of instruments, including a miniature saxophone and a piano, users can work together to play a series of songs by directing spotlights to each instrument. The small 3D-printed models are recognized by the software and come to life under the projectors' light, while other sensors track your finger motions as you move the spotlight around the table.

The demo is not at all practical, because it requires custom software and custom props. And no consumer would ever spend many thousands of dollars to outfit a table with four of Sony's prototype projectors just to pull off silly games and tech proofs-of-concept like this. But it is a genuinely impressive demonstration, as each object placed under the projectors' light and within range of the system's sensors is brought to life in a way that looks and feels like the closest manifestation of software in the real world.

It's also a great example of an alternative approach to AR. Something like this is both more accessible and can be experienced collectively, without requiring everybody to wear a pair of smart glasses or a VR-style helmet, or even to carry a compatible smartphone with the requisite software. Sony's approach here is akin to a hologram: it exists physically as light in a 3D space that everyone can see and interact with.

Sony has been working on this tech for a while. It's mostly a marketing stunt to showcase its experimental hardware, but over the years, we've seen the full breadth of what this tech allows. We saw Lewis Carroll's Alice's Adventures in Wonderland jump off the page and interact with physical objects like a teacup and a deck of cards in 2016, and last year, Sony built an architectural demo to show the enterprise use cases of its projector tech, as a standard block of wood was transformed into a top-down scale model of a home.

This year, Sony engineers took the lessons the company picked up with the Xperia Touch and its prior demos to develop a three-person virtual hockey game. The custom circular table pairs a standard projector with Sony's new IMX382 image sensors to track the puck and paddles, while the projector creates a virtual interface that reacts to your physical movements.


We don't know whether this tech will ever turn into a viable mainstream consumer product: the mini-projector Sony sells now that is capable of running these AR-style hologram demos costs about $1,700. And without a true reason to own one or develop applications for it, it'll never take off in the way AR apps on iOS and Android can, thanks to software frameworks like Apple's ARKit and Google's ARCore. But if Sony does find a way to commercialize this tech, it could pave the way for a unique and novel way to create immersive, collaborative AR experiences that can be deployed using everyday objects and on something as ordinary as a kitchen table. That's exciting, if the tech ever leaves the quirky demo phase it currently occupies here in Austin.

first X-ray holographic images of viruses

Holography, like photography, is a way to record the world around us. Both use light to make recordings, but instead of two-dimensional photos, holograms reproduce three-dimensional shapes. The shape is inferred from the patterns that form after light ricochets off an object and interferes with another light wave that serves as a reference.
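The interference principle described above can be shown numerically. The toy model below is a deliberate simplification, assuming one dimension, a plane reference wave, a single point scatterer, and ignoring amplitude falloff; the specific wavelength and distances are illustrative choices, not values from the study:

```python
# A toy numerical illustration of the holographic principle: light
# scattered from an object interferes with a reference wave, and the
# recorded intensity pattern (the hologram) encodes the object's shape
# and position. Simplified to 1D with a single point scatterer.

import cmath, math

wavelength = 500e-9                 # green light, metres (illustrative)
k = 2 * math.pi / wavelength        # wavenumber
source_x = 2e-6                     # point scatterer 2 micrometres off-axis
screen_z = 0.01                     # detector plane 1 cm away

def intensity(x):
    """Recorded intensity at detector position x: |reference + object|^2."""
    reference = 1.0                                  # plane reference wave
    r = math.hypot(x - source_x, screen_z)           # path from scatterer
    obj = 0.5 * cmath.exp(1j * k * r)                # object wave (falloff ignored)
    return abs(reference + obj) ** 2

# Sample the fringe pattern across the detector.
samples = [intensity(i * 2e-6) for i in range(-100, 101)]
print(f"fringe contrast: max={max(samples):.2f}, min={min(samples):.2f}")
```

The intensity oscillates between bright and dark fringes; the fringe spacing depends on where the scatterer sits, which is exactly the positional information a reconstruction algorithm later recovers from the hologram.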


When created with X-ray light, holography can be an extremely useful method for capturing high-resolution images of a nanoscale object: something so small that its size is measured in nanometers, or billionths of a meter.

So far, X-ray holography has been restricted to objects that form crystals or relied on careful positioning of the sample on a surface. However, many nano-sized particles are non-crystalline, short-lived and very fragile. They may also suffer changes or damage during an experiment when positioned on a surface. Aerosols, exotic states of matter, and the smallest forms of life often fall into these categories and therefore are difficult to study with conventional imaging methods.

In a recent study featured on the March 2018 cover of Nature Photonics, researchers developed a new holographic method called in-flight holography. With this method, they were able to demonstrate the first X-ray holograms of nano-sized viruses that were not attached to any surface.

The patterns needed to create the images were taken at the Linac Coherent Light Source (LCLS), the X-ray free-electron laser at the Department of Energy’s SLAC National Accelerator Laboratory. Nanoviruses have been studied at LCLS without a holographic reference, but the interpretation of the X-ray images required many steps, relied on human input and was a computationally challenging task.