Yes, a Canon EF mount for your iPhone 4. Because, you know, sometimes you are just walking and see something cool and think, "Man, if only I could use this 70-200 zoom lens that I always keep in my back pocket with my iPhone, I could totally make that shot."
Update: This looks like the real deal. A whole lot of work has been going on in an area called "light fields". The CEO of Lytro is a man named Ren Ng, whose doctoral dissertation at Stanford was titled "Digital Light Field Photography". This is complex stuff, and it's a little mind-bending, but here are the basics: The camera is indeed a tight-aperture camera, so that if the light were hitting a normal digital sensor, the entire frame would be in focus. But it's not a normal sensor. It is a very special sensor that is able, at each and every pixel, to capture not only the color and intensity of the light, but the direction of every ray hitting that pixel.
Think about it like this: Imagine that every pixel of your camera were an eyeball that is able to rotate itself in every direction, recording every individual ray of light hitting it, rather than just the combined color and brightness of all of the rays put together. Instead of using an eyeball, each pixel has a micro lens in front of it that captures all directions at once and stores the information. The way they do this is clever, and maddeningly simple: You divide up the pixels of your sensor into blocks. You place a faceted lens over each block of pixels that allows light to hit each pixel in the block only from the specific direction that facet is facing. So each pixel in your light field camera is really a bunch of pixels that capture information from different angles, combined into a single pixel in your image.
So now, instead of just having X,Y resolution, you have XYAB resolution. XY determines the position in the field of view as you see it. AB determines the direction of the light hitting each pixel in your field of view.
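As a rough sketch of what that XYAB layout means in practice (the block size and sensor dimensions here are hypothetical, not Lytro's actual numbers), you can think of decoding the raw sensor as reshaping a flat 2D grid into a 4D array:

```python
import numpy as np

# Hypothetical sensor: 600x600 pixels, covered by 100x100 microlenses,
# each microlens sitting over a 6x6 block of pixels. Each pixel within
# a block sees light arriving from one direction.
raw = np.zeros((600, 600))  # stand-in for raw sensor data

BLOCK = 6  # pixels per microlens block (assumed, for illustration)
ny, nx = raw.shape[0] // BLOCK, raw.shape[1] // BLOCK  # 100x100 microlenses

# Reshape into the 4D light field L[y, x, b, a]:
#   (y, x) = which microlens, i.e. position in the field of view
#   (b, a) = which pixel under that microlens, i.e. ray direction
lf = raw.reshape(ny, BLOCK, nx, BLOCK).transpose(0, 2, 1, 3)
print(lf.shape)  # (100, 100, 6, 6)
```

The point of the transpose is just bookkeeping: it groups the two spatial axes together and the two angular axes together, which is the XYAB structure described above.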
Yes, that means that you can think of the sensor as a 4D sensor. The information is stored in a 4-dimensional mathematical space. Now, of course, once you store it that way, it is impossible to show directly to the human eye. It has to be converted into a regular 2D image for display on a computer screen. But mathematicians have known how to do that for way longer than there have been computer screens. It's a neat trick called a "Fourier Transform".
When you put all this together, you can literally pick which 2D slice of that image space you want to view, extract it, do some fancy mathematical transformations to convert it back to a viewable image, and ta-da, you have an image which is focused at the plane of your choice.
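To make the slicing idea concrete, here is a toy sketch of the spatial-domain equivalent (often called "shift-and-sum" refocusing): shift each directional sub-image in proportion to its angle, then average. This is an illustration of the principle, not Lytro's actual algorithm, and all the numbers are made up:

```python
import numpy as np

def refocus(lf, alpha):
    """Shift-and-sum refocusing over a 4D light field lf[y, x, b, a].
    For each direction (b, a), shift that sub-aperture image in
    proportion to its angular offset, then average. The parameter
    alpha picks the focal plane. A sketch, not a production method."""
    ny, nx, nb, na = lf.shape
    out = np.zeros((ny, nx))
    for b in range(nb):
        for a in range(na):
            dy = int(round(alpha * (b - nb // 2)))
            dx = int(round(alpha * (a - na // 2)))
            out += np.roll(np.roll(lf[:, :, b, a], dy, axis=0), dx, axis=1)
    return out / (nb * na)

# Toy light field: 100x100 spatial resolution, 6x6 directions per pixel
lf = np.random.rand(100, 100, 6, 6)
img = refocus(lf, alpha=1.5)
print(img.shape)  # (100, 100)
```

Changing alpha re-runs the sum with different shifts, which is the "click to focus" behavior: the focal plane is chosen after the fact, in software, from the same captured data.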
But this also means that megapixels just became extremely important, because a single pixel on your sensor is now only able to gather light and color information arriving from a single direction. Many sensor pixels get spent on direction instead of position, so the first commercial versions of this technology will have a much lower effective megapixel count than you are used to seeing in a digital camera.
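The back-of-the-envelope math makes the trade-off obvious (these numbers are hypothetical, chosen only to illustrate the scaling):

```python
# Hypothetical: an 11-megapixel sensor divided into 10x10 microlens
# blocks leaves only 0.11 megapixels of spatial resolution, because
# each final image pixel spends 100 sensor pixels recording direction.
sensor_mp = 11.0
block = 10  # pixels per microlens side (assumed)
spatial_mp = sensor_mp / (block * block)
print(spatial_mp)  # 0.11
```

So the spatial resolution drops by the square of the block size, which is why the sample images look soft compared to a conventional camera of the same sensor size.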
I'm still not quite ready to believe this is real, but my skepticism is eroding. Evidently there is real science behind it, from research done at Stanford.
The short of it is: Lytro claims to have developed a camera that can capture all planes of an image simultaneously, and allow you to pick your focal point in software. In the example images I have seen, the closest areas seem much sharper than the farther areas, which actually lends it credibility, because that would be hard to fake.
Click to focus. Double-click to zoom.
Based upon the subject matter, there is no way these are multiple-exposure images taken at different focal depths, and then blended together. There's way too much motion in many of the images to pull that off.
The other way this could be faked would be to capture a single frame at a tight aperture, and then create separate images in Photoshop with false focus. However, if that were the case, you would expect the entire frame to be tack sharp, and it's not. The farther you get from the lens, the more definition you lose in these images. Also, there is a weird grid pattern that is visible up close. In the sample images, you can see it if you double-click to zoom.
So, if it's real, it will radically change photography as we know it. Not right away, if these images are any indication, but as the technology is iterated, it will no doubt improve.
This is the Queen's residence when she is in Edinburgh.
Oh, and "rood" is the old english word for "cross", so Holyrood = Holy Cross. An oddity of the heavier scottish dialects is that, for whatever reason, they still retain a number of old english words.
Chinese scientists at the University of Beijing claim to have genetically engineered cows to produce human breastmilk.
Something about this sounds like "lost in translation", as if there were a mangled news release out of China that news.com.au picked up and ran with.
Choice quote from the article:
"'It's good,' said worker Jiang Yao. 'It's better for you because it's genetically modified.'"
No, this is not satire, it's a real notice, on a public bulletin board. Therefore I did not bother to blur the names.
Pardon the sub-standard picture quality. All I had was my iPad at the time.
Just remember: Whatever you have to do to get them there, you have to keep doing to keep them there.