(Comments)

Original link: https://news.ycombinator.com/item?id=38881981

Technology like this raises concerns about privacy violations and potential abuse by various entities, particularly governments and law-enforcement agencies. As more advanced forms of surveillance emerge, it becomes increasingly important for individuals to understand their rights and the protections available against intrusive surveillance technology. In particular, there should be clear legal frameworks governing the installation and operation of such devices, with severe penalties for violators. Consumers must also become savvy at detecting and rejecting products and services with hidden surveillance capabilities. Given the growing trend toward surveillance and espionage, citizens must demand stricter scrutiny and accountability from these industries to ensure that these intrusive technologies remain subject to meaningful oversight mechanisms. Ultimately, personal privacy should not, and cannot, come at the expense of personal freedom.

Related articles

Original
Zeiss's "Holocam" turns glass windows into cameras (digitalcameraworld.com)
486 points by toss1 15 hours ago | 213 comments

For the people kinda worried: this is a highly specialized piece of glass that is extremely complicated to manufacture at present and must, due to the laws of thermodynamics, not be 100% transparent. It's not going to allow surveillance through existing glass installations in any form, just possibly new ones if there's room for the support equipment and through the use of 4-5 digit piles of cash.

Any camera glass like this will have at least a mild tint and will be used in specialty applications. It'll also have pretty horrible SNR, resolution, and low light performance.

Currently the structural component of this tech is mainly used in extremely high end aerospace applications (various heads up display type systems) so it's unlikely you'll ever run across one of these within the next decade.

Nasty remote sensing tech people can be worried about right now: RF surveillance from various combinations of mmWave, wall-penetrating radar, and wifi interferometry. Add in the fact that your iPhone has MAC randomization but every other device you own, including your car's TPMS, doesn't. Also Geiger-mode lidar is fun; one company I worked for mapped the inside of a random person's house with it as a demo.
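To make the tracking concern concrete: a TPMS sensor broadcasts a fixed ID, so two passive receivers in different places can trivially link sightings of the same vehicle, which is exactly what MAC randomization is meant to break. A minimal sketch with made-up IDs and times:

  # Correlating drive-by sightings via fixed broadcast IDs (hypothetical data).
  # A TPMS ID never changes, so a simple set intersection re-identifies the car;
  # a randomized MAC would make the two logs impossible to join this way.
  sightings_home   = [("3f9a12c7", "08:15"), ("77de0b21", "08:16")]
  sightings_office = [("3f9a12c7", "08:42"), ("c0ffee99", "08:45")]

  seen_at_home = {sensor_id for sensor_id, _ in sightings_home}
  linked = [(sensor_id, t) for sensor_id, t in sightings_office if sensor_id in seen_at_home]
  print(linked)  # [('3f9a12c7', '08:42')] -> same vehicle observed at both receivers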



>For the people kinda worried: this is a highly specialized piece of glass that is extremely complicated to manufacture at present and must, due to the laws of thermodynamics, not be 100% transparent. It's not going to allow surveillance through existing glass installations in any form

If you're worried that your Airbnb host is going to use it to spy on you (which was mentioned in the article): unless you already scrutinize every nail hole, photo frame, electrical appliance, electrical outlet, smoke detector, etc., this doesn't open up a new vulnerability.

Pinhole cameras with a lens as small as 2mm are already readily available and cheap; no one's going to use an expensive "window camera" to spy on you when they already have so many other options.

Perhaps those that fear government surveillance or other well funded adversaries may have cause for concern, but few of us are in that category.



>and must, due to the laws of thermodynamics, not be 100% transparent.

It's perhaps worth noting, however, that it wouldn't be unusual for a window to have only ~60% transmittance in the visible spectrum.



> due to the laws of thermodynamics, not be 100% transparent.

How useful is this statement, though? Regular glass isn't 100% transparent either, even in just the visible spectrum. Shouldn't we be more concerned with what the delta in the visible spectrum is, if we're concerned about easy identification? (Not to mention that plenty of glass is purposefully tinted, and dynamic tinting is an application here.) And reasonably, couldn't we, theoretically, pick up a decent signal simply by capturing the reflections around the glass edge? We can now do 3D reconstructions from pointing a camera at a mirrored ball. I'm sure it'd be very noisy, but there is signal. To have the capability of projecting, you'd presumably have the ability to do the reverse too, given that I doubt the internal structure of the glass is (that) directionally dependent. Right? I could be missing something; it's been a while since I've done optics.



I was just thinking you'd make ALL the glass the same tint, regardless of specialty.


Thanks. The original description made this seem like far-future technological magic. A system that can somehow analyze a random pane of glass and derive all the transformations needed to use it as a high-precision waveguide? I actually had a manager ask me to develop such a thing, and I asked him how many dozen optics PhDs I could hire to accomplish this feat.


Apparently the number is finite.


> Add in the fact that your IPhone has mac randomization but every other device you own including your car's TPMS doesn't.

The truly pathetic thing is that virtually all of these devices could use RPA (resolvable private addresses) but don't, because nobody remembered to flip that flag from "n" to "y".



I agree with everything you said, but the example on the CES floor does not have an apparent tint.


I assure you it is semi-opaque. A one-way mirror is a primitive example of this. You can make the mirror increasingly transparent, but it will be tinted right up until the point where it doesn't reflect anything.

It may not be noticeably visible to the human eye under most lighting conditions, though.



>> but the example on the CES floor does not have an __apparent__ tint.

> I assure you it is semi opaque

> It may not be __noticeably visible__ to the human eye under most lighting conditions, though.

I'm not sure you two are disagreeing. Seems like you're just reinforcing their point. I mean, if someone said glass is transparent and their evidence is that you can see through it, and your counterpoint is that it blocks UV light... well... you'd be technically correct, but you're talking about a different ballpark.



Typically people can see better in the dark than imaging sensors can (only considering visible spectrum here), especially small sensors. If you redirect the light somewhere, what would you see? How much light would you need to divert?

If the image is grayscale and has very high gain/noise/ISO, then I imagine with a low enough frame rate you could avoid noticeable tinting. You would likely still notice the tint in comparison to regular glass; I'm too lazy to do the napkin math, though.

Practically speaking, I would expect it to be strictly more noticeable than glass for holographic projection. If true, it's likely noticeable.

But the eye is fickle. You don't notice the tint of, say, a car window when you've been inside for a bit. Or sunglasses.

Here's an arbitrary Google search result for in-glass HUDs: https://www.lumineq.com/applications/automotive

They state their glass is 70% transparent. That's definitely noticeably opaque.
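To put a rough number on "how much light would you need to divert": here is a hedged back-of-envelope comparison (ignoring étendue and optics losses; every figure is an assumption for illustration) of a window's total light budget against what a phone-sized camera aperture collects anyway.

  # Back-of-envelope: fraction of a window's light a hidden sensor might need.
  # All numbers are rough assumptions for illustration only.
  daylight_lux      = 10_000      # bright daylight falling on the pane
  window_area_m2    = 1.0         # one square metre of glass
  window_lumens     = daylight_lux * window_area_m2       # ~10,000 lm through the pane

  phone_aperture_m2 = 10e-6       # ~10 mm^2 light-gathering area of a phone camera
  phone_lumens      = daylight_lux * phone_aperture_m2    # ~0.1 lm

  fraction = phone_lumens / window_lumens
  print(f"Diverting ~{fraction:.4%} of the pane's light matches a phone camera's intake")
  # -> about 0.001%, i.e. far below the 30-40% tint being debated above

The point is only that the photon budget of a small camera is tiny compared to a whole pane; how much of that light can actually be steered onto one sensor is a separate (harder) optics question.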



> in the dark

I don't think how you framed this is accurate. The quality is a function of the pixel size and, of course, the exposure time. Those are the two main variables that control how much light hits the sensor, but we'd have to get into sensitivity to make direct comparisons (human eyes are VERY sensitive, and there's some evidence that the average eye can respond to a single photon, but you're not going to see well with that, and this is beside the point).

Definitely not a straightforward calculation, and I'm feeling too lazy to go grab my optics books. But guesstimating, there should be enough light, considering you can see light and distortions when looking at the side of a pane. So I'd lean on it being more about the quality of the signal. Glass is highly structured, but I'm assuming this type of glass has some unique optical properties specifically so the edges receive a higher-quality signal. My understanding is that they are projecting from the side, so that's what I'm inferring from -- basically just reverse the photon direction through the medium, and I don't have reason to believe that quality is directionally dependent (should I?). My whole spitballing is dependent on this understanding.

> it's likely noticeable.

Well one user said they didn't notice it.

> They state their glass is 70% transparent. That's definitely noticeably opaque.

That's not uncommon for Low-E glass, so maybe not very noticeable, especially in many different environments. But yeah, I think if you compared side by side you'd notice. I think we're just using different criteria and we probably decently align?

Idk, I am mostly thinking aloud here, so I do want to convey that I don't have high confidence. It's been 10 years since I've been in an optics lab lol. But I've built microphones with really weak signals before and they were useful. Distorted, but useful. (Definitely not fun being forced to run experiments at 3am and finding out someone is walking around on the other side of the building...) So I can't see why this couldn't (theoretically) be done similarly for a window? I definitely don't think this would work with a typical pane of glass, especially considering how they're cut, but this does seem like specialty glass specifically built for directing photons between an edge and the pane face. Any idea if it can only display to one face? (I'm sure you can invert the image, but the projection face probably matters for a camera?)



Neither do most storefront windows and yet they're often made to reduce the transmission of UV light to protect displayed goods from sunlight or intentionally darkened so they're less transparent when the display is not lit up. You just don't notice it normally or dismiss it as an effect of ordinary glare, which is the point.


I agree that this shouldn't be anywhere near the top of people's privacy concern list. A $1 traditional digital camera can already be hidden very easily and this probably costs thousands of dollars at least if you could even get it.

It's still creepy though.



> and through the use of 4-5 digit piles of cash.

5 figures doesn't sound like much for an organized crime network



Where could someone surreptitiously place a wildly expensive pane of glass where they couldn't already put a $20 hidden camera?


I read it as the thickness of the stack of bills (thousands or more), so if they are 20s or 100s, that'd be some serious funding.


> It's not going to allow surveillance through existing glass installations in any form, just possibly new ones if there's room for the support equipment and through the use of 4-5 digit piles of cash.

Total surveillance is just a matter of time...



I agree. I had this worrying idea (realisation..?) that one day, maybe triple-digit years away, maybe sooner, tiny cameras and mics the size of grains of salt will be everywhere. They will cost nothing to produce, be self-charging, interconnected with each other and created to 'reduce crime' or 'make you safer'. And in the same way as forever chemicals, you can't get rid of them. Trillions of them, in fields, in the ground. Spreading around stuck to your shoes, on your car tires.

Just a crazy idea, but I think that if they could make that happen today then they would. And that part is the main point - There is no limit to surveillance anymore. I live in the UK, that realisation is in my face every day. Can’t even take a trip to Tesco without being run through facial recognition.

There is no care for the concept of privacy anymore. All the richest companies in the world don’t make their money from caring about our privacy.



I believe I read a science fiction book around this topic: Postsingular by Rudy Rucker, related to nanotechnology. Can't remember most of it, but it delved a little into how it affects relationships re: everything being visible always to everyone. Not sure I'd entirely recommend it, but it's still interesting to see the thoughts and outcomes others come up with in regard to these types of potential technological changes.


With a sufficient culture shift, I could imagine a world where the capability exists but is outlawed and shunned. Surveillance capitalism and blanket security surveillance feel possible to overthrow politically, and to limit somewhat through legislation.

That leaves targeted surveillance, emergency surveillance, and war-time surveillance. Those will probably not be limited, though they are inherently more limited in scope.



Politicians and high-ranking unelected officials will also want privacy, so truly omnipresent surveillance would be concomitant with jamming and obfuscation technologies. There'd just be an endless arms race.


You can't know how much it improves in a year, two, five.


So what you're saying is: this early prototype might be hard to see if you're not close up?

I mean, this will only get worse as the tech advances, no?



The article is poorly written, as it only discusses the camera component. Strangely, they chose stock images of holographic and optical displays, but didn’t mention that even once.

The Zeiss site is a much better read:

https://www.zeiss.com/corporate/en/about-zeiss/present/newsr...

In summary:

1) ZEISS unveiled holographic Smart Glass at CES 2024, covering displays/projection/filtering, plus another component: a holographic camera

2) The holocam works by utilizing coupling, decoupling, and light guiding elements to redirect incident light to a concealed sensor, eliminating the need for visible cutouts or installation spaces in visible areas.

3) ZEISS doesn't plan to be the manufacturer, so other companies can license and use the tech



> ...and light guiding elements to redirect incident light to a concealed sensor,

So you can't just stick it on any existing window or pane of glass? That's good.



I don't think so. From the article:

> The Holocam technology "uses holographic in-coupling, light guiding and de-coupling elements to redirect the incoming light of a transparent medium to a hidden image sensor."

That suggests, at least to me, that you'll need something more than just a simple sheet of glass. There's probably some engineering required to allow light to be guided and redirected towards what sounds like a typical camera sensor.
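For intuition on what "in-coupling", "light guiding" and "de-coupling" usually mean in this kind of waveguide optics (a generic sketch, not Zeiss's disclosed design): a thin holographic grating on or in the pane diffracts incoming light steeply enough that total internal reflection traps it, the pane itself then guides it sideways, and a matching grating near the edge couples it back out onto an ordinary sensor hidden in the frame. In rough grating-equation terms, with grating period \Lambda and glass index n_g:

  n_g \sin\theta_g \;=\; \sin\theta_i + \frac{m\,\lambda}{\Lambda}
  \qquad \text{(in-coupling: diffract the incoming ray to angle } \theta_g\text{)}

  \theta_g \;>\; \theta_c \;=\; \arcsin\!\frac{1}{n_g} \;\approx\; 42^\circ
  \quad (n_g \approx 1.5)
  \qquad \text{(total internal reflection: the ray is guided to the edge)}

The de-coupler is essentially the same kind of grating run in reverse, which is why the sensor can sit hidden in the frame with no visible cutout.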



Yes, I interpret it as guiding light inside the glass such that the sensor sits on the lateral edge of the pane, embedded in the frame, perpendicular to the target image.

Sounds like they embedded a light splitter with some sort of periscope-style lens inside a glass pane.

So not actually bolted on at an oblique angle to the glass, as GP suggested.



in-coupling and de-coupling of light, anyone care to explain?


Thanks for linking to that.

> coupling, decoupling and light guiding elements to divert incident light to a concealed sensor

So, there's a camera in the dash looking up at the windshield and focusing where it expects to see a face, thereby using the windshield as a reflector? And maybe there's some additional etching and deposited films in the windshield to support the angles required?

And perhaps you can put cameras elsewhere, and similarly subtly modify the windshield or other glass to look at other things as well?



It collects through the edge, and the glass can be (apparently) flat: https://www.youtube.com/watch?v=NORPeCcIXRQ


Edit: there's an actually good demo video showing the real state of the tech rather than mockups: https://www.youtube.com/watch?v=NORPeCcIXRQ It was buried below in the comments, so I'm just surfacing it higher. Everything else is artists lying to you.

---

> Glass surfaces can also generate energy. The microoptical layer in the window pane absorbs incident sunlight and transmits it in concentrated form to a solar cell. This combines the advantages of conventional windows – natural light and an unrestricted view – with the additional benefit of efficient energy production.

what? holy shit?



That video is pretty cool. Truly an invisible camera.


Nah, this will only be good enough for sensor-level power (like 5W from a whole window). Only useful in very limited circumstances. It's not going to replace normal solar power.


But like, that's enough for the camera to power itself. Pretty cool, no? We can deploy this tech without plugging it in.


>deploy this tech without plugging it in

You mean because it can also power a wireless transmitter or a large storage array? Cuz a hologram that's not "plugged into something" might turn out to not be that useful.



Since you can run a Pi on that level of power, it seems like enough?


I guess it might not be enough if the day is not sunny enough.


5W is nothing to sneeze at.


5W is enough to power an ESP32
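Rough numbers for this sub-thread, assuming the ~5 W full-sun figure quoted above and ballpark datasheet-level draws (all of these are assumptions, not measurements):

  # Would ~5 W harvested from a sunlit pane cover typical small loads?
  # All draw figures below are rough assumptions for illustration.
  harvested_w = 5.0

  loads_w = {
      "ESP32 with Wi-Fi active":       0.8,   # ~240 mA at 3.3 V
      "Raspberry Pi Zero, light load": 0.7,
      "Raspberry Pi 4, loaded":        6.0,
      "small image sensor + encoder":  0.5,
  }

  for name, draw in loads_w.items():
      verdict = "fits" if draw <= harvested_w else "too much"
      print(f"{name}: {draw:.1f} W -> {verdict} (margin {harvested_w - draw:+.1f} W)")
  # Caveat: 5 W is a best-case daytime figure; overcast days and nights need a battery.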


>ESA and NASA space missions have carried this trailblazing ZEISS technology on board for many years. It is also well established in the semiconductor and medical technology sectors.

Huh. Seems like it should be fairly easy to find info on then... though some googling around makes me think they might just be referring to their more general diffraction gratings and whatnot.





This looks really cool.

Could you put a polarizer on this and have that filter out the other side of the display, so a smart window is one-way visible? That could make smart glasses and headsets much better.



I doubt you'd even need to. The fact that it's not a uniform grey blob means they're already controlling how the light leaves the device. I suspect it's already directional.


First, holy crap!

Second, #3 is always a red flag for me. It is sometimes code for "we can do this in the lab but we have no idea how one would manufacture it." A similar analogy is "we've got this great idea for a program you can license but no one here knows how to actually code it up."

Third, the impact of this going mainstream would be hard to overstate. All those people working on transparent displays like the ones in sci-fi movies? Yup, they could do that. A video conference system with solid eye contact (mentioned in a couple of places)? Sure, you could do that too. A mirror that could show you wearing different clothes? Yup, I could see how that would be coded.

That #3, though. That is what tempers my enthusiasm. Did I miss any announcement that they had a demo at CES (or was it just an announcement)? If the former, I would seriously consider flying over to Vegas to check this out.



Here is a demo video from someone who was there: https://www.youtube.com/watch?v=NORPeCcIXRQ (from another hn comment)


> Second, #3 is always a red flag for me.

Does Zeiss currently manufacture any public-facing camera? All their general public-facing projects are collaborations with Sony or Vivo or other makers, and I think they only manufacture lenses, so nothing with a high-quality chip in it.

It's probably another story on the medical side, but this doesn't fit a highly specialized, professional-only niche with a PC doing the processing on the side.



If it was anyone but Zeiss I'd call it a red flag too, but they're basically Old Gods of Glass.


I don't disagree. One way to know whether it can live up to the hype is if it becomes the flagship feature of the next iPhone, which would have no notch for the front-facing camera, do "touch anywhere" Touch ID, and process all of the screen's operating gestures visually rather than capacitively, so that you could operate your phone while wearing ski gloves if you wanted to.


It's a reverse light guide. We've been beamforming for a long time, so it's unsurprising that the reverse is possible (imaging through a light pipe).

The principal issue will be gathering enough energy. A well lit source like a bathroom mirror (mirror behind the light guide) could work pretty well I'd wager. If the light guide is too efficient then it will appear opaque, so there is a trade-off.
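That trade-off can be written down directly (a toy model, not Zeiss's numbers): if the guide diverts a fraction \eta of the visible light hitting the pane, the apparent transmittance is roughly

  T_{\text{apparent}} \;\approx\; T_{\text{glass}}\,(1 - \eta)

so the brighter you want the hidden image, the more visible the tint, which is exactly the efficiency/opacity tension described above.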

I find "turns any window" pretty misleading. Unless I'm missing something this needs very special glass or at the very least a special coating/laminate.

For folks worried about privacy, it will almost always be more convenient and cost-effective to install a tiny spy camera somewhere.

Zeiss isn't going to aspire to sell cheap glass on razor thin margins.



> Zeiss isn't going to aspire to sell cheap glass on razor thin margins.

Especially true since Zeiss isn't planning on selling these at all. They're selling licenses to the tech.

I doubt that undermines your privacy argument in any meaningful way, however. Even if the license were free, the cost of producing the components is certainly orders of magnitude higher than the cost of current tech that accomplishes similar goals.



As I posted below, Microsoft's Applied Sciences Group did something similar (I guess) back in 2011:

https://www.microsoft.com/applied-sciences/uploads/publicati...

https://www.microsoft.com/applied-sciences/projects/the-wedg...



I remember a friend on a team adjacent to the Surface table team talking about them trying to do this when they switched from projection tables to tables with screens in them. As I recall, with the projector-based tables (the ones that looked like old cocktail arcade tables) they were using an IR camera to detect items and touches on the 'screen'.


Yes, the original Surface tabletop used IR cameras. Its successor, the Samsung SUR40 had in-cell sensing, i.e. photosensors embedded into the LCD so that it could capture a 960x540 image of the table surface.


> It's a reverse light guide. We've been beam forming for a long time, it's unsurprising that the reverse is possible (imaging through a light pipe).

Beamforming at RF frequencies is old; beamforming at optical wavelengths is pretty new shit. Doing the reverse, that is, receiving a signal, is brand-spanking-new, fresh-out-the-cow hot shit at any frequency.



No, it's not really new. Small form factor is relatively new but even then the issue lies in yield and quality control.

Projector light engines potentially include many beam-forming condensing and projection lenses, precisely to concentrate light into a uniform quadrilateral. That's not new. The industry continues to advance, though.

Even holographic projection isn't new. This is just that, except the light is (sort of) taking the reverse path.

Technically there is no reason why it can't also project light outwards simultaneously. However light guides aren't really reversible like that: the light usually exits through a small fraction of the guide's external surface area. Reversing that means the entire external surface area is potentially collecting light, which would result in some undesirable caustics in all scenarios I can imagine. Light engines are in part designed to account for this (by reflecting or sinking light into an absorber) but this is still pretty different from what Zeiss is promoting.

For a small permanent installation where you are in control of the lighting I could see this working relatively well, but I have a hard time imagining you could get close anything resembling photo quality without a lot of environmental treatment. Conversely holographic projection is pretty doable on a mobile platform like a headset.

This is pretty new only in that it's probably at least an order of magnitude more difficult to accomplish and thus hasn't been viable up until now. Fundamentally none of the concepts are new, as far as I can tell.

As an example, look at your nearest window and imagine it divided into a grid of uniform pixels. Each pixel is a small mirror that reflects a point of focus (wherever you are) to another, smaller point on an imaging sensor somewhere in the frame. This would look pretty jarring (and jagged) to most people, until you layer a complementary piece of glass on top of it to make the exterior flush. The two pieces of glass would be high enough precision that once you put them together they are effectively one piece of glass. Ta da.

This is effectively the same process as making any other multi-lensed glass, and that's Zeiss's wheelhouse.
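Here is a toy version of that "grid of tiny mirrors" thought experiment, just to show the geometry involved (illustrative only, not Zeiss's actual construction): each cell of the pane gets a micro-mirror oriented so that a ray arriving from one fixed focus point is redirected sideways toward a sensor in the frame.

  # Per-cell mirror orientation for the "grid of tiny mirrors" idea.
  # Law of reflection: the required mirror normal is the (normalized)
  # half-vector between the incoming and desired outgoing ray directions.
  import numpy as np

  focus    = np.array([0.5, 0.5, 2.0])   # viewer ~2 m in front of the pane centre
  to_frame = np.array([1.0, 0.0, 0.0])   # desired exit direction: along the pane, toward the frame

  def mirror_normal(x, y):
      cell = np.array([x, y, 0.0])              # cell position on the pane (z = 0 plane)
      incoming = cell - focus
      incoming /= np.linalg.norm(incoming)      # unit propagation direction of the viewer's ray
      half = to_frame - incoming                # half-vector between in and out directions
      return half / np.linalg.norm(half)

  for x in np.linspace(0.1, 0.9, 3):
      for y in np.linspace(0.1, 0.9, 3):
          print(f"cell ({x:.1f}, {y:.1f}) -> mirror normal {np.round(mirror_normal(x, y), 3)}")

The only point of the example is that every cell needs its own, slightly different orientation, which is why the comment above frames this as a precision-manufacturing problem rather than a conceptual one.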



> Zeiss isn't going to aspire to sell cheap glass on razor thin margins.

why not?



I assume if you’re on HN you work in tech.

As a side hustle, why don’t you start building and selling desktop PC’s to local businesses, competing with HP, Dell, etc on margin? That’s also tech, right?

While building cheap PCs at razor thin margins is adjacent to whatever your tech job is, and probably something you’re able to do, it probably wouldn’t be the most profitable use of your time



It's not what they do, like they probably won't start selling lawnmowers either.


Whilst I kind of see where you and the sibling comment are coming from, Zeiss do make glass: they make precision glass. Making cheap glass would be quality diversification, which is not exactly unheard of.


It sounds reasonable, but if they diversified quality, it would mean making precision glass and then also somewhat less precise glass. They will not start mass producing large panes of low margin glass. It's not what they're good at and it's not a lucrative market. Like Intel isn't going to start making jellybean parts.

They could license their name to some existing glass company, but they still wouldn't really be the ones producing it, and I think Zeiss has avoided diluting their brand name like this so far.



I don't disagree with you, but the way the GP comment made a joke of the idea was unfair IMHO.


I doubt Zeiss makes their own glass. Most optics companies outsource that to one of the big glass fabs: Schott, Ohara, CDGM.


Zeiss and Schott are both owned by the Carl Zeiss Foundation; the origin story is somewhat covered in Material World by Ed Conway. It's a fantastic book and well worth a read.


WHAT!? This is my day job, but I'm embarrassed to admit I never realized that!

Thanks for the book rec, I'll probably start digging into it this weekend.



They don't have to. They're the best in the world, and they know it. They price accordingly.


because they're Zeiss. They made the lenses for the photos on the moon, they're not small potatoes.


Zeiss isn't in the business of selling cheap stuff on thin margins, no matter the product.


jiminy christmas!!!

I lit (pun) came to gripe about light pipes...

However, here is your solution:

3D print lenses in a header for threads to be arbitrarily placed in a [COMPOUND] lens and pull that feed with AI spatial mapping (yes, these are easy now)... EYE

And you can make these in many increments, micro even... "Hey NSA, super simple optical Prince Rupert drops on a composite eye" (self-destructing fiber lens when discovered).



Years ago I was able to visit Zeiss in Oberkochen. They had a fantastic headquarters with a few older lithography instruments. I think a couple of the instruments were 80 million dollars or so.

There's a facility across from the Autobahn that was so sensitive that trucks going by would throw off their machines. They had to put padding on the autobahn to prevent it. This was after they put the foundation on some kind of suspension. My coworker said they hire the most PhDs in Europe.



Feels like it would be easier to buy land not next to a highway :-)


None of the PhDs are in Facilities Management.


You need to attract PhDs and account for logistics. If you want to attract top talent you need to be in a desirable place, or be a desirable place with access to desirable places.


Sounds like LIGO, but they're in the US. They had to put the AC unit for their entire facility on suspension because their instruments were so sensitive. And they ask people to not accelerate or decelerate so quickly when driving around the campus.


> Given the current fear around hidden cameras in Airbnbs, the idea of every single window (or even shower door) in a rental property being able to spy on you is a little disconcerting.

While there are some really interesting potential applications for this tech, it is also more than a little disconcerting.

The ubiquity of camera phones and the emergence of tech like those Meta glasses is already pushing us to disconcerting (albeit interesting and in some cases very useful) places, but some of these cutting edge concepts worry me. WiFi seeing through walls also comes to mind…



There are already microscopic cameras that can be bought for a thousandth of the price.


I'm surprised this isn't the top comment. I'm all for the benefits of this tech, and hadn't even thought about the airbnb style implication.

People didn't like that Google Glass could always be filming; now there won't even be a visible camera.

Ray-Ban/Meta (I believe) have a sensor to detect that the wearer has not attempted to cover the light which shows that the camera is in use, but how will that work when every piece of glass is a camera?



After some searching I found a patent I think may be related to this https://patents.google.com/patent/WO2020225109A1/en because it uses the phrase "Holocam", is German and was filed by Audi (the press release mentions automotive applications as the primary initial use case). It's a translation from German which makes it a bit tougher to parse than the usual patent.

The total lack of any deeper information beyond the bold yet vague claims in the press release is frustrating. The PR makes it sound like a miraculous breakthrough destined to change everything. The source release on the Zeiss site only adds two bits of info.

> "The transparency of the holographic layer has only a minimal effect on the brilliance of the image reproduction. It is also possible to detect spectral components as additional information to complement the visible image. The resulting data provide insights into environmental contamination such as air pollution and UV exposure."

However, experience shows that in reality bold-yet-vague claims like this inevitably come with significant trade-offs and constraints which limit their applications (little things like cost, power, fidelity, size, speed, etc.). This is especially true in early implementations of new tech. That said, it may still be both interesting and useful. Unfortunately, we have no way to even think about how it might be useful, because Zeiss marketing has chosen to play 'hide the ball' instead of just releasing a technical explainer outlining the relevant trade-offs, limitations, etc.

If I was talking to someone from Zeiss my first questions would be about how much the additional components impact the optical characteristics of the glass, what the resolution of the resulting image data is, how large are the components needed at the edges and how far away can they be from the capture zone? Then, of course, how the output of the resulting imaging system maps into traditional camera/lens metrics like f-stops, aperture, imager size/density, gain, focal length, etc. Zeiss is an optics company after all.



Thanks for finding this.

The paper mentioned in the Description section can be read here: https://www.academia.edu/52566311/_title_Volume_phase_hologr...

My guess is that the items labeled 20 and 22 in the patent diagram are each gratings of this sort, perhaps with the fringes angled at 45 degrees.

Some additional info may be gleaned from the one patent citing this one, invented by one of the co-inventors of the latter: https://patents.google.com/patent/DE102019206354A1/en



I spent a few minutes looking for more info, doesn't seem that Zeiss has published anything but a press release yet: https://www.zeiss.com/corporate/en/about-zeiss/present/newsr...




This video is nothing but holographic displays. The article is about an invisible in-glass camera sensor.


Thanks for sharing. More of an animation than a proper video


I found a recording of it here (from IAA 2023): https://youtu.be/NORPeCcIXRQ


For sure, was disappointed with the original article for only linking generic keywords back to their own site. At least the CES one is from the horse's mouth.


A video of an animation is still a video.


Hence why I said "proper" video


It seems to be their "multifunctional smart glass"? https://www.zeiss.com/oem-solutions/products-solutions/multi...




> Glass surfaces can also generate energy. The microoptical layer in the window pane absorbs incident sunlight and transmits it in concentrated form to a solar cell. This combines the advantages of conventional windows – natural light and an unrestricted view – with the additional benefit of efficient energy production.

This is pretty cool.



Agree. Obviously a few orders of magnitude in cost reduction would be required... but this seems like it could have interesting potential for energy generation in high-rise buildings, which currently have a near-zero solar footprint. With that said, the lack of any mention of efficiency makes this part of the press release seem like a bit of smoke and, erm, mirrors.




Do a search on "non line of sight imaging" (aka NLOS). Here's just one public paper.[1] If this is what is being published, what do you think the intelligence agencies have?

1: https://www.pnas.org/doi/full/10.1073/pnas.2024468118



> This means that everything from the window in your car to the screen on your laptop to the glass on your front door can now possess an invisible image sensor.

Retailers, marketers, and data brokers are salivating.



I can't wait until the windows in our homes plaster ads over everything every time we look outside.

It'll sure be distracting when it's the windshields of our cars, but I do look forward to the legal drama when companies get sued for painting their "holographic" ads on top of the adspace other people already paid to pollute with their own advertisements.



I'm not sure where you live, but in many places there are strict city/county-level ordinances restricting signs (particularly lit-up signs) and advertising. There's a reason people's backyards aren't littered with billboards.

Anyone that's experienced a cracked windshield understands that this won't be going anywhere outside of some very niche and expensive cars.



> Anyone that's experienced a cracked windshield understands that this won't be going anywhere outside of some very niche and expensive cars.

Don't do the whole windshield. My car uses a bit of plastic on the dash to reflect a display. I love it!

https://i.imgur.com/jXfWf5b.png



I think it may be more cost effective to put ads in subsidized smart glasses and contact lenses. (A little dystopian, I know.)


how would an image sensor display ads?


> Holographic 3D content permits more design, branding, guidance and information functions. For example, side and rear windows can be used for eye-catching Car2X communications. It is also possible to black out window glass or make projected text and images visible only from the inside or outside. Video content is also supported. (https://www.zeiss.com/corporate/en/about-zeiss/present/newsr...)


This video seems to show what they're talking about better than that article:

https://www.youtube.com/watch?v=NORPeCcIXRQ





I've read through all the comments here but I still don't have the slightest idea of how this works.

I see some references to "light guides" and words like "coupling" but I don't know what those mean at all, and all of my googling is not helping to explain how light guides embedded in a thin, mostly transparent layer could possibly be used to project a hi-res holographic 3D light field out of a piece of glass. How big is each guide? What is it exactly -- what is it actually made of? How is it shaped? How are they arranged? How are they illuminated? How do you manufacture something like this?

Can anyone ELI15 how this works?



Don't think of it like a transparent camera. Think of the window as a giant external periscope lens, connected to the rest of the camera with such thin fiber optics that you can't notice.


Sure, but how do you build fiber optics like that? Especially when each "pixel" has to shoot out hundreds (?) of separate fibers to cover every possible angle to achieve the holographic effect?

For just a 1MP display, we're talking like hundreds of millions of fibers? That fit in some thin transparent layer? How?



I'm also a bit in over my head here, but I'd guess the major innovation is around manufacturing glass in a way that makes the optic channels a natural part of the glass's characteristics as it's made. We're probably talking fiber optics at a molecular level.


"Webcams that enable you to look anywhere on your screen."

Minor aside, but does anyone actually care about this? Forever ago, I was told to try and look into the camera in order to project eye contact during video calls, but now that just seems like a cultural hangup that arose from people not being familiar with video calling. Now that it is more ubiquitous, I feel like we all have collectively agreed that the eye contact thing is unnecessary?



Consider the possibilities when the screen-glass camera view could be different for each of the participants on your videoconference screen - so as you shift your eyes from one to another, they get different experience of eye contact.

Staring into your camera on a multiparty call means everyone feels like they are getting your eye contact, which devalues it. Eye contact moving between participants is the gel that holds conversations together and it's why video calls are still stuck in the conference call era in terms of talking over one another.



It's not just cultural. We've evolved to recognize eye-contact. Newborn babies immediately know when you look at them. Eye contact with a dog communicates dominance.

You've just acclimated to losing that signal during video chats. Bring it back, and you'll have a richer experience.

More generally, we have evolved special neural hardware to recognize gaze, with a special case for when gaze is directed at our eyes. Gaze is important because it's a sign of where people are attending. And as Herb Simon said: "in the information age, attention is the scarce resource."



> but now that just seems like a cultural hangup that arose from people not being familiar with video calling.

Eye contact is about as old as humankind. So maybe in 300,000 years we can check if we get over it with video calls.

It is not that communication is not possible without it. It just removes an additional layer.

Eye contact conveys all kinds of useful metadata. Depending on the circumstances and other verbal and non-verbal cues it can mean "I'm listening to you", "oh, maybe let's push more on this topic", "have you also caught that?", "perhaps it would be wiser to not talk about that", "I'm specifically addressing you" or "can you help me out here?". And these are just the ones immediately applicable in business settings. Let's not even talk about all the ways it can be used during flirting.

And it works on a subconscious level. I regularly DM role-playing games, both online and in person. The lack of eye contact is a serious impediment in online games. One can use eye contact, or the lack of it, to judge engagement. One can signal turns with eye contact, or who an NPC is talking to. It can also be used to help less talkative players seize amazing role-playing moments.

Can we get by without it? Sure. We can communicate anything using a 1-bit communication channel using morse code. It is a bit like dancing in shackles. Can be done, just leaves something extra on the table.



To clarify I didn't mean that all eye-contact is cultural, I meant specifically during video calls. I was told that if I didn't look into the camera during my interview, it would be perceived as rude or unprofessional.


No, we haven’t agreed that. I continue to feel somewhat rude when not looking at the camera during a conversation. And it matters to nonverbal communication.


I also still habitually look at the camera (due to said training early in my career), but I think that the important thing is I don't think it is rude when other people don't do it. And in my experience, the vast majority of people I interact with don't either, which is why it feels like we have agreed that it is not important.


Idk, I’ve noticed when I’m on sales calls, some sales people make a very conscious effort to look into their webcam, and the effect of them “looking at me” in the video feels:

A) Initially, a little bit odd, because 98% of people don't do it / I'm not used to seeing that

B) Then feels pleasantly good/ natural.

I’m not going to make a decision on a performative behavior like looking at the webcam - at least, not consciously - but I’m sure if/ when this becomes the norm, webcams NOT focused on eyes will seem very “off”



No, and I'm absolutely not willing to give up the mechanical camera shutter I have on my laptop for that feature. They could add a mechanical shutter to the hidden camera in the frame but why do all of this.

Edit: Should have said "mechanical lens cover" instead of "mechanical camera shutter"



Just so you know, in photography, "mechanical camera shutter" can mean something very different from what I think you mean.


This is an odd, almost reverse nitpick. The word "shutter" is general-purpose. It's literally a thing that shuts the opening. There's one inside an SLR camera, as you point out. There's often one on a webcam. They are common on windows on houses.


All words are general-purpose, but when we arrange them they adopt new meaning. I'm not nitpicking, and just pointing out that the meaning of those words as arranged may not be perceived as the author originally intended.

Alternatively, I'd suggest a different arrangement of words like "lens cover", "webcam cover", "webcam cover slider", or "webcam privacy shield" to be more accurate.

Of course we were able to grok what "mechanical camera shutter" meant in this context, although at first I was confused and wondered if in fact there were webcams with mechanical focal-plane or leaf shutters.



This is beyond even science fiction. I could have never imagined something like this was even possible - I still can’t. Is there a demo of this tech in action?




Rough translation of video transcript (YT transcript translated by deepl.com):

  This year's theme is holograms, and fittingly this one is called a
  holocam. If you look right here you can see a little bit of blue,
  which is actually acting as a camera. The explanation is that the
  holographic layer reflects light, so you can collect that light and
  use it like a camera -- for example, it can pick out just the blue
  component. Zeiss's holocam. It's quite interesting.


great find, this is much better than the article


holy shit, the glass camera in action.. Scary


That was a truly horrible web-page. Does anyone have a link to the actual technology?


Looks cool. The unfortunate thing is that whenever an "exciting new camera tech" is announced, the image quality is often debatable.

But even relatively poor image quality might be cool, if you had a camera built into your monitor, so you could do direct-eye-contact video calls.



Yeah, it sounds like a great option to place a camera invisibly in the top middle of your screen so you can actually look a person in the eye when videoconferencing. No more weirdly looking down or to the side for everyone.

I'm sure it can be used for creepiness as well but I see the benefits too.



If I look at my camera lens when I'm talking and I look at the screen while you're talking, and then you do the same, then there's no issue. Am I the only one that does this? (I know the answer. I am. At least at my company, I am.)


This feels normal if you make videos but it's pretty unnatural.


iPhones solve the camera-angle problem by changing the position of the pupils in software instead.


Having the screen be the camera 1) removes the need for the hole-punch as well as the Dynamic Island workaround and 2) could clearly be used in tandem to improve the effect, even potentially reducing the required computation. Sure, Google also fixes images in post, but that's still not a substitute for better cameras. You're always information-limited.


Given that this will probably only produce a relatively low-res, noisy image, the sweet spot might be to use this tech to capture a true eye-contact perspective from the glass in front of a screen, then combine it with images from offset cameras via a neural net to create a 'true' through-the-screen view.


why stop there? one rendering for each eyeball


Are you sure? I remember that being present briefly in a beta several years ago, but then it was removed. I don't remember it ever coming back.


Some searching says it was trialed in iOS 13 betas, but didn't make it into the final cut. It was released with iOS 14 though.

https://ios.gadgethacks.com/how-to/disable-facetimes-creepy-...



Oh, TIL.


It’s a bit uncanny valley though, right?


Nvidia Broadcast has a similar feature but it's really bad and completely unusable if you wear glasses. I don't have an iPhone to compare.


I’ve never noticed it, so apparently not.


1986 solution vs cyberpunk solution

how did we even get here?



I believe Nvidia offers something similar with Nvidia Broadcast[0]. [0] https://www.nvidia.com/en-us/geforce/broadcasting/broadcast-...


This doesn't solve the issue, because it looks like the person is always making eye contact! You don't want that—you want it to look like the person is making eye contact only when they are actually making eye contact. Constant eye contact is exhausting.


It does what??? How have I never heard about this before? Is this deployed right now? Is it always enabled? I have so many questions, don't even know where to start. That sounds amazing.


Which versions of iphones do this?


Any model newer than iPhone X (XR/XS and up).

It's a Facetime exclusive, though - no API so other videoconferencing apps can't use that.



[flagged]



I never understood bigger and more numerous camera bumps until I used optical zoom and took photos at night. Image quality with thick lenses is incomparably superior to that of thin lenses, and 3 image sensors collect 3x as many photons as 1, so you don't have to hold the phone still for as long. Ideally the entire backside of my phone would be lenses.
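For what it's worth, the reason "3x the photons" buys shorter handheld exposures is plain shot-noise statistics (a generic rule of thumb, not anything phone-specific):

  \mathrm{SNR} \;\propto\; \sqrt{N_\gamma},
  \qquad
  N_\gamma \;\propto\; A_{\text{aperture}} \cdot t_{\text{exposure}}

So tripling the collecting area gives roughly a sqrt(3) ≈ 1.7x SNR gain at the same shutter speed, or the same SNR at one third of the exposure time.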


and your 1500€ phone still sucks compared to my 500€ DSLR, lol


Yes but I have my iPhone everywhere and the pictures the 15 pro max takes are beyond ‘good enough’.

Even the zoom is pretty good nowadays.

Often the choice isn’t iPhone photo or DSLR photo it’s iPhone or no photo at all.

It’s a right faff getting the DSLR out and carrying it around. I brought mine to NYC over Xmas and ended up using my phone mostly.

Also no one I know prints photos out, they look at them on their phone, iPad or maybe TV if they care enough or have a screensaver slideshow.

I’d love to use my DSLR more and every year I tell myself I’m going to but I just… don’t.



I tried to go on holiday with my DSLR... what a pain. So heavy (unless you buy a $3000 mirrorless) for photos I never print, and the times I do print them no one ever takes the time to look at them. They stay in a closet until thrown out. The quality from my iPhone is not that great as a 4K monitor wallpaper compared to my DSLR. But given that it weighs nothing, and on holiday the iPhone lets me text, use GPS and Google Maps, play, read articles, watch videos, edit my photos in raw immediately and create a shared album right away... it's a no-brainer at this point.


Buy any compact camera with a 30x optical zoom and it will take better pictures than a smartphone. Maybe not in every situation, but you can zoom in on details from a distance, take pictures of animals that would flee if you tried to get close, etc. The last one I bought, about 10 years ago, was around 300 euros, maybe 200. It has wifi to back up pictures to my phone.


You're right about the strengths of the phone camera, and I also take many more pictures with it. But printing large prints, framing and hanging pictures is absolutely worth doing, much more than printing 4×5s to flip through. I love having them and everyone who comes into my home takes some time to look at them.


Just buy a used mirrorless camera, it’ll set you back a couple hundred, not 3k.

New technology is fun and all, but even decade old digital cameras are more than good enough for good holiday snaps that you might want to crop and print.



I regularly use my DSLR and print photos with my own Canon photo printer. I bought a 70-300mm lens for a brony convention (Galacon) and it is simply impossible to get images of that quality with a phone.

And the 50mm f/1.8 is really great considering the low price.



Yes and I manually sort through thousands of sunflower seeds every year to get the good ones for my breeding program but our obscure hobbies aren’t the norm.


Now I want to know more about your sunflower breeding program :)


..until you find out it's for bronycon snack packs


For the everyday needs of the average non-enthusiast consumer, I'd argue that an iPhone is simply better than a DSLR or mirrorless - not just more portable or more convenient, but capable of producing reliably better images.

Sure, the DSLR is the obvious choice if you're a serious photographer, but most people aren't serious photographers. If they buy a DSLR or mirrorless camera, they're going to use the kit lens and leave the mode dial on auto. For people who just want to point and shoot, the iPhone's computational brilliance shines through.

The iPhone isn't so much a camera as a generative algorithm that happens to use image sensor data as a prompt. That's infuriating if you're a photographer who just wants full control over a big sensor, but it's tantamount to magic if you don't know what an f-stop is and have no inclination to learn.

I'd bet that if you gave my mother an iPhone and a DSLR, she'd get consistently better images from the iPhone, even if we gave her a one-day crash course on photography first. Sure, she might fluke the odd decent photo with the DSLR, but 90% of the time, the iPhone's algorithmic guesswork is going to beat better imaging hardware with dumber software.



So we should call those shots snapshots when made on a phone. Well, the only thing you need to know when shooting with DSLRs is to move the dial to Auto/P. If you want to play with depth of field, move it to Av. If you want to shoot the dog, move it to Tv and 1/500. If you want to shoot an airshow, move it to 1/1200 or higher. Is that too much? :D


I'm getting serious "you can build Dropbox yourself quite trivially" vibes here. I mean, people don't even turn their flash off in broad daylight…


It's really that easy. Ppl like to complicate things A LOT. Just to look more important.


I'm going to tell your mother you said that.


Phone cameras produce muddy trash. Ever tried to make a good portrait in difficult light? That was the reason I began photographing seriously: the photos I took with my phone at Galacon a year prior were trash.

And yeah, you have to learn how to properly use a dedicated camera, which is why I only shoot with manual settings. For casual users, phones are enough.



What phone? I've never owned one that took worse photos than a DSLR in auto mode under normal lighting


Phone cameras haven't stopped improving in the last couple of years. Your data might be outdated


Your 500-euro DSLR sucks compared to my 1500-euro phone for gaming. Do you carry your DSLR in your pocket every time you leave the house? This is obviously a matter of specialized vs. general use.


Yeah, I seriously doubt that. Not without a lens that costs 3-4x the camera. If you spent 500€ on a kit that makes iPhone or Pixel photos look bad... you bought it stolen, lol.


I bought it new from a camera store lol. It's the Canon EOS 2000D with the Canon EF 50mm f/1.8. (I don't count my other lenses; I bought them later.)

And yes, it is better than any phone camera.



It's doing a whole hell of a lot more and is much smaller. I'll take the tradeoff as a casual camera user.


Your 500€ DSLR isn’t a €300 phone, a €300 camera, a €300 navigation device, a €300 Sonos Remote, a €300 PDA all in one.


The best camera is the one you have in your pocket when you need to take a picture.


Quality is probably limited so this would be best for a front facing camera.


I'm pretty sure they meant for the front cameras. The camera bumps are here to stay, I'm afraid.


So embedded micro light pipes? Cool idea. I wonder if it creates any kind of noticeable perturbation in the glass. Like if you embedded this in a window and were looking through a portion of the glass that contained these light pipes would you be able to notice them? Might be a problem for putting this type of glass in front of a display.


Looks pretty cool, though it is not quite Slow Glass [1].

[1] https://en.wikipedia.org/wiki/Light_of_Other_Days



... Does not turn glass windows into cameras. From the headline I thought this might be a possibility.

Requires Zeiss Multifunctional Smart Glass(tm)



Having played with light fields and photography, I have questions.

There has to be an imaging sensor somewhere. Where is it?

I strongly suspect most of this "magic" will condense out of this cloud into a puddle of marketing sponsored bovine excrement.



> The Holocam technology "uses holographic in-coupling, light guiding and de-coupling elements to redirect the incoming light of a transparent medium to a hidden image sensor."

It looks like it still requires a sensor, it can just be hidden in the frame of the glass.



Sure it has a sensor, it's not magic. But OTOH, if it can capture the POV of a real line of eye contact, with another image projected _through_ the camera, the effects are transformative; it doesn't matter whether ultimately there is a sensor somewhere or not.


So... more mass surveillance tech coming from every window in every building?


Nah, it would be easier and cheaper to just stick tiny regular cameras everywhere.


Am I alone in thinking this is cool but not inherently useful? The only use case where this might be useful is in video calls, to make 'eye contact' more realistic, but I'm sure there are software fixes for that (altering pupils).

In most cases such as the windscreen you can just have a camera on the frame. Why does it need to utilise the glass itself?



> Why does it need to utilise the glass itself?

I think you'll find that for a lot of people, "it looks cool/elegant/stylish" is a deciding factor.



They're specifically marketing to car manufacturers in the press release video. Which is unfortunate because I don't need more non-haptic buggy interfaces in my car. The technology itself is amazing even though the holo-cam appears to be just greyscale and rather blurry right now. The ability to embed the optics in/near (hard to tell from the marketing) the edge of the glass is awesome as you don't need a projector that is separate (by much) from the display surface.


I'm super excited about whether something like this could be used to prevent bird strikes on windows. If anyone wants to grow the market for this technology there are likely square miles worth of glass that could have this installed to prevent birds from being killed.


Are there any numbers on how many megapixels the camera is, and how many DPI the display is? (What are the upper bounds for these over time?)

The demo looks greyscale, and super low res and low FPS.



>The demo looks greyscale, and super low res and low FPS.

I think the intention is that it just needs to be enough to know where the user's face is and project a HUD at them.



There is a video on YouTube; the camera resolution is a bit poor, but it's a game changer.

https://www.youtube.com/watch?v=NORPeCcIXRQ



The telescreen received and transmitted simultaneously. Any sound that Winston made, above the level of a very low whisper, would be picked up by it; moreover, so long as he remained within the field of vision which the metal plaque commanded, he could be seen as well as heard. There was of course no way of knowing whether you were being watched at any given moment.


Would be good to have a formal notification on glass stating 'this window/screen/etc contains a camera that may broadcast your image'


Given that Zeiss proposes this technology for vehicles, windows, etc., I'm curious about the effects of water droplets in direct contact with the surface changing the index of refraction locally (vs air).


Sounds like this could work great for eye-tracking in VR/AR headset optics?


I'm curious if we'll finally get videoconference technology that lets you look people in the eye.


Or make eye-contact, rather? Interesting to consider. Though I imagine a camera behind a screen could do this, too.


Can someone correct me if I'm wrong in saying that this is equivalent to having a 20+ inch sensor? The focal length would be extraordinary.


I'm just excited to have a cheapish webcam embedded in my monitor. That way I can look at people while I'm on zoom. That's it.


Get legislation against all hidden cameras.


Wonder how long intelligence agencies have had this.


Implants which modulate and reflect incoming radio signals back to the radar device, especially ones powered by the radio wave itself, are nothing new in clandestine surveillance.


They'd probably put it in lightbulbs (to get a long lasting power source and good viewing angle).


They'd probably pay a contractor $60 million to develop it for the military and not actually wind up with anything of use


One comes standard with every shower door.


Apple rejoices, I guess. This brings their vision of getting rid of the notch closer to reality.


I’ve said many times, eye contact will be the killer app of VR…

… but it will be ironic if 2D beats the VR companies to it, using technology like this.

What an utter failure on the part of Meta that would be. With such a huge head start and they still don’t sell eye contact (except on their Pro device? Maybe?)



If you look at it where does it show your eye looking once captured?


Changing rooms will never be the same.


Let the mass surveillance begin!


too late


Is there any device on the market that would be able to detect this tech if it's used as a hidden camera?


Reads like press-release wank. So not a Lytro, but imagining the possibilities of reflection-based computed image reconstruction. They accomplished getting their brand out there without offering anything new. Check out their about us:

> From time to time we also publish advertorials (paid-for editorial content) and sponsored content on the site.

https://www.digitalcameraworld.com/news/about-us



Is it just me, or does their car stuff look a lot like the BMW Neue Klasse stuff?


It honestly doesn't seem that useful to me.


Pretty cool and hella alarming lol.

Next someone will tell me they have a way to “ssh” into the universe and get process details on every entity in it including every human and what’s happening.

Kind of like in the matrix with The architect.



I guess the closest thing you can get to that -- ssh into somewhere and get details on every human -- is to ssh into the government's database on everyone remotely of interest, or even accidentally connected (in essence, almost everyone), built from both computer and human-agent collection and analysis.





