albert_e 9 hours ago

Practically --

I feel hardware technology can improve further to allow under-the-LED-display cameras, so that we can actually look at both the camera and the screen at the same time.

(There are fingerprint sensors under mobile screens now, and I think some front-facing cameras are already being built in without sacrificing pixels to a punch hole. There is scope to make this better and seamless, so we could have multiple cameras behind a typical laptop screen or desktop monitor if we wanted.)

This would make for a genuine look-at-the-camera video whether we are looking at other attendees in a meeting or reading off our slide notes (teleprompter style).

There would be no need to fake it.

More philosophically --

I don't quite like the casual normalization of AI tampering with actual videos and photos -- on mobile phone cameras or elsewhere. Cameras are supposed to capture reality by default. I know there is already heavy noise reduction, color correction, auto exposure etc ... but no need to use that to justify more tampering with individual facial features and expressions.

Videos are and will be used for recording humans as they are. The capturing of their genuine features and expressions should be valued more. Video should help people bond as people, with body language as genuine as possible. Videos will be used as memories of people who are gone. Videos will be used as forensic or crime scene evidence.

Let us protect the current state of video capture. All AI enhancements should be marketed separately under a different name, not silently added into existing cameras.

  • jrussino 8 hours ago

    I agree with your philosophical stance, in general, but this particular use case is one that I've been wanting for years and where I think altering the image can be in some ways more "honest" than showing the raw camera feed.

    With an unfiltered camera, it looks like I'm making eye contact with you when I'm actually looking directly at my camera, and likewise it looks like I'm staring off to the side when I'm looking directly at your image in my screen.

    A camera centered behind my screen might be marginally better in that regard, but it still wouldn't look quite right.

    What I'd really like to see is a filter for video conferencing that is aware of the position of your image on my screen, and modifies the angle of my face and eyes to more closely match what you would actually see from that perspective (e.g. it would look like I'm making direct eye contact when I'm looking at/near the position of your eyes on my screen).

    You could imagine this working even for multiple users, where I might be paying attention to one participant or another, and each of their views of me would be updated so that the one I'm paying attention to can tell I'm looking directly at them, and the others know I'm not looking directly at them in that moment.
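
    A rough sketch of that geometry (all names and numbers below are hypothetical, just to make the idea concrete): for each participant, the filter computes the angular offset between my physical camera and that participant's tile, and a per-viewer re-renderer rotates my apparent gaze by that amount in their feed only.

```python
import math
from dataclasses import dataclass

@dataclass
class Point:
    x: float  # cm, relative to the centre of my screen
    y: float  # cm, positive = up

def gaze_offset_deg(camera: Point, tile: Point, dist_cm: float) -> tuple[float, float]:
    """(yaw, pitch) in degrees between the direction to the real camera and
    the direction to a participant's tile, as seen from eyes dist_cm away.
    A re-renderer would rotate my apparent gaze by this amount for that
    participant's view of me."""
    yaw = math.degrees(math.atan2(tile.x - camera.x, dist_cm))
    pitch = math.degrees(math.atan2(tile.y - camera.y, dist_cm))
    return yaw, pitch

# Camera at the top of the screen (15 cm above centre); one tile at the
# centre, another 20 cm to its right. Each viewer needs a different correction.
centre_view = gaze_offset_deg(Point(0, 15), Point(0, 0), 60)
right_view = gaze_offset_deg(Point(0, 15), Point(20, 0), 60)
```

    Run once per participant, this gives each viewer's feed its own correction, which is what would let the person I'm looking at see eye contact while the others see me looking away.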

    • wruza 2 hours ago

      Would be funny if everyone on your screen gave a side eye to the bottom right corner where the currently speaking person is.

      Jokes aside, I think you're absolutely right. Online interactions have dynamic geometry, so mounting a camera behind a screen will just not cut it, unless the entire screen is a camera. Also, some people might prefer projecting/receiving no eye contact at all at times, in some situations. And vice versa.

      The philosophical stance here is purely traditionalist; it decides on behalf of other people. What people would like to use should be allowed to exist. "Videos are and will be" is a strange claim, given that its claimer has neither control over it nor any assurance that it is going to be true.

      • albert_e 35 minutes ago

        Once we have the technology to put a camera under a screen without sacrificing display quality ... we will not stop at one camera.

        There will be an array of cameras covering say every 2x2 inch square of your screen.

        Just see how many cameras are on today's phones. The same can happen with this new camera tech too.

        Also there will be a huge commercial driver to put multiple cameras under the screen -- all apps and marketers can track your precise gaze. Ads will pause unless you are actually watching them. I will hate it but it feels inevitable.

  • xattt 22 minutes ago

    I always thought that under-screen cameras would come as a bug-eye lens, with the sensors between pixels. The pitch of modern mini-LED displays seems to have enough space between pixels to fit them in.

  • vitorsr an hour ago

    > I know there is already heavy noise reduction, color correction, auto exposure etc ... but no need to use that to justify more tampering with individual facial features and expressions.

    Critically, the processing steps enumerated there are global transformations, while tampering is inherently a local, "contentful" transformation.

  • YeahThisIsMe 2 hours ago

    I agree with this.

    I don't actually want the person I'm talking to to appear to be looking directly into my eyes because it's weird - it means they're looking at the camera and not at me on the screen, talking to them.

    • smeej an hour ago

      Somehow I've apparently made a different adjustment to this than most people. My therapist was commenting on it the other day, how I do look directly into the camera when I want her to see me as making "eye contact," rather than looking directly at where I see her eyes.

      She's taking this as an autistic adaptation NT people are less likely to make, like my gestures are practiced and tailored for the sake of the other, not my own sake. I want to "look in her eyes" to make a point, because that's one of the ways you show people you're making an important point, not to see how she's responding to what I'm saying.

      I haven't done any of it on purpose. It's apparently just how I've adapted to the weird communication space of having a gap between actually looking at someone's eyes and being seen to be looking at someone's eyes.

  • TowerTall 2 hours ago

    > under-the-LED-display cameras

    If people laugh with their mouths open, wouldn’t a camera placed below the LED display capture the inside of their mouths, and the rest of the time just point straight up their noses?

    • albert_e an hour ago

      I meant the camera will be invisible and BEHIND the screen .... just not visible as a punch hole/notch.

      I think some mobile phones have already done this...where they are able to put a camera behind the pixels.

  • aitchnyu 7 hours ago

    Will we have video with sensor signature for evidence purposes? One high court in India rejected any video evidence as a potential deepfake.

  • yieldcrv 8 hours ago

    Or buy a specialty device for replicating the real world

    It's been half a decade since I first noticed iPhones can't capture a red world when wildfires are messing up the air quality; I had to break out an ILC (a DSLR without the SLR) to capture the world more congruently with how I see it.

    • lloeki 3 hours ago

      > iphones cant capture a red world when wild fires are messing up the air quality

      s/iPhones/the iPhone Camera.app/

      Apps like Halide and Pro Camera have no trouble handing over control of white balance. I've captured both faint aurora borealis and the red/brown hue when sand and dust are brought over to inland Europe by the sirocco, with great success.

  • sadcherry 6 hours ago

    > Videos are and will be used for recording humans as they are. The capturing of their genuine features and expressions should be valued more.

    Controversial stance, but for the same reason I reject wearing makeup.

    Girls, you are beautiful as you are! No need to fake it! Most guys don't do that either and everybody is perfectly fine with that too.

    • irjustin 5 hours ago

      The line is long and blurry the whole way. At one extreme is being completely naked 100% of the time with zero grooming; at the other is eugenics, or genetically engineering body/facial features (is what I've come to believe?).

      Isn't it okay to feel good about looking good? Sure (I love dressing up and doing my hair for occasions)! But obviously that can turn very problematic very fast. Honestly, I wish I knew where to draw the line in the sand. Is it makeup? Piercings? Nice clothes? Surgery?

      Just a parent with two daughters who has more questions than answers.

      • InDubioProRubio 5 hours ago

        Surgery is a permanent, life-long change; beauty is relevant for 20+ years.

    • exitb 5 hours ago

      Makeup is a personal preference. What OP talked about is subtly and transparently putting AI in a pipeline where we don't expect it. And it's not hypothetical, rather it already happens. Video meeting software is doing all kinds of sound rejection based on an unknown set of rules, even though none of us enabled that as a feature.

    • HeatrayEnjoyer 5 hours ago

      Please, please, tell me this is sarcasm.

      • master-lincoln 2 hours ago

        I don't think it was. And I agree: makeup is like putting on a mask to hide who you really are, because society taught you that you are more valuable this way. People might think they do this for themselves, but it has been put into their minds by media and adverts. This is not healthy, and it also wastes resources.

    • voidUpdate 5 hours ago

      Guys, you don't need to modify cars ever! They're fine as they are!

      • DidYaWipe 5 hours ago

        Are you seriously advancing that as a valid comparison?

    • ndndjdjdn 5 hours ago

      Next up. Stop taking showers people!

      • master-lincoln 2 hours ago

        How is this a fair comparison? There are health benefits to hygiene; there are none from makeup.

        • wruza 38 minutes ago

          Yeah, looks don’t get you anywhere in this world as a woman. /s

          …We may talk all day how bad and unfair that is, but none of that changes the reality for an average person out there.

kleiba 3 hours ago

It would be great to have a feature where my closed eyes are replaced with open eyes looking at the camera - then I could sleep through boring meetings.

qwertox an hour ago

Looks somewhat creepy.

The normal thing is not to look at a person uninterruptedly (which is what the camera is supposed to stand in for). For example, when you make the gesture of trying to remember something by looking somewhere else.

Retr0id 10 hours ago

Does what it says on the tin, but honestly I find the "uncorrected" video more comfortable to watch.

  • mvoodarla 8 hours ago

    Original dev here. I tend to agree for this particular demo video as I'm reading a book and I don't blink in the original.

    The model tries to copy the blinks of the original video so it's possible that in other conditions, you'd notice less of this.

    Fun to see this feedback though, definitely something worth improving :)

    • patrickhogan1 8 hours ago

      BTW your main site is throwing an error. Probably want to edit since your post is growing.

      https://www.sievedata.com/

      Application error: a client-side exception has occurred (see the browser console for more information).

      • mvoodarla 7 hours ago

        Original dev here. Unable to replicate this on my end, try refreshing?

    • mlhpdx 8 hours ago

      I likewise find the “corrections” uncanny. It’s not just the one with the book.

  • ddfs123 8 hours ago

    I think it's just that naturally nobody keeps 100% eye contact (except maybe a TV news reporter); it feels like an interrogation.

    • XorNot 8 hours ago

      This is what I realized is uncomfortable about cameras-on group meetings in Teams: I can't mute other people's video, so it feels intensely weird to have a wall of people staring blankly at you.

      • prmoustache an hour ago

        You can totally turn off incoming video in MS Teams. What you can't do, afaik, is have it as a default setting.

      • RheingoldRiver 8 hours ago

        > I can't mute other people's video

        You can switch to another tab or use a miniplayer; in some apps you can focus one person's screen, and if you choose someone who has a static avatar up, you'll barely see other people's faces.

        The nuclear option is to install PowerToys [0] and put something always-on-top (I'm a fan of the hotkey Win+Space to toggle always-on-top on and off) in the exact position of the other video feeds. Notepad or something.

        [0] https://learn.microsoft.com/en-us/windows/powertoys/

  • karlgkk 9 hours ago

    There are other implementations that do a better job, such as Apple and Google's. They also are less willing to correct eye contact when it's "out of range" so to speak.

  • hanniabu 9 hours ago

    I think it's the lack of subtle movement; it's too strict and really locks the pupils front and center.

    • lloeki 3 hours ago

      You mean frequent saccades?

      https://en.wikipedia.org/wiki/File:This_shows_a_recording_of...

      or the occasional look away? (for which there appears to be a feature)

      > Look Away: enable_look_away helps create a more natural look by allowing the eyes to look away randomly from the camera when speaking

      I expect both to be different: while saccades do happen when occasionally looking _away_ from a person, they also happen when looking _directly at_ one person because we don't constantly stare at a very specific unique and precise point on their face.

      For the demo video, try enable_look_away = true, look_away_offset_max = 10, look_away_interval_min = 1 and look_away_interval_range = 1 (then submit), which from the result I got should really be the default for a more natural result.
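
      For example, if you're driving it from code rather than the dashboard, the settings amount to something like this (only the parameter names come from the docs; how they're submitted -- JSON body, form fields, SDK kwargs -- is an assumption):

```python
# Look-away settings for a more natural result (values as suggested above).
look_away_settings = {
    "enable_look_away": True,
    "look_away_offset_max": 10,
    "look_away_interval_min": 1,
    "look_away_interval_range": 1,
}
```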

lemonad 2 hours ago

I can definitely see the use case, as it is annoying having to choose between actively looking at the participants of a meeting on screen or _appearing_ to look at the participants by gazing into the camera and not actually looking at them.

I sometimes use an Elgato Prompter to better enable eye contact during meetings. The camera and lens are mounted behind the screen, so looking at the screen is also looking at the participants. The downside is that the screen is tiny, and leaning forward to read, say, a document does not look that great on camera. So either you have to zoom it substantially or read it on another screen, thus looking away from the participants. In that case, though, you really are not looking at the participants, and faking eye contact would be kind of weird.

xnx 9 hours ago

Nvidia has free Broadcast software with an eye contact feature: https://www.nvidia.com/en-us/geforce/news/jan-2023-nvidia-br...

It's from January 2023, so I don't know if they've improved it further since then.

The video conferencing software providers have been way too slow to put whoever is speaking top-center (near where the camera typically is).

DidYaWipe 5 hours ago

Creepy and misguided. Do people stare at you fixedly and unwaveringly during in-person conversations?

And if they do, do you like it?

  • lloeki 3 hours ago

    > Look Away: enable_look_away helps create a more natural look by allowing the eyes to look away randomly from the camera when speaking

    For the demo video, try enable_look_away = true, look_away_offset_max = 10, look_away_interval_min = 1 and look_away_interval_range = 1 (then submit), which from the result I got should really be the default for a more natural result.

    • qwertox an hour ago

      Usually looking away is part of a gesture which involves context, like the facial muscles and the information being shared ("Hmm, when was this?" makes the eyes look up).

nicholasbraker an hour ago

I assumed this was a roadmapped feature for (at least) FaceTime on macOS/iOS. I never saw any implementation of it, but I see value in such a feature. Also for Teams etc.

froh 34 minutes ago

wow that's creepy :-)

technically cool, however I'd prefer some semi-transparent mirror setup.

Such a setup keeps the eyes alive.

richdougherty 9 hours ago

Kudos to the dev for coming up with the eye position fixing solution.

Building further on this idea, I wonder if instead of changing the image to look at the camera, we could change the "camera" to be where we're looking.

In other words we could simulate a virtual camera somewhere in the screen, perhaps over the eyes of the person talking.

We could simulate a virtual camera by using the image of the real camera (or cameras), constructing a 3D image of ourselves and re-rendering it from the virtual camera location.

I think this would be really cool. It would be like there was a camera in the centre of our screen. We could stop worrying about looking at the camera and look at the person talking.

Of course this is all very tricky, but does feel possible right now. I think the Apple Vision Pro might do something similar already?

  • newaccount74 4 hours ago

    There is already a lot of research on the 3D reconstruction and camera movement part, for example this SIGGRAPH 2023 paper: https://research.nvidia.com/labs/nxp/lp3d/

    In order for this to work for gaze correction, you'd probably need to take into consideration the location of the camera relative to the location of the eyes of the person on the screen, and then correct for how the other person is holding the phone, and it would probably only work for one-on-one calls. Probably need to know the geometry of the phone (camera parameters, screen size, position of camera relative to phone)

    Would be amazing, not sure how realistic it is.

  • mvoodarla 8 hours ago

    This is an interesting idea. We are a little farther off from being able to do this but agree it would look really cool.

  • scotty79 5 hours ago

    I think you'd get a lot by just transforming the eyes so the gaze is relative to a virtual camera located on the screen at the face of the person you are talking to. This way you get eye contact only when you are looking at their face on the screen, not when you look somewhere else.
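
    A minimal sketch of that retargeting rule (names hypothetical): pass my real gaze through unless it lands near where the other person's face is drawn, in which case snap it to the virtual camera there.

```python
def corrected_gaze(gaze_px: tuple[float, float],
                   face_px: tuple[float, float],
                   snap_radius_px: float) -> tuple[float, float]:
    """If my tracked on-screen gaze point falls within snap_radius_px of the
    other person's face on my screen, report the gaze as aimed exactly there
    (eye contact via the virtual camera); otherwise pass it through unchanged."""
    dx = gaze_px[0] - face_px[0]
    dy = gaze_px[1] - face_px[1]
    if dx * dx + dy * dy <= snap_radius_px ** 2:
        return face_px
    return gaze_px

# Looking 30 px from the face snaps to eye contact; 400 px away is left alone.
near = corrected_gaze((530.0, 410.0), (500.0, 400.0), 80.0)
far = corrected_gaze((900.0, 400.0), (500.0, 400.0), 80.0)
```

    The threshold is the point of the design: eye contact is projected only when you actually are attending to the other person's face, so nothing is faked the rest of the time.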

blkhawk 4 hours ago

During corona I built a fold-down thing that put my webcam at eye level on my monitor. Turns out, with a large enough monitor it really isn't that bad to have the camera in front of it.

not_a_bot_4sho 8 hours ago

I've never seen an implementation of this that wasn't super creepy past the initial tech demo

Bengalilol 5 hours ago

I, for some reason, prefer the original video. I may have an eye contact problem. Otherwise the feature is nice and almost perfect: there could be some spans where eye contact shouldn't be always on; I bet this would make it more human.

patrickhogan1 8 hours ago

Really cool application.

Just a heads up – your main website is showing an error. You might want to fix it since your post is gaining traction. Here's the link: https://www.sievedata.com/

The error message reads: 'Application error: a client-side exception has occurred (check the browser console for more details).'

  • mvoodarla 7 hours ago

    Original dev here. Unable to replicate this on my end, try refreshing?

jbverschoor an hour ago

Isn’t this built into FaceTime?

boomskats 3 hours ago

The thing with eye contact, though, is that it is worthless if you are never able to look away. When it's artificial like this, it's worse than not being there at all. It's just creepy. It was the same with nvidia's implementation a couple of years ago. It was just weird.

I do appreciate that this is a problem worth solving though, and I spent a lot of my time during COVID worrying about the negative impact that normalising loss of eye contact would have on the social interactions of our younger generations.

Back in 2021, I took one of those £50 teleprompter mirrors that YouTubers use, put a 7-inch Raspberry Pi display in the slot where you're meant to put your phone, and made it my 'work calls display' for a couple of days. The interesting thing is that the only people who noticed without me pointing it out were completely non-technical, and when they did they complimented me on the quality of my webcam rather than the fact that I was looking straight at them; they could tell something was better, but couldn't quite put their finger on it. Which is funny, because I'm sure being stuck behind a cheap perspex one-way mirror made my actual camera quality a bit worse.

I remember I got to the point where I started playing with cv2 trying to do realtime facial landmark detection on the incoming feed and having a helper process shift the incoming video window around the little screen so that it would keep the bridge of the other person's nose (the point I naturally made eye contact with) pinned to the bit of the screen that was directly in front of the webcam lens. Then one morning I walked into my office, saw this monstrosity on my desk, realised I was nerd sniping myself and gave up.
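
The window-shifting part of that is just arithmetic; a toy version (names hypothetical, the cv2 landmark detection itself omitted) of what the helper process was computing:

```python
def window_top_left(nose_in_window: tuple[int, int],
                    lens_on_display: tuple[int, int]) -> tuple[int, int]:
    """Top-left coordinates to move the incoming-video window to, so that the
    other person's nose bridge, detected at nose_in_window (relative to the
    window's own top-left corner), ends up directly in front of the webcam
    lens at lens_on_display (in display coordinates)."""
    return (lens_on_display[0] - nose_in_window[0],
            lens_on_display[1] - nose_in_window[1])

# Lens at (400, 10) on the little display, nose bridge detected at (310, 140)
# inside the frame: move the window so those two points coincide.
pos = window_top_left((310, 140), (400, 10))
```

Each fresh landmark detection would feed a new position to something like cv2.moveWindow.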

One thing I do remember though is how odd it felt looking at yourself in a mirror without your image being mirrored. Not sure my brain was ready for that one after thousands of years of looking at itself in mirrored surfaces.

Bit of a weird pic but the only one I can find: https://pasteboard.co/BXE6zhbpOD7E.jpg

  • lloeki 2 hours ago

    > One thing I do remember though is how odd it felt looking at yourself in a mirror without your image being mirrored. Not sure my brain was ready for that one after thousands of years of looking at itself in mirrored surfaces.

    Feynman has a good explanation for that: https://www.youtube.com/watch?v=msN87y-iEx0

    But it doesn't go deeper as to why we're perceiving ourselves that way, for that we have to dive into biology, neurology, bilateral symmetry, and the fundamentals as to how, as bilaterally symmetric beings, we're able to orient ourselves in a 3D world.

    (I recall reading a paper or watching some video about that, but can't find it anymore)

  • blitzar 2 hours ago

    I wanted to do this but got stuck in the rabbit hole of picking out teleprompters, screens, and sizes. In the end my solution was to mount my webcam in the middle of the monitor (with the other party partially obscured). Previously my technique was to look at the camera, not the screen (or have the other party in a very small window at the top of my screen), so partially obscured is an improvement!

thekevan 9 hours ago

From a development standpoint, this is cool.

But the resultant video has a tad bit of uncanny valley going on.

I'd rather learn from the guy on the right.

  • mvoodarla 8 hours ago

    Original dev here. Agree this video looks uncanny valley, but it's likely because the lighting of the original video is off + I have baggy eyes (I was sleep deprived).

    Would recommend trying it on other videos, it is surprisingly good. Although there definitely are areas to improve.

boiler_up800 9 hours ago

Looks really good and seems fast. My guess would be that this effect needs to be 99% perfect or else people will notice something, although they may not be sure exactly what.

AyyEye 8 hours ago

Only a techbro would think that "eye contact" means just synthesizing eyes. It's a high bandwidth communication medium and synthesizing it removes what little we had. Yes I know this isn't the first, no I don't think any of this reality-meddling is any less creepy.

advisedwang 8 hours ago

This is the real killer feature of Google's project starline, although they also achieve a 3D display.

vintagedave 10 hours ago

The results here in their sample video look _really good_: other tech I’ve seen in the past looked “wrong”. But the sample input is not one I’d characterize as looking away from the screen. Eyes move around like the person is thinking. The result video only looks more focused. It’s effective in carrying focus (it really does matter when someone looks directly at you), but it’s making tiny changes.

> Limitations

> Works best with frontal face views and moderate head rotations.

> Extreme head poses or gaze directions may produce less accurate results.

There it is. To use this, I’d like to see an example showing it stop adjusting when “extreme” (aka normal) head poses are used. If it can handle real behavior, and improve eye tracking in the optimal case so it transitions seamlessly between adjusting and not adjusting as someone moves around, that would be a good product.

  • EdwardDiego 10 hours ago

    My issue is the usual one with laptop cameras: if I'm looking at you, my eyes are looking downwards, and it's very awkward speaking into the camera without seeing your face as I speak.

isuckatcoding 10 hours ago

Cool but why…?

  • karlgkk 9 hours ago

    Apple does this on the iPhone, by default. When you're looking at someone's face on FaceTime, it modifies the position of your eyes to be looking directly at the camera - so the person on the other end sees you looking at them.

    • s4i 4 hours ago

      Really? My wife and I FaceTime a lot, and I keep telling her that I can see her looking at her own face in the corner instead of me. Is the tech accurate enough to tell that a person is looking at themselves and not at the other participant, and then not correct the eyes in that case?

      Anyway, I’d much prefer if Apple didn’t silently alter the eye direction of people calling me.

ta8645 6 hours ago

Great. Now, do the rest of me sitting in the seat. If you don't need my real eyes, you don't need any of the real me. We can discuss, whatever it is, in email.

jedisct1 3 hours ago

Doesn't FaceTime already do that?

jpeggtulsa 9 hours ago

10 cents per minute of video... Pass.

AStonesThrow 10 hours ago

This is unfortunate. Perhaps more pernicious than an obvious deepfake is a video filter that lies to the recipients.

Several years ago during the pandemic, I enlisted a job coach to get me hired. One of her paramount concerns was my eye-contact with the camera. She said it's so important. Am I paying attention? Am I an honorable man who maintains eye contact when I'm in a conversation? If I look away, am I collecting my thoughts, or prevaricating?

Many supervisors, managers, and teachers will judge their employees by whether they can pay attention during meetings, or whether they're distracted: looking at their phone's screen, at the keyboard, glancing off at children or a spouse. Even more important: if you're meeting with your wife and she can't even hold your attention, what kind of husband are you?

If you employ a gadget to lie about this, then I hope they fire you and find someone who'll be honest. I hope your wife sends you to sleep on the sofa.

  • karlgkk 9 hours ago

    > If you employ a gadget to lie about this

    This has been enabled on iPhones, by default, for like 5 years now. You never even noticed.

    Their implementation only does a small adjustment, which works so well that most people don't even know it's being done.

    • olyjohn 9 hours ago

      If we never noticed it, do we even need it? I don't use FaceTime, but have never been bothered by where people are looking in any other video conferencing software.

    • bravetraveler 8 hours ago

      > You never even noticed

      I have seen three cameras in use in nearly a decade. They were all in interviews. I'm not avoiding opportunities, either. Legitimately 4+ hours a day

      Might be fair to say not many cared to see/be seen

  • allenu 8 hours ago

    That reminds me of a few months into the pandemic, when one of the VPs at the company I was working at was presenting in a Zoom-based all-hands. He was very clearly looking directly into the eye of the camera, as opposed to looking at his monitor's video feed like everyone else. I remember thinking that it felt a little weird, unnatural, and very performative, like a politician: he very obviously and intentionally wanted to come across as more human by looking directly at the audience, yet at the same time it was a fake look, since he wasn't looking into the eyes of any one person, but a camera.

    Perhaps other people didn't think about it as deeply as I did and maybe it did have the intended effect, but I remember I didn't see him or anyone else doing the same thing in any future all-hands.

  • function_seven 9 hours ago

    I would go so far as to say the uncorrected gaze is a lie. When I’m on a videoconference, I am looking directly at whoever is speaking, but the camera’s physical placement tells the “lie” that I’m looking down at something else. This is because we haven’t figured out a good way of placing the camera literally wherever the eyes of the other party show up on the screen. So the camera is, by necessity, in the wrong position for video conferencing. But if we can fix it in software, then we can mitigate the “lie” somewhat.

    This is especially true for my setup, where I have two screens side-by-side with the camera placed right between them. I just stare at the camera, because otherwise it looks like I’m looking way off to the left or right. If I do look at the people who are talking, what they see is me looking off at “something else.” That’s a lie! :)

    • AStonesThrow 9 hours ago

      This is true, and unfortunate, but for the past 100 years, everyone has known that to make eye contact with a camera, you look into its lens. The instantaneous display of output is very recent, and if you ask a professional actress or news anchor what they do in the studio, they will tell you that they're trained to look into the camera lens, no matter what's on the monitors.

      I contend that it's unproductive to train consumers otherwise. Yeah, we could look at the screen and have software correct it. Or we may eventually integrate lenses into screens so that they're placed exactly right. But it seems kludgy to do this software fix. Just train people to look in the right place. (I hate iPhones and I'm unable/unwilling to do FaceTime with them. Please use Meet or Teams.)

      I'm gradually building skills that let me be aware of what's on the screen without having to stare into it. Having a relaxed, wide field of vision helps with many things. Glasses are counterproductive here.

  • maximilianroos 9 hours ago

    Sounds like the coach helped you maintain eye-contact with the camera. But if we get a tool to do this, then we're lying. Would you say the coach helped you lie?

    • CGamesPlay 9 hours ago

      That doesn't even make sense. The lie is that you're not doing the thing you are projecting as doing. You just said the coach helped the poster do the thing they projected as doing.

  • niij 9 hours ago

    edit: function_seven said it better than I could. You're confused about what the perspective is with videoconferencing. There is no hardware with a camera in the middle of the screen, so you're always "looking away" to some degree.

    • AStonesThrow 9 hours ago

      No, I'm not confused at all. As I pointed out, the standard for 100 years: if you want eye contact, you look into the camera lens. The only thing that's changed recently is the availability of a direct, instantaneous monitor to distract us.

      Furthermore, if this corrects only someone who's looking directly at the screen, it'd be tolerable. But does it also correct eyes looking at a keyboard, eyes looking at a smartphone screen, eyes looking at a wayward toddler? That's worse.

      Also... ten cents per minute? That's highway robbery!

  • Der_Einzige 8 hours ago

    The fact that your feathers are ruffled is what makes it all the more delicious and delightful that it exists.

    All attempts by folks to subvert the freedom to direct one's attention where they want to are tyrannical in nature. If you can't detect it's happening, it effectively did not have a negative externality. The tree did not make a sound if no one heard it.

    This is the same thought that is used to justify not letting cashiers sit while they bag groceries. Those who think this love the taste of boots in their mouth.

    I hope that they fire those who refuse to get with the times on AI and embrace ludditism, and I hope your wife considers her future with you after the economic ruin that such practices will bring upon your family.