Removing hostile Windows updates

As a consequence of their recent effort to boost the number of Windows 10 installs by any means possible, however questionable, and to turn their paying customers into beta testers, Microsoft have been especially hostile to their users of late, installing nag-screens and “telemetry” code (also known as “spyware”) under the guise of important updates to existing installs of previous versions of Windows. While I would happily eschew Windows for Linux on all the machines I use, and have largely done so, avoiding Windows entirely is, sadly, not yet practical: the common-use machines I maintain in our lab require it for various reasons, and even I still keep a Wintendo partition.

There are plenty of discussions around the web about what to do about this. Here's a fine example. Though this list is intended mainly for my own reference, I've had success with the following:

wusa /uninstall /kb:2952664 /norestart /quiet
wusa /uninstall /kb:2976978 /norestart /quiet
wusa /uninstall /kb:2977759 /norestart /quiet
wusa /uninstall /kb:2990214 /norestart /quiet
wusa /uninstall /kb:3021917 /norestart /quiet
wusa /uninstall /kb:3022345 /norestart /quiet
wusa /uninstall /kb:3035583 /norestart /quiet
wusa /uninstall /kb:3044374 /norestart /quiet
wusa /uninstall /kb:3068708 /norestart /quiet
wusa [...]

Read this post

Hey look-it, I'm on the TV!

So it seems my face has now graced (or disgraced, perhaps) North American television. Some folks from DMG Productions were in the lab a while ago gathering footage for a segment on IQC for Innovations with Ed Begley, Jr. Though my supervisor fielded the actual spoken material, you can spot me in the background of various “action shots” discussing clearly very important things™ with students and colleagues.

Here's the segment in question, first broadcast on Discovery Channel, May 25, 2015.

I've similarly been on Australian TV before, so that makes two continents that have had to deal with my mug on air.

Read this post

Published in Physical Review A: Using weak values to experimentally determine “negative probabilities” in a two-photon state with Bell correlations

It's well known that quantum entangled systems can exhibit correlations beyond those that could be seen if Nature worked in intuitive “classical” ways. However, as Richard Feynman noted, a classical theory can reproduce such correlations if we invoke negative probabilities to describe the systems' properties. What he did not do was specify how these negative probabilities ought to be chosen; without further justification, infinitely many different combinations could be chosen that satisfy the relevant equations.

The concept of negative probabilities seems nonsensical because they cannot actually be observed; indeed, they cannot be observed even within the framework of quantum theory, due to the effects of measurement back-action. Here, we show how they can instead be inferred through the use of weak measurements, in which a meter is only weakly coupled to the property of interest, thereby avoiding the back-action problem. Each individual weak measurement has high uncertainty, but by measuring many members of a large ensemble, an average can be found that implies a specific set of anomalous (i.e. outside the 0–1 range) probabilities. With an experimental demonstration, we thus give an empirically justified method for choosing the anomalous probabilities that allow the classical model [...]

Read this post

Accessing machines on a home network with sshuttle

You might have noticed that I'm running a little Raspberry Pi, acting as a server for my website as well as some other small server-ish tasks. This machine is actually on my home network, and I also use it as the front face of that network for incoming connections. There are other machines on this network; while they are behind a NAT and so not addressable from the outside world, this is fine most of the time. But on the odd occasion when I'd like to directly address another machine on that network, I have to do so through the Raspberry Pi. Depending on exactly what I'm trying to do, that can be tricky.

I've just discovered sshuttle. It acts similarly to a VPN, using SSH under the hood to transport TCP packets through a server that you specify. The cool thing is that it doesn't require any complicated pre-configuration on the server—just Python. All you have to do is run it on the client machine you want to connect from. Nifty!

Debian has it packaged: run sudo aptitude install sshuttle. The binary itself apparently installs under /usr/sbin, which doesn't seem to be in a [...]
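Assuming the issue is simply that /usr/sbin is absent from a regular user's PATH (a common default), a small shell snippet can fix that for the session; the hostname and subnet in the commented invocation are placeholders for illustration, not my actual setup:

```shell
# Append /usr/sbin to PATH so `sshuttle` resolves without a full path,
# but only if it isn't there already.
case ":$PATH:" in
    *:/usr/sbin:*) ;;                  # already present; nothing to do
    *) PATH="$PATH:/usr/sbin" ;;
esac
export PATH

# Then forward the home subnet through the Pi (placeholder host/subnet):
#   sshuttle -r pi@home.example.org 192.168.1.0/24
```

Putting the `case` guard in ~/.profile makes the fix permanent without duplicating PATH entries on each login.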

Read this post

Published in Nature Photonics: Experimental three-photon quantum nonlocality under strict locality conditions

Quantum mechanics implies properties of Nature that clash with our intuitive notions of how the universe ought to work. Testing these properties (to see if quantum mechanics is, indeed, true) involves generating entangled quantum states of two or more particles and measuring them under a number of strict conditions. While work is progressing towards meeting all of these conditions using only two particles, no one has yet met even one of these conditions for more than two particles, which is considerably more difficult experimentally. Here, we conduct an experiment in which we meet two of the most challenging conditions, namely measurement locality and freedom of choice, while generating triplet entangled photon states. We demonstrate that quantum mechanics wins out over intuition, measuring a violation of Mermin's inequality nine standard deviations beyond the classical bound.

C. Erven, E. Meyer-Scott, K. Fisher, J. Lavoie, B. L. Higgins, Z. Yan, C. J. Pugh, J.-P. Bourgoin, R. Prevedel, L. K. Shalm, L. Richards, N. Gigov, R. Laflamme, G. Weihs, T. Jennewein, and K. J. Resch
Nature Photonics 8, 292–296 (2014)

Also check out the News and Views article in the same issue, written [...]

Read this post

On backups/redundancy

Recent events gave me cause to consider my personal data backup and redundancy strategy for my Debian installs. Or, more accurately, they prompted me to amend my half-baked and semi-implemented existing approach so that I won't lose data, or have to reconfigure things from memory or from scratch, in the event of a hard disk failure.

My present “backup” approach is really somewhere between a time-limited backup and redundant storage. Essentially, I use Unison to synchronize my home folder (with some sub-folders ignored, e.g. particular git repositories and the config and thumbnail cache folders) between my desktop and my netbook. I have to run Unison manually, so I end up synchronizing my data every week, give or take. This effectively kills two birds with one stone: I get local copies of my important data, as up-to-date as my last sync, for when I'm on the road with my netbook, and the delayed redundancy gives me some protection should anything go wrong with either hard drive.
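As a sketch, a Unison profile implementing this kind of sync might look like the following; the roots and ignore paths here are illustrative stand-ins, not my actual configuration:

```
# ~/.unison/home.prf — an illustrative profile
root = /home/user
root = ssh://netbook//home/user
ignore = Path .cache
ignore = Path .thumbnails
ignore = Path projects/big-repo/.git
```

Running `unison home -batch` then synchronizes the two roots non-interactively, which is handy for the sort of manual-but-regular routine described above.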

I'll be clear: what I'm doing here is something not quite a backup (safe storage of historical versions of files) and not quite proper redundancy (up-to-date duplication of data). The frequency of synchronization is the key [...]

Read this post
