Brian Lovin
/
Hacker News
Daily Digest email

Get the top HN stories in your inbox every day.

dsign

Oh but it will get worse. Legislation to force companies to install survtech in their devices/apps is already being pushed left and right. We are still screaming a little about it, but I think it's a matter of time before it gets normalized and the state goes for the next level, which will be to prosecute individuals who try to evade the surveillance net. The recent case with GrapheneOS[^1], while still far from being an example of that, is sufficient to inspire some legislators...

[^1] https://www.androidauthority.com/google-pixel-organized-crim...

microtonal

That's why we need to get as many people on surveillance-free devices as quickly as possible. 400K users [1] may be easy to ignore or make suspect, 4M is a little harder, 40M is a serious blip on the radar, 400M is a major force (one can dream).

If you do not like surveillance capitalism (which enables government surveillance), get a compatible phone and install GrapheneOS now. Help family and friends get set up tomorrow. Make it a force too large to ignore before the legislation arrives (legislation is somewhat slow, so there is a window of opportunity).

[1] https://x.com/GrapheneOS/status/2047321144601071673

coldtea

People have been bending over for policies and changes with way more impact to their everyday lives and livelihoods, and they'll rise up for this? That's daydreaming.

microtonal

Maybe not, but I have seen a quick rise in interest in GrapheneOS and other AOSP-based alternatives among tech people. I think the current state of the US has done a lot to make people more motivated.

dyauspitr

I couldn't care less about surveillance on my phone or the internet. I can just stop using the internet for anything besides the necessities. It’s physical, IRL surveillance that is a nightmare. You can’t escape it; there’s no way to opt out, and consent doesn’t matter.

nextaccountic

Phone surveillance is IRL surveillance. That's because your phone connects to cell towers that exist in the real world, and they can snitch on your precise location in real time.

Consent never mattered btw.

coldtea

"I don't care for surveillance on devices/internet because I can always cut myself off from the thing 8 billion people use, which has become absolutely essential, and often mandated or strongly pushed, for work, banking, and even government interactions"

i_love_retros

So you do care somewhat about surveillance on your devices then!

dyauspitr

Where did I say that? The necessities? All the necessities are browser based. I can do that on Kali if I absolutely needed to.

frankharv

I agree.

Even if you walk by a FLOCK camera you are catalogued.

Sickening bravado of these privacy-stealing folk.

2ndorderthought

The time to resist these policies and technologies was 2-5 years ago.

The future, safety, rights, and freedom of every single person in the US are currently at stake. There is no more time left to wait and see how things play out.

gmuslera

More like 13 years ago, when the Snowden revelations made the reach of this public. Nothing was done, and it kept expanding into today's state of things. No one should be surprised.

And beyond the domestic surveillance, which drew some complaints back then, there is foreign surveillance and intervention, which saw no slowdown at the time, so you can figure out where that stands today. At least Americans have some say in their government and policies, but for the rest of the world it is just the new normal.

htx80nerd

I'm old enough to remember AT&T room 641A

znpy

> More like 13 years ago, when Snowden revelations made the reach of this public. Nothing was done, and this kept expanding till today state of things. No one should be surprised.

Yeah, obama was president at the time.

A lot of fanfare and then nothing happened.

People were also being deported by ICE, in larger quantities, but that didn’t even make the news.

It’s always “weird” when the same action gets a different connotation depending on who’s president…

majormajor

A couple things you're ignoring or underplaying for some stupid political-score-keeping reason:

1) Many were upset, especially here and in the general tech media, with the Snowden information. Is "a lot of fanfare and then nothing happened" worse to you than "no fanfare and nothing happened"? The fanfare regardless of who was in office on that info is telling, there, no?

2) Many of those policies went back well before Obama

Not sure why you're trying to deflect "we should be fighting this" into "Obama bad, actually!" when the evidence is very clear that it crosses parties, has crossed parties for decades, and will almost certainly continue to if the status quo is maintained.

Possibly because you don't want to fight it?

majormajor

It's an explicit policy of the Trump admin to not just increase the volume of deportations (regardless of if they've hit their goals yet), but also increase the speed and disruptiveness of them (picking people up when they're at their regularly scheduled "trying to do it the right way" appointments, for instance), and reduce judicial process and oversight.

It's very intentionally NOT the same action, because they're looking for more red meat for the base to distract from any number of other failed promises on affordability, jobs, etc. They've really been unable to do much there other than, at best, "stay on or close to the trend line from 2023 onward, as the Covid-induced supply-chain bullwhip and demand-whiplash effects started to recede."

Have you considered that one can protest against those changes independently of doing math on how many happened in 2013? Or that they might also take into account certain notable other actions on immigration taken by the Obama administration as a balancing factor?

If anything, doesn't that suggest that the Trump admin's moves to bypass legal safeguards are unnecessary and are just increasing the militarization of the federal government for nothing?

coldtea

>People were also being deported by ICE, in larger quantities, but that didn’t even make the news.

That's because most news outlets are (or were) partisan with a liberal bias.

Just like a republican bias makes you miss the fact that, even if Obama was deporting in larger quantities, Trump has moved the Overton window on what ICE is allowed to do, how blatantly they can do it, and what they get away with even when it's still illegal.

wat10000

Ever think that maybe it’s not the deportations that are the problem, but the murders and other human rights abuses?

And the fact that there was a lot of fanfare over Snowden rather undermines your point. People did make a big deal about it. It didn’t go anywhere because at the end of the day, the establishment on both sides is in favor of that stuff. It didn’t get any more action after Obama left office.

an0malous

Flock is a YC company. I don’t think the resistance will be organized on HN in spite of its ostensibly hacker ethos

gehwartzen

When it comes to Flock in particular, I’ve been seeing a lot more resistance and pushback in local Reddit communities. At least in my city’s sub I see posts regarding anti-Flock messaging or related activities at least once a week now.

davidw

Yeah, where I live, one guy was like "You know what, I've had it". He then started organizing within the community and got a big crowd to show up at a city council meeting, and we ended up getting rid of the Flock cameras. Yay!

2ndorderthought

There are enough normal people here it is still worth trying.

notfromhere

Well yeah, YC is a tech incubator plugged pretty deep into the SV hivemind, and its leading figures seem to have decided that fascism is a better alternative than any kind of regulation on their activities.

michaelsshaw

[flagged]

tomhow

Just a few days ago:

“Regardless, it's acceptable here to mock climate deniers, capitalists (landlords, CEOs, Billionaires), SUV or truck drivers, religious fundamentalists, various flavors of conservatives…” [1]

Both these positions are examples of an effect that dang called the “notice dislike bias” [2].

From reading the discussions here every day for years, there’s more criticism of Flock, Musk and major tech figures/companies than there is support.

Regardless of that, it’s not cool to sneer at things on HN, including at the rest of the community. This is a site for curious conversation, not intellectual strutting and preening. Curiosity and humility are intrinsically linked. Not everyone plays chess, but anyone can still benefit from learning about its concepts, even if you feel it’s beneath you. I’ve been in tech for many years and had never heard that knowing all about chess was inherent to the “hacker ethos”.

[1] https://news.ycombinator.com/item?id=47932456

[2] https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que...

b00ty4breakfast

I can't speak to years past but HN is an off-shoot of a VC firm so it's not too surprising that the culture here is based in the California Ideology milieu.

spacechild1

The HN community is not a monolithic entity. Yes, there are many libertarian SV folks, but there are also plenty of people (like myself) who despise that culture and push back on a regular basis.

estimator7292

Lot of faux-intellectual buzzwords in there, bud

JumpCrisscross

> time to resist against these policies and technologies was 2-5 years ago

The time to resist the next crop of policies and technologies is today.

And I disagree that the ground was more fertile for action during Covid. The silver lining to the AI companies’ PR and political ineptitude is that there is widespread, bipartisan pushback against tech of all stripes.

Ekaros

The time was 30 years ago. Back then anyone responsible should have been properly dealt with.

coldtea

And yet people will wait, and things will play out in the worst scenario.

pessimizer

It's been a lot longer than that. You may have forgotten to include the encroachments that you supported; you seem only to have started with the disaster under Biden.

For example, I never hear about how hard librarians* fought against "National Security Letters" after 9/11. How quaint it now seems that people once thought there should be a fundamental right to read freely, without disclosing what you read to anyone, especially governments.

Technology has only made this cheap to do at scale.

For people who may not be familiar, the government insisted on the right to go into libraries and get a list of the books you've read. Hell, it's basically just a "pen register**," and the culture not only gave up on resisting that this data be considered private, but forgot why anyone would have ever thought that way.

Now we're arguing about forced digital attestation, but we're barely arguing about digital ID anymore ("of course" we need that), or even remembering that most people were against federal identification in the US. Federal identification failed at every point to gain any support; it was pushed hard and failed during the Clinton admin, finally passed with everything else of this nature after 9/11, and then was resisted and ignored enough to force deadlines to be pushed farther and farther back - it's been 30 years of RealID at this point.

There's no evidence that the population ever supported federal ID. The idea was forced upon them, and they just waited a generation for people to forget that the government once didn't even know or care that many people existed. 30 years from now, it will probably be weird trivia that the census was done anonymously: "You mean you didn't have to sign it under penalty of perjury? What would be the point of the data if you didn't know who it belonged to?"

In 5 days, May 27, 2026, you'll have to pay a fee of $45 in order to get on a plane for not having Real ID.

It's so obvious that these claims of necessity are always just excuses for a power grab. British Labour, who spent decades supporting huge amounts of immigration and calling everyone who thought it was too much a racist, now, like Trump, uses the prevention of illegal immigration as a reason to impose digital ID on everyone. They're xenophobes when it comes to tracking everyone's movements, but xenophiles when they needed to lower wages. Vote Tory, then! Nope, they supported and oversaw every element of all of this. None of this stuff ever sees a ballot.

[*] https://www.library.illinois.edu/ala/2024/10/07/15-years-of-...

[**] https://en.wikipedia.org/wiki/Smith_v._Maryland

lotsofpulp

Enemy of the State came out in 1998, and the capabilities in that movie were not far-fetched, just lacking in bandwidth.

dw_arthur

A surveillance state was always inevitable once wireless networking, GPS, and cameras were ubiquitous. If you say this isn't true, show me anywhere in the world with these technologies that is not headed down this path.

notfromhere

It's inevitable if you do nothing to organize politically against it.

senexes

This makes for nice political slogans, but Hank Asher had the entire state of Florida's DMV records in the early 90s and we did nothing. Then we did nothing after 9/11 and the Patriot Act. We did nothing between 9/11 and Snowden. We did nothing after Snowden. We have literally done nothing in 35 years, but now is the time to start? Snowden would probably be in political office in a society that had the will to do something about this.

At a deeper level, I think people would need to care more about the outcomes and higher-order effects of political decisions and not just the emotional weight of political slogans. The fundamental problem is that that is not the society we live in.

bigbadfeline

Many other reasons to do it too.

htx80nerd

>"if you do nothing to organize politically against it"

how does one politically organize against a billion dollar industry which is friends with, and donates to, the ruling class?

they do whatever they want and we just post about it online and click 'like' or post emojis.

bigbadfeline

> how does one politically organize against a billion dollar industry which is friends with, and donates to, the ruling class?

You're putting the cart before the horse here - the ruling class is ruling because it's organized. Organization comes first; the presence of other organizations, be they ruling or not, has little bearing on the process.

> we just post about it online and click 'like' or post emojis.

That's what you do without organization. It still helps though, getting to the truth isn't easy these days.

laughing_man

It was really the tiny, inexpensive cameras and wireless networks. Cameras are everywhere now. They're so cheap they're almost free, and it doesn't require an expert to install them.

ch4s3

It’s inevitable that some country would do it, but not inevitable that any given nation would do so, except maybe the CCP.

uoaei

Europe is, compared to the US, doing a lot more for protection of private data. That includes strict guardrails on what data can be collected and how it is used.

Secret courts still exist but the phenomenon of random Flock employees spying on children in locker rooms at gyms is so much harder to get away with in a system with a modicum of decency.

Chat control was actually shot down, and that was the UK not Europe (anymore).

Laws are different in different places. The world is not composed of America and other-Americas.

coffeeling

Chat control was shot down, and will be proposed again. And again. And again. Until the people/governments vote correctly, and then dismantling it won't be up for a constant vote.

xboxnolifes

Saying something was shot down isn't that strong of an argument. The US government has proposed and shot down surveillance laws hundreds of times, until one finally passes.

uoaei

Ok, sure. You want more words to say the same thing, here you are.

It got vociferous support from the highest levels of government even though the deception ("protect kids!") was so blatant and transparent, and it wasn't until a legion of privacy and in particular tech-literate advocates raised concerns in mass media together with an awareness campaign about the dangers of unchecked surveillance structures that it was finally... shot down.

mon_

Chat Control was proposed and rejected in the European Union

uoaei

You're right, I mixed up the names; in the UK they called it the Online Safety Act.

zug_zug

Uh, France? It annoys me when people say this stuff is "inevitable." No, many countries have forcibly "reshaped" their governments (the French Revolution, the American Revolution, etc.), and nobody has any basis for saying it won't happen again, perhaps many more times.

TFNA

The French Revolution is largely regarded as a tragedy. It led first to the Terror, and after that to a series of new monarchies over the following century.

Revolutions in most countries have generally replaced one faction of the ruling class with a competing faction of the ruling class, with little actual change for the people.

bigbadfeline

> It annoys me when people say this stuff is "inevitable."

"Resistance is futile" is an old slogan of them Borgs.

thaumasiotes

A scene from the Chinese 1980s period drama "Like a Flowing River 2":

Lei Dongbao, party secretary of a small village, is courting the owner of a restaurant in a nearby city. He persuades her to let him care for her young son over the weekend.

As he's heading back to his village on his motorcycle with the boy seated behind him, he drives by some women resting in the shade by the side of the road. One of them remarks to another, "Why does the secretary have a child?"

By the time he arrives at his office, all of his subordinates - and one of their wives - have turned out to meet him and say hello to the child.

https://www.basicinstructions.net/basic-instructions/2019/9/...

> Citizens, on the other hand, don’t like red light cameras because they don’t want to be fined. They complain that the cameras are an invasion of their privacy. I don’t buy that because I grew up in a small town, and as such I understand that privacy is a myth.

JumpCrisscross

What’s the fix? What’s a simple rule change that would, at the very least, take these data out of law enforcement’s hands outside the most-necessary situations?

2ndorderthought

You may not realize it, but this isn't even about law enforcement. It's also about tech companies having the data. What they will do with it, who they will sell or leak it to.

It's about the amount of data. It's about what it can be used for by military-adjacent organizations under a fascist regime. Whether you think the US is headed toward fascism or not, what if it did? That's the point.

JumpCrisscross

> this isn't even about law enforcement. It's also about tech companies having the data

One is a clear and present danger. The other is a hypothetical danger. Both deserve being addressed. But if only one is going to get political capital, it should be the first.

(I've worked on technology privacy issues. My takeaway is the public is broadly fine with the tradeoff. Folks in tech are not. But folks in tech with strong views on privacy are politically useless due to a combination of self-defeating laziness and nihilism.)

bigyabai

Hypothetical my ass. It's only hypothetical in the same way the Sword of Damocles could "hypothetically" kill someone. Every spook in the three-letter agencies has known this for decades, and now the lawful intercept weapon has been turned on them with Salt Typhoon. How anyone can call the threat "hypothetical" is beyond wishcasting and downright dishonest.

You cannot change the rules to fix this. You can only change your personal habits. I wish it wasn't like this, but none of those agencies can be held accountable by design.

microtonal

> You may not realize it but this isn't even about law enforcement. It's also about tech companies having the data.

This. The lesson of the past decades is: if some organization has the data, eventually it becomes too attractive not to (ab)use it. Even Apple, which sold itself as a privacy-first company, is slowly adding more and more ads. Squeezing out more profits is just too attractive with the pile of data they are sitting on. Similarly, bad governments will require access to the data if they can.

Employees inside companies should push back against collection of data as much as possible (the GDPR helps a lot in Europe). If you do not have the data, you cannot use it in a user-hostile way in the future, and governments cannot request data that you do not have. If you have to store data, go for end-to-end encryption.

Citizens should try to escape the Apple/Google duopoly (e.g. by installing GrapheneOS), block trackers, and only install the necessary apps (no app = no easy tracking). For apps that you do need, revoke as many sandbox privileges as possible.

_factor

An open-source, community-driven surveillance network that alerts the community when it is accessed by a select list of “trusted” governing officials. Clearly outlined access rules that are policy-driven, technically controlled, and auditable.

Sure Flock, we buy your safety pitch. We just don’t trust you.

JumpCrisscross

> surveillance network that alerts the community when it is accessed by a select list of “trusted” governing officials

This is the worst of all worlds. Actual criminal investigations get thwarted or the reporting requirement gets diluted to the point of being useless (“someone looked for something today!”). And a burden of vigilance shifted onto the public.

_factor

And it will be public, and someone can be held accountable. Heck, put an AI in it that scans for a list of items and reports when it sees them. An actual investigation will face public pressure over accessing data. Lax policies will show up as increased usage.

Funding the police is a burden of vigilance already on taxpayers. We're already approaching the worst of worlds. Your perspective just points to human organizations being unsustainable, not to this concept in particular.
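The auditable, technically controlled access log proposed above can be sketched as a tamper-evident hash chain: each audit record commits to the record before it, so any retroactive edit is detectable on verification. This is only an illustration of the technique; the entry fields (`who`, `query`) and function names are hypothetical, not any real system's schema.

```python
import hashlib
import json

def append_entry(log, entry):
    """Append an audit record whose hash commits to the previous record."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    payload = json.dumps({"entry": entry, "prev": prev_hash}, sort_keys=True)
    log.append({
        "entry": entry,
        "prev": prev_hash,
        "hash": hashlib.sha256(payload.encode()).hexdigest(),
    })
    return log

def verify(log):
    """Recompute every link; a tampered record breaks the chain."""
    prev_hash = "0" * 64
    for record in log:
        payload = json.dumps({"entry": record["entry"], "prev": prev_hash},
                             sort_keys=True)
        if (record["prev"] != prev_hash
                or record["hash"] != hashlib.sha256(payload.encode()).hexdigest()):
            return False
        prev_hash = record["hash"]
    return True

log = []
append_entry(log, {"who": "officer-17", "query": "plate ABC123"})
append_entry(log, {"who": "officer-09", "query": "plate XYZ789"})
assert verify(log)

log[0]["entry"]["query"] = "plate DEF456"  # retroactive edit...
assert not verify(log)                      # ...is detected
```

A real deployment would also publish the chain head (or anchor it in an external transparency log) so the log keeper cannot silently truncate the history.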

gzread

None, because they are above the rules. You need actual enforcement.

Or the other guy's community network idea but it would have to also publish the realtime activities and whereabouts of all politicians who voted against making this illegal.

Much like the law that stopped video rental companies from disclosing what their customers were renting, which passed after some politicians had their video rental histories leaked.

JumpCrisscross

> they are above the rules

They’re above the rules for a political cycle because we’re shifting to a system of spoils. That doesn’t change that everything they’re doing right now is legal. (Outside ICE. They’re a warren of criminality right now.)

thrance

No "simple rule", I'm afraid. Push money out of politics and aggressively redistribute wealth to curb inequalities, that's the only way to weaken the reactionary and authoritarian ideals currently flourishing. Until then, surveillance is a given.

mindslight

The straightforward broad brush fix is a US port of the GDPR. Make mass surveillance commercially unlucrative, and most of the data currently available to the government won't be collected in the first place. Furthermore, it's a basic line in the sand that gives individuals an idea that privacy is an actionable right, not just something to powerlessly complain about.

That this culture shift would need time to trickle down into positive bans on surveillance performed by the government (eg Flock), or requirements for audit trails on government use of commercial data that still gets collected, shows how far behind we are.

(I use the word "port" to indicate that we need to avoid letting lobbyists stuff it full of loopholes and regulatory capture the way everything else is. Heck I think we could do worse than copying the text verbatim and letting the courts sort it out)

mnicky

Yeah. I really like the main idea behind GDPR, which is that data containing PII is the property of the person it describes, not of the companies that process it to provide services.

This means that I, as the owner of my data, can refuse to provide it for some use cases, request its deletion, etc. It’s my data after all.

estimator7292

The older and more jaded I get, the more I think that the only way to fix this mess before we all die of climate change is to dump the entire US government off a cliff and write a new constitution.

As the founding fathers intended.

JumpCrisscross

> the only way to fix this mess before we all die of climate change is to dump the entire US government off a cliff and write a new constitution

We don't have enough public consensus on major questions, in my opinion, to make this a fruitful endeavour.

One thing we need is a political movement to push for Constitutional amendments. My five are, in decreasing order of priority, (1) multi-member Congressional districts, (2) striking the pardon power, (3) abolishing the electoral college and creating a referendum requirement for major legislation, (4) changing the first sentence of Article II to "the President shall execute the laws of the United States," and (5) permitting the Congress to charter independent agencies for up to 20 years.

convolvatron

I think we should be starting this discussion! How about fixing the commerce clause? Gerrymandering?

vostrocity

One idea I haven't seen much discussion on is "provably beneficial surveillance" [1], which builds off of Nick Bostrom's vulnerable world hypothesis. It seems like the best path forward.

>We can turn that conventional wisdom on its head, by reframing it as a question: is it possible to do surveillance and consequent policing in a way that is (a) compatible with or enhances liberal values, i.e., improving the welfare of all, except those undermining the common good; and also (b) sufficient to prevent catastrophic threats to society? I call this possibility Provably Beneficial Surveillance. It's a concept expanding on an old tradition of ideas, including search warrants, due process, habeas corpus, and Madisonian separation of powers, all of which help improve the balance of power between institutions and individuals. In particular, all those ideas help enable surveillance in service of safety, while also taking steps to prevent abuses of that power.

1. https://michaelnotebook.com/optimism/index.html

bigyabai

Salt Typhoon is the refutation to this. Building and enforcing a "lawful intercept" system formally codifies an exploit chain for your adversaries to use. If you don't want your politicians and dignitaries being blackmailed by foreign opposition, don't even consider this type of system for widespread development.

Let America be the canary in this particularly toxic coal mine, and refuse similar systems wherever you are locally.

hackable_sand

No discussion because it's a bad idea

Try a little harder. You got this

2ndorderthought

Nope. That's not how any of this is trending at all. Being optimistic is good for getting through tough times, sometimes. It might help people sleep at night, but sleeping our way into technofascism won't make it any better for us or our children.

vostrocity

Did you have a better path forward?

I point to Michael Nielsen's commentary on Vulnerable World Hypothesis [1] again:

>do you think inexpensive, easy-to-follow recipes for building catastrophic technologies will one day be found, given sufficient understanding of science and technology?

With every increase in technology and science, the probability increases, and as a result, society will necessitate ever more surveillance. The reason provably beneficial surveillance is important to discuss is that we need a careful middle path between totalitarianism and outright catastrophe. It is the opposite of "sleeping our way" into technofascism.

1. https://michaelnotebook.com/vwh/index.html

2ndorderthought

I disagree on a fundamental level. Crime is down. It's been trending down since the 90s. The 90s to the early 2000s ushered in more technological change than the century prior as far as the common person is concerned.

There's no need for mass surveillance and there never will be.

"Those who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety.", spoken by someone who knew better and just so happened to help found this country.

kmeisthax

"Provably beneficial surveillance" is the wrong framing.

What you're trying to say is that the harms of surveillance are diminished when the underlying power is distributed enough that cops have to justify themselves in order to access the surveillance powers. That's why we have a 4th Amendment that demands cops get warrants before doing searches and seizures. Think of the difference between a store with a security camera that records to a local network DVR, and the same store but they bought some Ring cameras and send it to Amazon's servers. The former is the necessary amount of surveillance to prove a crime happened, the latter is just enabling abuse.

vostrocity

I think it is a new framing that merits discussion.

Example case is the school shooter in Canada that OpenAI knew about but chose not to warn authorities of (presumably because OpenAI wants to balance safety and privacy).

OpenAI (or any other big tech) has extreme concentration of power and knows more about its users than any government authority.

At what point should OpenAI alert authorities?

I would much rather have "provably beneficial surveillance" than OpenAI having an arbitrary black box policy or for government authority to have direct backdoor to all OpenAI data.

adrian_b

All the known history of humans is evidence against the possibility of existence of "beneficial surveillance".

This is a utopian idea of the same kind as the idea of theoretical communism.

The communist theory argued that, because the owners of assets can use their power in nefarious ways against others, this could easily be solved by dispossessing them of their assets and transforming all such private assets into common property owned by all the people. Then all assets would be used for the welfare of the entire society.

The fallacy of this theory was that when something belongs to all people it is impossible for all people to manage it directly. So there must be a layer of relatively few middlemen who manage the assets directly.

In all the communist societies, instead of managing the assets for the common good, those middlemen succeeded in becoming the de facto owners of the assets, despite not being their owners de jure. And then they managed the assets according to their personal interests, like any capitalist billionaire.

The only difference was that the communist elite was much less secure in their positions than rich capitalists, because not being the legal owners of a company or of other such valuable assets meant that they could lose their privileges at any time if their boss in the communist party hierarchy no longer liked them and sent them to an inferior position.

This hierarchical dependence ensured that the communist elite had to obey more or less whatever the supreme leader ordered. Except for this obedience, there was no real difference between a communist economy and the extreme stage of monopolistic capitalism, despite what the naive theory of communism hoped to achieve by nationalizing everything of value.

Similarly, I see no hope for a theory of "beneficial surveillance". Such beneficial surveillance could exist only if it were controlled by well-meaning people. But that will never happen; as in practical communism, some of the worst people will be those who succeed in controlling it.

vostrocity

I'm intrigued by Michael Nielsen's thoughts on cryptography applied to synthetic biology risk.

I'll quote his notes on using cryptography to maintain a balance of privacy and safety:

>To help address such concerns, it's been proposed that synthesis screening should use cryptographic ideas to help preserve customer privacy, while still ensuring safety. Let me mention three such ideas, some of which have already been implemented in a prototype system built by the SecureDNA collaboration. The first idea is that the screening itself should be done with an encrypted version of the sequence data, to help preserve customer privacy. The synthesis step would still require the raw sequence data, but such encryption would at least prevent centralized screening services from learning the sequence being synthesized. Second, as mentioned above, screening for exact matches and homologous sequences won't catch everything, especially as de novo design becomes possible. So it's also been proposed that an encrypted form of the sequence data should be logged and kept after synthesis. That data could not routinely be read by the synthesis company or screening service. However, suppose some later event occurs – say, some new pandemic agent is found in the wild. Then it should be possible to check whether that agent matches anything in the encrypted synthesis records. In the event such a check was needed, a third party authority could provide a kind of "search warrant" (a private key of some sort) to decrypt the data, and identify the responsible party. The third idea is to use cryptography to ensure the screening list remains private, and can even be updated privately by trusted third parties, without anyone else learning the contents of the update. Taken together, these three ideas would help preserve the balance of power between customers and the synthesis companies, while contributing to public safety and enabling imaginative new synthesis work to be done.

>Indeed, cryptographers are so clever that they've devised many techniques you might a priori deem impossible, or not even consider at all. Ideas like zero knowledge proofs, homomorphic encryption, and secret sharing are remarkable. As software (and AI) eats the world, cryptography will increasingly define the boundaries of law.
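The ideas in the quote above can be sketched as a toy in Python (all names and parameters here are invented for illustration: salted hashing stands in for real encrypted matching, and a 2-of-2 XOR split stands in for proper threshold cryptography or secret sharing):

```python
import hashlib
import secrets

# Idea 1 (simplified): screen hashed k-mers of an order against a hashed
# hazard list, so the screener never sees the raw hazard sequences.
def screen(sequence: str, hazard_hashes: set[str], k: int = 12) -> bool:
    """Return True if any length-k window of the sequence matches the list."""
    return any(
        hashlib.sha256(sequence[i:i + k].encode()).hexdigest() in hazard_hashes
        for i in range(len(sequence) - k + 1)
    )

# Idea 2 (simplified): split the key protecting logged records between the
# synthesis company and a third-party authority. Neither share alone reveals
# anything; decryption requires both, i.e. a "search warrant" event.
def split_key(key: bytes) -> tuple[bytes, bytes]:
    share_a = secrets.token_bytes(len(key))
    share_b = bytes(a ^ b for a, b in zip(share_a, key))
    return share_a, share_b

def recover_key(share_a: bytes, share_b: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(share_a, share_b))

# Hypothetical hazard entry and order.
hazards = {hashlib.sha256(b"ATGCGTACGTTA").hexdigest()}
assert screen("CCATGCGTACGTTAGG", hazards)      # contains the hazard k-mer
assert not screen("CCCCCCCCCCCCCCCC", hazards)  # benign order passes

record_key = secrets.token_bytes(32)
company_share, authority_share = split_key(record_key)
assert recover_key(company_share, authority_share) == record_key
```

A real system (e.g. SecureDNA) uses far stronger machinery, including oblivious matching so even the hazard list stays private during the comparison, but the trust split is the same shape.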

vostrocity

You mentioned communism, and I'll add to that since I've lived under communism. It's a great idea in theory that doesn't work in practice because of human limitations.

It doesn't work, because A) the reason you said: government officials favor themselves, and B) the knowledge problem: the economy is far too complex for a small group of officials to plan what everyone else should be doing.

An interesting idea emerging now is AI-moderated socialism. If A) the AI can be trusted not to favor itself, and B) the AI has perfect knowledge of each human (our needs, what we're good at, etc.), I can imagine AI-moderated socialism working.

An ideal future I can imagine is a world with many AI-moderated polities, and humans have freedom to move between them. AI-moderated polities share some global standards on safety, trade, and conflict resolution but otherwise have differing policies so humans have the freedom to find the one that they most prefer.

redanddead

This is a global phenomenon, not just the US. It's also accelerating, because precedent in one Commonwealth or EU country spreads very quickly. It feels like lawmaking in general (legislation and regulation) is accelerating globally.

givemeethekeys

Fear sells. Everyone is afraid of getting sued and being denied insurance. The answer is cameras!

Dividing people to vilify each other over race, religion, gender, ethnicity and even politics is incredibly profitable. Once they're afraid of their neighbors, they'll happily pay someone to protect them at every turn.

juliusceasar

Somehow they seem to miss the criminals at the top...

triage8004

Let's just stop with the illegal data mining of Americans, okay?

cmrx64

And soon from space? A radio engineering breakdown of Starlink radar capabilities; it's a pretty impressive bird if you were designing it only for that: https://youtu.be/jbp3kdJZ1_A

caycep

you know it's bad when even ol' Rupert is worried
