Posts Tagged ‘privacy’

Quantum computers need vastly fewer resources than thought to break vital encryption – Ars Technica

April 19, 2026

thought ZK-proof was interesting…

https://arstechnica.com/security/2026/03/new-quantum-computing-advances-heighten-threat-to-elliptic-curve-cryptosystems/

QT:{{”
In a move that’s turning heads in security circles, Google isn’t releasing the algorithmic improvements that make this achievement possible. Instead, the researchers released a zero-knowledge proof that mathematically proves the existence of the algorithmic
enhancement without disclosing it.

“The escalating risk that detailed cryptanalytic blueprints could be weaponized by adversarial actors necessitates a shift in disclosure practices,” the authors explained. “Accordingly, we believe it is now a matter of public responsibility to share refined resource estimates while withholding the precise mechanics of the underlying attacks.” The researchers, who said they consulted with the US government in forging the new policy, went on to say that “progress in quantum computing has reached the stage where it is prudent to stop publishing details of improved quantum cryptanalysis to avoid misuse.”
“}}

Advanced Anti-Detect Browser for Web Scraping and Multiple Accounts Managing

April 6, 2026

How to Be Anonymous on Telegram: 2025 Essential Tips & Tools for Enhanced Privacy
Want to use Telegram anonymously? This guide provides comprehensive tips from registration to daily use, including virtual phone numbers, privacy settings, VPN/proxies, and recommends Nstbrowser to help you

QT:{{”
Conclusion: Avoid using your real phone number; opt for reliable virtual number services.

This is the first and most crucial step to achieving Telegram anonymity. Do not use your real phone number to register a Telegram account. You can choose from the following methods to obtain a virtual phone number:

Paid Virtual Number Services: It is recommended to use reputable, stable paid services such as Google Voice, TextNow (available in some regions), Hushed, or Burner. These services typically offer one-time or long-term virtual numbers that can receive SMS verification codes. Paid services are generally more reliable, and the risk of numbers being abused is lower.
“}}

Nstbrowser – Advanced Anti-Detect Browser for Web Scraping and Multiple Accounts Managing
https://www.nstbrowser.io/en/blog/be-anonymous-on-telegram

Programming Differential Privacy

April 5, 2026

https://programming-dp.com/

Some snippets I liked….

QT:{{”
To implement a function to check whether a dataframe satisfies k-Anonymity, we loop over the rows; for each row, we query the dataframe to see how many rows match its values for the quasi-identifiers. If the number of rows in any group is less than k, the dataframe does not satisfy k-Anonymity for that value of k, and we return False. Note that in this simple definition, we consider all columns to contain quasi-identifiers; to limit our check to a subset of all columns, we would need to replace the df.columns expression with something else.

A function which satisfies differential privacy is often called a mechanism. We say that a mechanism F satisfies ε-differential privacy if for all neighboring datasets x and x′, and all possible sets of outputs S (where S refers to “sets of outputs of the mechanism”): Pr[F(x) ∈ S] ≤ e^ε · Pr[F(x′) ∈ S].

The ε parameter in the definition is called the privacy parameter or the privacy budget. ε provides a knob to tune the “amount of privacy” the definition provides. Small values of ε require F to provide very similar outputs when given similar inputs, and therefore provide higher levels of privacy; large values of ε allow less similarity in the outputs, and therefore provide less privacy.

Note that F is typically a randomized function, which has many possible outputs under the same input. Therefore, the probability distribution describing its outputs is not just a point distribution.

In the definition of ε-differential privacy, the probability is taken over the randomness of the algorithm itself—that is, over the internal randomness used by the privacy mechanism to produce an output.

The important implication of this definition is that F’s output will be pretty much the same, with or without the data of any specific individual. In other words, the randomness built into F should be “enough” so that an observed output from F will not reveal which of x or x′ was the input. Imagine that my data is present in x but not in x′. If an adversary can’t determine which of x or x′ was the input to F, then the adversary can’t tell whether or not my data was present in the input – let alone the contents of that data.

According to the Laplace mechanism, for a function f(x) which returns a number, the following definition of F(x) satisfies ε-differential privacy: …
The sensitivity of a function f is the amount f’s output changes when its input changes in a minimal way. Intuitively, for a simple function with one numeric input, we think of the scenario where the input increases or decreases (changes) by 1.

However, more generally, we define it in terms of adjacent dataset inputs.

Two datasets are said to be adjacent if they differ in the data of exactly one individual. This could mean adding or removing a single row (in the add-remove model) or changing a single row (in the substitution model). This notion of adjacency defines the smallest possible difference between datasets, and it forms the basis for reasoning about privacy guarantees.

The global sensitivity of a function f is then generally defined as the maximum amount its output can change between any pair of adjacent input datasets.

Sensitivity is a complex topic, and an integral part of designing differentially private algorithms; we will have much more to say about it later. For now, we will just point out that counting queries always have a sensitivity of 1: if a query counts the number of rows in the dataset with a particular property, and then we modify exactly one row of the dataset, then the query’s output can change by at most 1.

Thus we can achieve differential privacy for our example query by using the Laplace mechanism with sensitivity 1 and an ε of our choosing. For now, let’s pick ε = 0.1. We can sample from the Laplace distribution using NumPy’s random.laplace.

sensitivity = 1
epsilon = 0.1
adult[adult['Age'] >= 40].shape[0] + np.random.laplace(loc=0, scale=sensitivity/epsilon)


Sequential Composition

The first major property of differential privacy is sequential composition [11, 12], which bounds the total privacy cost of releasing multiple results of differentially private mechanisms on the same input data. Formally, the sequential composition theorem for differential privacy says that if F1(x) satisfies ε1-differential privacy and F2(x) satisfies ε2-differential privacy, then the mechanism G(x) = (F1(x), F2(x)), which releases both results, satisfies (ε1 + ε2)-differential privacy.

Sequential composition is a vital property of differential privacy because it enables the design of algorithms that consult the data more than once. Sequential composition is also important when multiple separate analyses are performed on a single dataset, since it allows individuals to bound the total privacy cost they incur by
participating in all of these analyses. The bound on privacy cost given by sequential composition is an upper bound – the actual privacy cost of two particular differentially private releases may be smaller than this, but never larger.
“}}
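The k-Anonymity check described in the quoted snippet can be sketched in a few lines of pandas. This is my own minimal sketch, not the book’s code; `is_k_anonymous` and the toy dataframe are illustrative names I made up.

```python
import pandas as pd

def is_k_anonymous(df, k, quasi_identifiers=None):
    """Check whether every combination of quasi-identifier values
    appears in at least k rows of the dataframe."""
    # By default, treat all columns as quasi-identifiers (as in the quoted text).
    cols = list(quasi_identifiers) if quasi_identifiers else list(df.columns)
    # Group rows by their quasi-identifier values and count each group's size.
    group_sizes = df.groupby(cols).size()
    return bool((group_sizes >= k).all())

df = pd.DataFrame({'age': [34, 34, 62, 62, 62],
                   'zip': [90210, 90210, 10001, 10001, 10001]})
print(is_k_anonymous(df, 2))  # True: every (age, zip) combination occurs at least twice
print(is_k_anonymous(df, 3))  # False: the (34, 90210) group has only 2 rows
```

Grouping is faster than the row-by-row loop the book describes, but it implements the same definition: the dataframe fails k-Anonymity as soon as any quasi-identifier group is smaller than k.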
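The Laplace-mechanism snippet quoted above depends on the book’s `adult` dataframe. Here is a self-contained version of the same idea, with a small synthetic age list standing in for the real data; the names and values are my own.

```python
import numpy as np

rng = np.random.default_rng(0)  # seeded generator for reproducibility

ages = [23, 45, 52, 38, 61, 47, 29, 55]   # stand-in for the adult dataset's Age column
true_count = sum(a >= 40 for a in ages)   # the raw counting query (sensitivity 1)

sensitivity = 1
epsilon = 0.1
# Laplace mechanism: add noise drawn from Lap(sensitivity / epsilon).
noisy_count = true_count + rng.laplace(loc=0, scale=sensitivity / epsilon)

print(true_count)   # 5
print(noisy_count)  # 5 plus Laplace noise of scale 10; small epsilon means a noisy answer
```

Note how small the signal is relative to the noise at ε = 0.1: the noise scale is sensitivity/ε = 10, larger than the count itself, which is exactly the “higher privacy, less accuracy” trade-off the quoted text describes.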

cost of privacy

March 14, 2026

game theory papers

https://www.science.org/doi/10.1126/sciadv.abe9986

Wan, Z., Vorobeychik, Y., Xia, W., Liu, Y., Wooders, M., Guo, J., Yin, Z., Clayton, E. W., Kantarcioglu, M., & Malin, B. A. (2021). Using game theory to thwart multistage privacy intrusions when sharing data. Science Advances, 7(50), eabe9986.
https://doi.org/10.1126/sciadv.abe9986

Guo, J., Clayton, E. W., Kantarcioglu, M., Vorobeychik, Y., Wooders, M., Wan, Z., Yin, Z., & Malin, B. A. (2023). A game theoretic approach to balance privacy risks and familial benefits. Scientific Reports, 13(1), 6932. https://doi.org/10.1038/s41598-023-33177-0

They seem to be more focused on the cost to the attacker

Metabolites and you – People leave molecular wakes that may give away their secrets | Science and technology | The Economist

January 3, 2026

https://www.economist.com/science-and-technology/2020/02/13/people-leave-molecular-wakes-that-may-give-away-their-secrets

Opinion | This Is the 21st-Century Arms Race. Can America Keep Up?

December 10, 2025

The Editorial Board. (2025, December 11). Opinion | This is the 21st-Century arms race. Can America keep up? The New York Times.
https://www.nytimes.com/interactive/2025/12/09/opinion/editorials/us-china-military-ai-tech.html

QT:{{”
Something strange happened at the meeting between President Joe Biden and President Xi Jinping of China in a mansion south of San Francisco on Nov. 15, 2023. After a working lunch, as the two leaders rose to leave, an aide to Mr. Xi signaled to one of the Chinese president’s bodyguards, who approached the table, took a small bottle out of his pocket and quickly sprayed down every surface that Mr. Xi had touched, including what remained of the almond meringue cake on his dessert plate.

The purpose, the Americans concluded, was to remove any trace of Mr. Xi’s DNA that his hosts might collect and exploit. “This is the way they’re thinking,” said an official who attended the meeting, “that you could design a disease that would only affect one person.” …
This year two major companies — OpenAI and Anthropic — warned that if nothing is done, A.I. will soon be able to assist bad actors attempting to create bioweapons. Students at M.I.T. used chatbots to come up with four pandemic pathogens. The A.I. explained how to generate them from synthetic DNA; it suggested companies that were unlikely to screen orders for the DNA; and it recommended that if the students lacked the skills to do all this, they could contact a research organization. This was done in one hour.
“}}

2nd part of a multi-part series

‘Biometric Exit’ Quietly Expands Across U.S. Airports, Unnerving Some – The New York Times

December 7, 2025

https://www.nytimes.com/2025/09/26/travel/airports-biometric-exit-program.html

Arrests in Louvre Heist Show Power of DNA Databases in Solving Crimes – The New York Times

November 9, 2025

https://www.nytimes.com/2025/11/03/world/europe/louvre-heist-dna-databases.html

Protecting Human Genomic Data When Developing Generative Artificial Intelligence Tools and Applications | Grants & Funding

November 8, 2025

https://grants.nih.gov/news-events/nih-extramural-nexus-news/2025/05/protecting-human-genomic-data-when-developing-generative-artificial-intelligence-tools-and-applications
