Publications
(please cite as "Chua, E. Y. S." or "Chua, Eugene Y. S.")
2025
Decoherence, Branching, and the Born Rule for a Mixed-State Everettian Multiverse (joint work with Eddy Keming Chen, Synthese, accepted)
In Everettian quantum mechanics, justifications for the Born rule appeal to self-locating uncertainty or decision theory. Such justifications have focused exclusively on a pure-state Everettian multiverse, represented by a wavefunction. Recent works in quantum foundations suggest that it is viable to consider a mixed-state Everettian multiverse, represented by a (mixed-state) density matrix. Here, we discuss the conceptual foundations for decoherence and branching in a mixed-state multiverse, and extend arguments for the Born rule to this setting. This extended framework provides a unification of `classical' and `quantum' probabilities, and additional theoretical benefits, for the Everettian picture.
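For reference, the standard textbook statement (not itself the paper's argument): in the density-matrix formulation, the Born rule assigns probabilities via the trace,

    \Pr(i) = \mathrm{Tr}(\rho\, P_i),

where P_i is the projector onto outcome i; when \rho = |\psi\rangle\langle\psi| this reduces to the familiar pure-state expression \langle\psi|P_i|\psi\rangle.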
A family of arguments for black hole evaporation relies on conservation laws, defined through symmetries represented by Killing vector fields which exist globally or asymptotically. However, these symmetries often rely on the idealizations of stationarity and asymptotic flatness, respectively. In non-stationary or non-asymptotically-flat spacetimes where realistic black holes evaporate, the requisite Killing fields typically do not exist. Can we `de-idealize' these idealizations, and with them the associated arguments for black hole evaporation? Here, I critically examine the strategy of using `approximately Killing' fields to de-idealize black hole spacetimes and approximately extend conservation laws to non-idealized cases. I argue that this approach encounters significant challenges, undermining the use of these idealizations to justify the evaporation of realistic -- rather than idealized -- black holes, and raising questions about the justified use of such idealizations.
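As background, the standard general-relativistic construction at issue (not a result of the paper): a Killing field K^\nu satisfies \nabla_{(\mu} K_{\nu)} = 0, and together with stress-energy conservation \nabla_\mu T^{\mu\nu} = 0 it yields a conserved current,

    \nabla_\mu (T^{\mu\nu} K_\nu) = T^{\mu\nu}\, \nabla_{(\mu} K_{\nu)} = 0.

For an `approximately Killing' field, \nabla_{(\mu} K_{\nu)} \approx 0, the current is at best approximately conserved -- the de-idealization strategy the paper scrutinizes.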
2024
The Time in Thermal Time (Journal for General Philosophy of Science, accepted)
Invited contribution to Special Issue "On Time in the Foundations of Physics", eds. Andrea Oldofredi and Cristian Lopez
Preparing general relativity for quantization in the Hamiltonian approach leads to the `problem of time,' rendering the world fundamentally timeless. One proposed solution is the `thermal time hypothesis,' which defines time in terms of states representing systems in thermal equilibrium. On this view, time is supposed to emerge thermodynamically even in a fundamentally timeless context. Here, I develop the worry that the thermal time hypothesis requires dynamics -- and hence time -- to get off the ground, thereby facing a charge of circularity.
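A simplified illustration of the hypothesis (a finite-dimensional sketch, not the full modular-flow formulation used in the literature): ordinarily one takes the Hamiltonian H as given and characterizes equilibrium by the Gibbs state \rho = e^{-\beta H}/Z. The thermal time hypothesis inverts this: take the state \rho as primitive, define a `thermal Hamiltonian' H_{th} = -\ln\rho, and let it generate the flow

    \alpha_t(A) = e^{i H_{th} t}\, A\, e^{-i H_{th} t},

which is then identified with physical time. The circularity worry, roughly, is that singling out \rho as an equilibrium state in the first place already seems to presuppose dynamics.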
LLM-Powered Psychiatry from Back to Front (The Blog of the Linde Center for Science, Society, and Policy, Caltech)
Non-peer-reviewed, shortened version. Full paper to follow.
The increasing demand for mental healthcare and the persistent shortfall in the supply of mental healthcare providers create a problem of access, something that large language models (LLMs) seem poised to ameliorate. However, we sketch out some potential ethical risks that may arise for LLM-powered psychiatry, ranging from LLMs' back-end stochastic nature to hyperparameter tuning, fine-tuning, and prompting. We end with some questions about whether LLMs can really resolve the problem of access.
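As a toy illustration of the `back-end stochastic nature' and one of the hyperparameters mentioned above (a minimal Python sketch, not drawn from the paper): the sampling temperature rescales a model's output logits before a token is drawn, so the same prompt can yield different responses from run to run.

import numpy as np

def sample_next_token(logits, temperature=1.0, rng=None):
    """Sample a token index from model logits using temperature-scaled softmax.

    Higher temperature flattens the distribution (more varied output);
    lower temperature sharpens it (more deterministic output).
    """
    rng = rng or np.random.default_rng()
    scaled = np.asarray(logits, dtype=float) / max(temperature, 1e-8)
    probs = np.exp(scaled - scaled.max())  # subtract max for numerical stability
    probs /= probs.sum()
    return int(rng.choice(len(probs), p=probs))

# The same logits can yield different tokens across calls -- the stochasticity
# that matters when an LLM's output informs a clinical interaction.
logits = [2.0, 1.5, 0.3]
print([sample_next_token(logits, temperature=1.2) for _ in range(5)])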
2023
Winner of the 2022 Mary B. Hesse essay award and the 18th Robert K. Clifton Prize in Philosophy of Physics
Taking the formal analogies between black holes and classical thermodynamics seriously seems to first require that classical thermodynamics applies in relativistic regimes. Yet, by scrutinizing how classical temperature is extended into special relativity, I argue that it falls apart. I examine four consilient procedures for establishing the classical temperature: the Carnot process, the thermometer, kinetic theory, and black-body radiation. I show how their relativistic counterparts demonstrate no such consilience in defining relativistic temperature. As such, classical temperature doesn’t appear to survive a relativistic extension. I suggest two interpretations for this situation: eliminativism akin to simultaneity, or pluralism akin to rotation.
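To make the lack of consilience concrete, the competing transformation laws from the relativistic thermodynamics literature (standard proposals, not derived in the paper) disagree about the temperature of a body moving with Lorentz factor \gamma:

    T' = T/\gamma \ (\text{Planck--Einstein}), \qquad T' = \gamma T \ (\text{Ott}), \qquad T' = T \ (\text{Landsberg}).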
Invited contribution.
2022
Degeneration and Entropy (Kriterion: Journal of Philosophy 36(2), 2022)
Invited contribution to Special Issue "Lakatos's Undone Work: The Practical Turn and the Division of Philosophy of Mathematics and Philosophy of Science", eds. S. Nagler, H. Pilin, D. Sarikaya
Lakatos’s analysis of progress and degeneration in the Methodology of Scientific Research Programmes is well-known. Less known, however, are his thoughts on degeneration in Proofs and Refutations. I propose and motivate two new criteria for degeneration based on the discussion in Proofs and Refutations – superfluity and authoritarianism. I show how these criteria augment the account in Methodology of Scientific Research Programmes, providing a generalized Lakatosian account of progress and degeneration. I then apply this generalized account to a key transition point in the history of entropy – the transition to an information-theoretic interpretation of entropy – by assessing Jaynes’s 1957 paper on information theory and statistical mechanics.
2021
No Time for Time from No-Time (with Craig Callender, Philosophy of Science 88(5), 2021)
Programs in quantum gravity often claim that time emerges from fundamentally timeless physics. In the semiclassical time program, time arises only after approximations are taken. Here we ask what justifies taking these approximations and show that time seems to sneak in when answering this question. This raises the worry that the approach is either unjustified or circular in deriving time from no-time.
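Schematically, the standard semiclassical derivation at issue (a textbook-style sketch, not the paper's own presentation): the Wheeler--DeWitt constraint \hat{H}\Psi = 0 contains no time parameter; after a WKB-type ansatz \Psi \approx e^{iS[h]/\hbar}\,\psi[h,\phi] and an expansion in \hbar (or the Planck mass), the matter sector obeys an approximate time-dependent Schrödinger equation,

    i\hbar\, \partial_\tau \psi \approx \hat{H}_{\text{matter}}\, \psi,

with the `WKB time' \tau defined along the classical trajectories determined by S. The question pressed in the paper is what justifies these approximations without tacitly appealing to time.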
Does Von Neumann Entropy Correspond to Thermodynamic Entropy? (Philosophy of Science 88(1), 2021)
Conventional wisdom holds that the von Neumann entropy corresponds to thermodynamic entropy, but Hemmo and Shenker (2006) have recently argued against this view by attacking von Neumann's (1955) argument. I argue that Hemmo and Shenker's arguments fail due to several misunderstandings: about statistical mechanical and thermodynamic domains of applicability, about the nature of mixed states, and about the role of approximations in physics. As a result, their arguments fail in all cases: in the single-particle case, the finite-particles case, and the infinite-particles case.
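For reference, the standard definitions at stake (not themselves in dispute): the von Neumann entropy of a density operator \rho is

    S_{vN}(\rho) = -k_B\, \mathrm{Tr}(\rho \ln \rho),

while thermodynamic entropy is fixed operationally through reversible heat exchange, dS_{th} = \delta Q_{rev}/T; the question is whether, and in which regimes, the former tracks the latter.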
2020
(Video of talk available in gallery.)
Explainability algorithms such as LIME have helped make machine learning systems more transparent and fair, qualities that are important in commercial use cases. However, recent work has shown that LIME's naive sampling strategy can be exploited by an adversary to conceal biased, harmful behavior. We propose to make LIME more robust by training a generative adversarial network to sample more realistic synthetic data, which the explainer uses to generate explanations. Our experiments show that, compared to vanilla LIME, the proposed method is more accurate at detecting biased, adversarial behavior across three real-world datasets, reaching up to 99.94% top-1 accuracy in some cases, while maintaining comparable explanation quality.
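A minimal sketch of the idea described above (illustrative only: a placeholder generator and standard scikit-learn tools, not the paper's implementation): a LIME-style explanation fits a locally weighted linear surrogate to the black-box model around an instance, and the proposal is to draw the perturbed samples from a learned generator so that they resemble real data rather than naive Gaussian noise.

import numpy as np
from sklearn.linear_model import Ridge

def explain_instance(black_box, x, sample_fn, n_samples=500, kernel_width=0.75):
    """LIME-style local explanation: fit a weighted linear surrogate to the
    black-box model's predictions on samples drawn near the instance x.

    sample_fn(x, n) returns an (n, d) array of perturbed samples; swapping in
    a GAN generator here (instead of Gaussian noise) is the robustness idea.
    """
    X = sample_fn(x, n_samples)
    y = black_box(X)
    # Weight samples by proximity to x (exponential kernel on Euclidean distance).
    dists = np.linalg.norm(X - x, axis=1)
    weights = np.exp(-(dists ** 2) / kernel_width ** 2)
    surrogate = Ridge(alpha=1.0).fit(X, y, sample_weight=weights)
    return surrogate.coef_  # feature attributions around x

# Naive LIME-style sampler: Gaussian perturbations, whose off-manifold samples
# an adversarial model can detect and respond to innocuously.
def gaussian_sampler(x, n):
    return x + np.random.normal(scale=0.5, size=(n, x.shape[0]))

# Hypothetical GAN-based sampler: a generator G (assumed trained separately;
# identity placeholder here) maps noise to realistic synthetic points near x,
# so explanation queries are harder to distinguish from genuine data.
def gan_sampler(x, n, G=lambda z: z):
    return x + G(np.random.normal(size=(n, x.shape[0])))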
2017
The laws of classical logic are taken to be logical truths, which in turn are taken to hold objectively. However, we might question our faith in these truths: why are they true? One general approach, proposed by Putnam and more recently by Dickson and Maddy, is to adopt empiricism about logic. On this view, logical truths are true because they are true of the world alone – this gives logical truths an air of objectivity. Putnam and Dickson both take logical truths to be true in virtue of the world’s structure, given by our best empirical theory, quantum mechanics. This assumes a determinate logical structure of the world given by quantum mechanics. Here, I argue that this assumption is false, and that the world’s logical structure, and hence the related ‘true’ logic, are underdetermined. This leads to what I call empirical conventionalism.