
Works in Progress

Time's Emergence and Physical Coherence (draft available upon request!) 

It is said that time disappears in quantum gravity. Yet time seems to exist in our world. This raises the question of how, if at all, time exists. One response is to 'walk the middle way' between fundamentally timeless physics and manifestly temporal reality by deriving time from timeless physics. If successful, the middle way explains why time emerges non-fundamentally, despite timeless physics. However, Baron, Miller & Tallant (2022) recently argued that this approach faces metaphysical incoherence: the metaphysics of emergence requires spatiotemporality, and cannot be coherently applied to a fundamentally non-spatiotemporal world. I augment this worry and argue that the middle way also risks physical incoherence. Explanatory projects in physics seeking to derive time from timeless reality might employ temporally laden concepts, running into circularity. I illustrate this worry with two proposals for time's emergence: the semiclassical and thermal time programs.

The Time in Thermal Time (draft available upon request!)

Attempts to quantize gravity in the Hamiltonian approach lead to the 'problem of time': the resultant formalism is often said to be 'frozen', non-dynamical, and fundamentally timeless. To resolve this problem, Connes & Rovelli (1994) suggest the adoption of a 'thermal time hypothesis': the flow of time emerges thermodynamically from a fundamentally timeless ontology. While statistical states are typically defined to be in equilibrium with respect to some background time, Connes & Rovelli propose that we instead define time in terms of these statistical states: statistical states define a time according to which they are in equilibrium. To avoid circularity, we had better have a good conceptual grasp on notions such as 'equilibrium' and 'statistical state' which are independent of time. Here, however, I argue that these concepts either implicitly presuppose some notion of time, or cannot yet be justifiably applied to the fundamentally timeless context.

Quasi-Stationarity: The Impossible Process (draft available upon request!)

A ubiquitously used idealization in physics, e.g. black hole physics, is quasi-stationarity. Prominently, Hawking (1975) employed this idealization in making his argument for black hole evaporation. The core idea is to assume that a system is evolving 'so slowly' that it can be modeled dynamically as a sequence of stationary systems. Here, I argue that quasi-stationary processes, taken literally, are impossible in the same vein as Norton's (2016) evaluation of quasi-static processes. Furthermore, while Norton vindicates the widespread use of quasi-staticity by showing how this 'impossible' idealization can be readily 'de-idealized' for use in quotidian thermodynamical reasoning, I argue that Hawking's argument for black hole evaporation cannot yet be justified in a similar fashion. One would need to provide a procedure for finding an approximately globally conserved energy via approximate Killing fields, but this remains an open question in general relativity.

Decoherence, Branching, and the Born Rule for a Mixed-State Everettian Multiverse (joint work with Eddy Keming Chen) (preprint available here)

In Everettian quantum mechanics, justifications for the Born rule appeal to self-locating uncertainty or decision theory. Such justifications have focused exclusively on a pure-state Everettian multiverse, represented by a wave function. Recent works in quantum foundations suggest that it is viable to consider a mixed-state Everettian multiverse, represented by a (mixed-state) density matrix. Here, we develop the conceptual foundations for decoherence and branching in a mixed-state multiverse, and extend the standard Everettian justifications for the Born rule to this setting. This extended framework provides a unification of 'classical' and 'quantum' probabilities, and additional theoretical benefits, for the Everettian picture.

Check, Please: De-idealizing De-idealization 

It is doubtless that scientific inquiry inextricably involves the use of idealizations: for ease of calculation and representation, we assume the absence of friction, the perfect sphericity of cows, or the impeccable rationality of human beings. Idealizations are, strictly speaking, false. Yet they play a crucial role in many of our best sciences in representing all sorts of phenomena, from coffee cups to black holes. Much ink has thus been spilled over how to justify them. A predominant view focuses on justifying these idealizations via de-idealization procedures. A more recent cluster of views, due to Potochnik (2017) and Knuuttila & Morgan (2019), pushes back by arguing, respectively, that these idealizations stand on their own and need no explicit de-idealization, and that de-idealization is too demanding. I argue that this disagreement hinges on a strong philosopher's construct of de-idealization, an idea that itself needs to be de-idealized. I explicate two weaker senses in which idealizations can be de-idealized.

Putting Pressure under Pressure: On the Status of the Classical Pressure in Relativity

Much of the century-old debate surrounding the status of thermodynamics in relativity has centered on the search for a suitably relativistic temperature; recent work by Chua (2023) has suggested that the classical temperature concept – consilient as it is in classical settings – ‘falls apart’ in relativity. However, these discussions have still tended to assume an unproblematic Lorentz transformation for – specifically, the Lorentz invariance of – the pressure concept. Here I argue that, just like the classical temperature, the classical concept of pressure breaks down in relativistic settings. This situation suggests a new thermodynamic limit – a ‘rest frame limit’ – without which an unambiguous thermodynamic description of systems does not emerge. I end by briefly discussing how thermodynamics, in requiring preferred frames, bears on the idea of so-called symmetry-to-reality inferences.
