
I am a Singaporean PhD candidate in Philosophy at UCSD.

My CV can be found here. 


My primary interests center on philosophy of physics and philosophy of science, with a particular focus on the history and philosophy of thermodynamics. I am especially interested in how thermodynamic concepts - such as equilibrium, temperature, and entropy - get extended beyond their original domain of applicability. I am also interested in the problem of time in quantum gravity and various proposals for resolving it. Recently, I began studying the justifications behind the use of approximations and idealizations in physics, and in science more generally.


On the side, I dabble in formal epistemology, data ethics and the philosophy of data science, and the philosophy of logic and mathematics. 


I completed my undergraduate studies in philosophy at Wolfson College, University of Cambridge, where I was trained in most areas of analytic philosophy (metaphysics, logic, epistemology, language, ethics, etc.). I then spent some time at the Munich Center for Mathematical Philosophy before coming to UCSD for my PhD. For what it's worth, my Erdős number is 5.


When I am not doing philosophy, I can be found playing Magic: the Gathering at my local game store, playing video games, or skateboarding.

 

News


My essay, 'T Falls Apart: On the Status of Classical Temperature in Relativity', has just been accepted for publication in the PSA2022 supplemental volume of Philosophy of Science. 

My essay, 'T Falls Apart: On the Status of Classical Temperature in Relativity', won the Robert K. Clifton Memorial Book Prize for best paper in philosophy of physics at the 2022 Logic, Math, and Physics graduate conference held at the University of Western Ontario.


For Winter-Spring 2022 I was a pre-doctoral fellow at the Beyond Spacetime project funded by the Templeton Foundation, working with Nick Huggett at the University of Illinois Chicago. 

My essay and work-in-progress, 'Do Black Holes Evaporate?', won the 2021 UCSD departmental essay prize for best essay by a graduate student.


For 2019-2020, I was a data analytic governance and accountability fellow at the Institute of Practical Ethics at UCSD, where I worked on explainability algorithms with data scientist colleagues from Singapore and Japan.

 

Education

2018 - present

PhD Philosophy, 
University of California San Diego

2017 - 2018

MA Logic and Philosophy of Science, 
Munich Center for Mathematical Philosophy (Incomplete)

2014 - 2017

BA & MA (Cantab) Philosophy,
University of Cambridge

 

Publications

T Falls Apart: On the Status of Classical Temperature in Relativity
(forthcoming in the PSA2022 supplemental volume of Philosophy of Science)

Taking the formal analogies between black holes and classical thermodynamics seriously seems to first require that classical thermodynamics applies in relativistic regimes. Yet, by scrutinizing how classical temperature is extended into special relativity, I argue that it falls apart. I examine four consilient procedures for establishing the classical temperature: the Carnot process, the thermometer, kinetic theory, and black-body radiation. I show how their relativistic counterparts demonstrate no such consilience in defining relativistic temperature. As such, classical temperature doesn't appear to survive a relativistic extension. I suggest two interpretations of this situation: eliminativism akin to simultaneity, or pluralism akin to rotation.
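To give a flavor of the non-consilience (my gloss, not part of the abstract): the classic proposals for how temperature transforms under a Lorentz boost already disagree:

    T' = T/\gamma \ \text{(Planck-Einstein)}, \quad T' = \gamma T \ \text{(Ott)}, \quad T' = T \ \text{(Landsberg)}, \quad \gamma = (1 - v^2/c^2)^{-1/2}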

Degeneration and Entropy

(2022, in Lakatos’s Undone Work: The Practical Turn and the Division of Philosophy of Mathematics and Philosophy of Science, special issue of Kriterion: Journal of Philosophy, edited by S. Nagler, H. Pilin, and D. Sarikaya.)

Lakatos’s analysis of progress and degeneration in the Methodology of Scientific Research Programmes is well-known. Less known, however, are his thoughts on degeneration in Proofs and Refutations. I propose and motivate two new criteria for degeneration based on the discussion in Proofs and Refutations – superfluity and authoritarianism. I show how these criteria augment the account in Methodology of Scientific Research Programmes, providing a generalized Lakatosian account of progress and degeneration. I then apply this generalized account to a key transition point in the history of entropy – the transition to an information-theoretic interpretation of entropy – by assessing Jaynes’s 1957 paper on information theory and statistical mechanics.

No Time for Time from No-Time

(with Craig Callender, Philosophy of Science 88:5, 2021)

Programs in quantum gravity often claim that time emerges from fundamentally timeless physics. In the semiclassical time program, time arises only after approximations are taken. Here we ask what justifies taking these approximations and show that time seems to sneak in when answering this question. This raises the worry that the approach is either unjustified or circular in deriving time from no-time.
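Schematically (my notation, not necessarily the paper's): the Wheeler-DeWitt constraint

    \hat{H}\,\Psi = 0

contains no time parameter; under a WKB-type ansatz \Psi \approx e^{iS[h]/\hbar}\,\chi[h,\phi], one recovers, to leading order, a time-dependent Schrödinger equation

    i\hbar\,\partial_\tau \chi \approx \hat{H}_{\mathrm{matter}}\,\chi,

where \tau is defined along the classical trajectories generated by S. The question raised above is what licenses the ansatz and the expansion without presupposing time.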

Does Von Neumann Entropy Correspond to Thermodynamic Entropy? 
(Philosophy of Science 88:1, 145-168, 2021)

Conventional wisdom holds that the von Neumann entropy corresponds to thermodynamic entropy, but Hemmo and Shenker (2006) have recently argued against this view by attacking von Neumann's (1955) argument. I argue that Hemmo and Shenker's arguments fail due to several misunderstandings: about the statistical mechanical and thermodynamic domains of applicability, about the nature of mixed states, and about the role of approximations in physics. As a result, their arguments fail in all cases: in the single-particle case, the finite-particles case, and the infinite-particles case.
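For reference (standard definitions, not specific to this paper): the correspondence at issue is between the von Neumann entropy of a quantum state \rho,

    S_{vN}(\rho) = -k_B \, \mathrm{Tr}(\rho \ln \rho),

and the thermodynamic entropy defined via the Clausius relation dS_{TD} = \delta Q_{rev}/T.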

Improving LIME Robustness with Smarter Locality Sampling 
(2020, with Sean Saito, Rocco Hu and Nicholas Capel, AdvML '20: Workshop on Adversarial Learning Methods for Machine Learning and Data Mining, KDD2020, August 24, 2020, San Diego, CA.)

(Video of talk available here.) 

Explainability algorithms such as LIME have helped make machine learning systems more transparent and fair, qualities that matter in commercial use cases. However, recent work has shown that LIME's naive sampling strategy can be exploited by an adversary to conceal biased, harmful behavior. We propose to make LIME more robust by training a generative adversarial network to sample more realistic synthetic data, which the explainer uses to generate explanations. Our experiments show that, compared to vanilla LIME, the proposed method improves accuracy in detecting biased, adversarial behavior across three real-world datasets, reaching up to 99.94% top-1 accuracy in some cases, while maintaining comparable explanation quality.
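A minimal sketch of the core idea in Python (my illustration only; the generator interface and all names are hypothetical stand-ins, not the paper's actual code):

    import numpy as np
    from sklearn.linear_model import Ridge

    def explain(instance, black_box, generator, n_samples=500, kernel_width=0.75):
        """Fit a weighted linear surrogate around `instance`, drawing neighbors
        from a trained generative model instead of naive Gaussian noise."""
        # Hypothetical generator API: latent vectors -> realistic data points.
        z = np.random.randn(n_samples, generator.latent_dim)
        neighbors = generator.sample(z)

        # Standard LIME step: weight neighbors by proximity to the instance...
        distances = np.linalg.norm(neighbors - instance, axis=1)
        weights = np.exp(-(distances ** 2) / kernel_width ** 2)

        # ...and fit a weighted linear surrogate to the black box's predictions.
        surrogate = Ridge(alpha=1.0)
        surrogate.fit(neighbors, black_box(neighbors), sample_weight=weights)
        return surrogate.coef_  # per-feature attributions for `instance`

The only change from vanilla LIME is the sampling step: because neighbors come from a learned generative model, the surrogate is fit on realistic, on-distribution points, which an adversarial classifier cannot easily distinguish from genuine inputs.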

An Empirical Route to Logical ‘Conventionalism’
(2017, In: Baltag A., Seligman J., Yamada T. (eds) Logic, Rationality, and Interaction. LORI 2017. Lecture Notes in Computer Science, vol 10455. Springer, Berlin, Heidelberg.)

The laws of classical logic are taken to be logical truths, which in turn are taken to hold objectively. However, we might question our faith in these truths: why are they true? One general approach, proposed by Putnam and more recently by Dickson and Maddy, is to adopt empiricism about logic. On this view, logical truths are true because they are true of the world alone – this gives logical truths an air of objectivity. Putnam and Dickson both take logical truths to be true in virtue of the world's structure, as given by our best empirical theory, quantum mechanics. This assumes that quantum mechanics fixes a determinate logical structure of the world. Here, I argue that this assumption is false, and that the world's logical structure, and hence the related 'true' logic, are underdetermined. This leads to what I call empirical conventionalism.

Is Logic Empirical? Logical 'Conventionalism' from an Empirical Standpoint
(2017, Aporia Vol. XVII, Recipient of the Aporia Essay Prize)

The laws of classical logic are taken to be logical truths, and logical truths are taken to hold objectively. However, we might question our faith in these truths: why are they true? One approach, often avoided because it makes logical truths dependent on merely intersubjective linguistic conventions, is logical conventionalism. Another approach, proposed by Putnam (1975) and more recently by Dickson (2001) and Maddy (2007), is to adopt empiricism about logic. On this view, logical truths are true because they are true of the world alone – this gives logical truths an air of objectivity that logical conventionalism lacks. Putnam and Dickson both take logical truths to be true in virtue of the world's structure, where the structure of the world is understood to be given by our best empirical theory, quantum mechanics. As it turns out, the structure of quantum mechanics apparently makes true the laws of quantum logic, and falsifies (one half of) the distributive law, which was taken to be a logical truth under classical logic. Empiricists take this to indicate that the distributive law was not a logical truth to begin with. However, this argument assumes that there is a single determinate structure of the world prescribed by quantum mechanics. In this essay, I argue that this assumption is false, and that the structure of the world is underdetermined in quantum mechanics. Likewise, the choice of 'true' logic, as given by the world's structure, is also underdetermined. This leads to what I call empirical conventionalism: the world alone fails to determine our logical truths. We need something broadly intersubjective, and thus less than objective, to fix our choice of logic even under empiricism. An attempt to avoid one form of conventionalism has thus led us back to another.

 

Drafts and Works in Progress

The Time in Thermal Time

Attempts to quantize gravity in the Hamiltonian approach lead to the 'problem of time'; the resultant formalism is often said to be 'frozen', non-dynamical, and fundamentally timeless. To resolve this problem, Connes & Rovelli (1994) suggest the adoption of a 'thermal time hypothesis': the flow of time emerges thermodynamically from a fundamentally timeless ontology. While statistical states are typically defined to be in equilibrium with respect to some background time, Connes & Rovelli propose that we instead define time in terms of these statistical states: statistical states define a time according to which they are in equilibrium. To avoid circularity, we had better have a good conceptual grasp of time-independent notions of 'equilibrium' and 'statistical state'. Here, however, I argue that these concepts either implicitly presuppose some notion of time, or cannot yet be justifiably applied to the fundamentally timeless context.
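In outline (my gloss of the Connes-Rovelli construction): given a statistical state \rho, define the 'thermal Hamiltonian'

    H_{th} = -\ln \rho,

and let it generate the flow A \mapsto \rho^{-it} A \, \rho^{it}. By construction, \rho = e^{-H_{th}} has the Gibbs form, and so is in equilibrium with respect to this flow, which the hypothesis then identifies with physical time.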

The Map Is Still Not the Territory: De-idealizing De-idealizations

Scientific inquiry inextricably involves the use of idealizations: for ease of calculation and representation, we assume the absence of friction, the perfect sphericity of cows, or the impeccable rationality of human beings. Idealizations are, strictly speaking, false. Yet they play a crucial role in many of our best sciences in representing all sorts of phenomena, from coffee cups to black holes. Much ink has thus been spilled over how to justify them. A predominant view focuses on justifying these idealizations via de-idealization procedures. A more recent cluster of views pushes back: Potochnik (2017) argues that these idealizations stand on their own and need no explicit de-idealization, while Knuutila & Morgan (2019) argue that de-idealization is too demanding. I argue that this disagreement hinges on a strong philosopher's construct of de-idealization, an idea that itself needs to be de-idealized. I explicate two weaker senses in which idealizations can be de-idealized.

Do Black Holes Evaporate?

Since Hawking first predicted that black holes lose mass and 'evaporate' via Hawking radiation, the phenomenon has become a linchpin of black hole research. However, I argue that we lack justification for thinking that black hole evaporation occurs. The derivation of black hole evaporation requires some global notion of energy conservation, which in turn rests on assuming that the spacetime in question has certain idealized properties; the ubiquitous ones are stationarity and asymptotic flatness. By examining these idealizations and how they cannot be appropriately 'de-idealized' in describing actual physical systems, I argue that we lack justification for concluding that actual black holes evaporate.
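For context, the standard results at stake (not claims of the paper): a stationary, asymptotically flat Schwarzschild black hole of mass M has Hawking temperature

    T_H = \frac{\hbar c^3}{8 \pi G M k_B},

and the corresponding luminosity gives dM/dt \propto -1/M^2, hence a finite evaporation time scaling as M^3. Both steps lean on exactly the idealizations examined here.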

Indeterminism for Nomological Bohmian Mechanics
(Under Revision)

Given the nomological approach to interpreting the wave-function in non-relativistic Bohmian mechanics, and standard notions of time-reversal invariance and indeterminism, I argue that Bohmian mechanics is indeterministic. On the one hand, (1) Bohmian particles have deterministic trajectories. On the other hand, (2) Bohmian mechanics is time-reversal invariant. However, given (3) the nomological approach – on which wave-functions are interpreted as (part of) the laws – I argue that (1) and (2) cannot both be true. At least one of (1)-(3) must go. I consider and reject four options: giving up time-reversal invariance (for the theory, and 'partially' for the wave-function alone), giving up deterministic trajectories, and revising our definition of determinism. In the end, I conclude that we should abandon the nomological approach instead.
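For reference, claim (1) rests on the standard guidance equation of non-relativistic Bohmian mechanics:

    \frac{dQ_k}{dt} = \frac{\hbar}{m_k} \, \mathrm{Im}\!\left( \frac{\nabla_k \psi}{\psi} \right)\!(Q_1, \ldots, Q_N),

which uniquely fixes the particle trajectories given \psi and the initial configuration.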

Accuracy, Entropy, and the Free Energy Principle
(In Preparation)

I show that the relative entropy – also known as the Kullback-Leibler divergence (Kullback & Leibler 1951) – can be construed as a normative measure of epistemic accuracy in the vein of Joyce (1998). Furthermore, I discuss the Free Energy Principle – made popular in recent years by the neuroscientist Karl Friston (cf. Friston 2007) and philosophers like Andy Clark (2016) – and how it employs a formally identical measure, though as a descriptive measure of perceptual accuracy in neural processing. I then propose that we naturalize – and provide naturalistic grounds for – the normative measure by appealing to its realization in our brain.
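For reference (standard forms, my gloss): the relative entropy of P from Q is

    D_{KL}(P \,\|\, Q) = \sum_x P(x) \log \frac{P(x)}{Q(x)},

and the variational free energy can be written as

    F(q) = D_{KL}\big(q(z) \,\|\, p(z \mid x)\big) - \log p(x),

so that minimizing F drives the KL term to zero. It is this shared form that connects the epistemic and neural readings.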