Thermal denaturation melts were performed on an Applied Photophysics Chirascan Plus spectrometer. All experiments used a 0.5 mm cuvette at a protein concentration of 20 µM in buffer containing 50 mM sodium phosphate, 20 mM NaCl, pH 7.0. The temperature scan rate was 1 °C/min, the sample was equilibrated for 30 s at each temperature increment, and the CD signal at 222 nm was averaged over 2 s. Data were plotted and analyzed using the equation,47
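The fitted expression itself is not reproduced in the text. Consistent with the variable definitions that follow, the standard two-state unfolding equation with linear native and unfolded baselines presumably takes the form:

```latex
S_T = \frac{(S_N + m_N T) + (S_U + m_U T)\,
      \exp\!\left[-\frac{\Delta H_m}{R}\left(\frac{1}{T} - \frac{1}{T_m}\right)\right]}
      {1 + \exp\!\left[-\frac{\Delta H_m}{R}\left(\frac{1}{T} - \frac{1}{T_m}\right)\right]}
```

Here the exponential term is the unfolding equilibrium constant, obtained from the van't Hoff relation under the assumption that ΔHm is temperature-independent (ΔCp = 0); at T = Tm the two states are equally populated.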
where ST is the measured signal as a function of the absolute temperature T (in kelvin), SN and SU are the signals corresponding to the native and unfolded baselines, mN and mU are the slopes of the linear temperature dependence of SN and SU, Tm is the midpoint melting temperature, ΔHm is the enthalpy change at Tm, and R is the universal gas constant.
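As a sketch of how such a fit can be carried out (not the authors' actual analysis script; the parameter names, starting guesses, and use of `scipy.optimize.curve_fit` are assumptions for illustration), the two-state model with linear baselines can be fitted to a CD melt as follows:

```python
import numpy as np
from scipy.optimize import curve_fit

R = 8.314  # universal gas constant, J/(mol·K)

def two_state_melt(T, S_N, m_N, S_U, m_U, Tm, dHm):
    """Two-state thermal unfolding signal with linear baselines.

    T and Tm are absolute temperatures in kelvin; dHm is the
    unfolding enthalpy at Tm in J/mol (assumed T-independent).
    """
    # Unfolding equilibrium constant from the van't Hoff relation
    K = np.exp(-(dHm / R) * (1.0 / T - 1.0 / Tm))
    fU = K / (1.0 + K)  # fraction unfolded
    # Population-weighted average of native and unfolded baselines
    return (S_N + m_N * T) * (1.0 - fU) + (S_U + m_U * T) * fU

# Hypothetical usage with measured temperatures (°C) and CD222 signal:
# T_K = temps_C + 273.15
# popt, pcov = curve_fit(two_state_melt, T_K, cd_signal,
#                        p0=[-20.0, 0.01, -2.0, 0.01, 330.0, 2.0e5])
```

At T = Tm the model returns the midpoint between the two baselines, which is a convenient sanity check on any implementation.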