Many threads of conversation over the past weeks came together today in a set of coincidences. Conversations with Bedell (Flatiron), Pope (NYU), Luger (Flatiron), and Farr (Flatiron), ranging over stochastic processes and inferring stellar surface features from Doppler imaging, all overlap at stellar asteroseismic p modes: In principle, with high-resolution, high-signal-to-noise stellar spectral time series (and we have these, in hand!) we should be able not only to see p modes but also to see their footprints on the stellar surface. That is, we should be able to read ell and em directly off the spectral data. In addition, we ought to be able to see the associated temperature variations. This is all possible because the stars are slowly rotating, and each mode projects onto the rotating surface differently. Even cooler than all this: Because the modes are coherent for days in the stars we care about, we can build very precise matched filters to combine the data coherently across many exposures. There are many things to do here.
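
To make the matched-filter point concrete, here is a minimal toy sketch (not any of our actual code): if a mode stays coherent across the full set of exposures, a least-squares projection of the radial-velocity time series onto sinusoids at a trial frequency combines all the data coherently. The mode frequency, amplitude, noise level, and cadence below are all invented for illustration.

```python
import numpy as np

# Toy demonstration: a p mode that stays coherent across many exposures can
# be recovered by projecting the RV time series onto sinusoids at a trial
# frequency (a matched filter). All numbers here are made up for illustration.
rng = np.random.default_rng(17)
nu_true = 3.1e-3   # mode frequency in Hz (roughly solar-like; an assumption)
amp = 1.0          # mode amplitude in m/s (assumed)
t = np.sort(rng.uniform(0., 5. * 86400., 300))  # 300 exposures over 5 days
phase = rng.uniform(0., 2. * np.pi)
rv = amp * np.cos(2. * np.pi * nu_true * t + phase)
rv += rng.normal(0., 2., t.size)                # 2 m/s per-exposure noise

def matched_amplitude(t, y, nu):
    """Least-squares mode amplitude at trial frequency nu, using all data."""
    A = np.stack([np.cos(2. * np.pi * nu * t),
                  np.sin(2. * np.pi * nu * t)], axis=1)
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    return np.hypot(coeffs[0], coeffs[1])

nus = np.linspace(2.5e-3, 3.7e-3, 4000)         # trial frequencies
amps = np.array([matched_amplitude(t, rv, nu) for nu in nus])
print("true nu: %.4e Hz, recovered: %.4e Hz" % (nu_true, nus[np.argmax(amps)]))
```

The point of the toy is that the per-exposure signal is far below the noise, but five days of coherence buys a large effective signal-to-noise at the correct frequency.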

Tyler Pritchard (NYU) convenes a meeting on Mondays at NYU to discuss time-domain astrophysics. Today we had a discussion of a very simple idea: Use the rates of short GRBs that are observed and measured (using physical models from the MacFadyen group at NYU) to have certain jet–observer offset angles to infer rates for all the way-off-axis events that won't be GRB triggers but might be seen in LSST or other ground-based optical or radio surveys. Seems easy, right? It turns out it isn't trivial at all, because extrapolating from a few well-understood events in gamma-rays, subject to gamma-ray selection effects, to a full population of optical and radio sources (and then assessing those selection effects) requires quite a few additional or auxiliary assumptions. This is even more true for the bursts where we don't know redshifts. I was surprised to hear myself use the astronomy word "V-max"! But we still (as a group) feel like there must be low-hanging fruit here. And this is a great application for the MacFadyen-group models, which predict brightness as a function of wavelength, time, and jet–observer angle.
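
For what it's worth, the "V-max" idea in its simplest form is easy to sketch. Here is a toy version with Euclidean volumes and a hard flux limit, which is emphatically not the calculation the group needs (no gamma-ray triggering physics, no redshifts, no jet angles), just the skeleton of the estimator; all numbers and the flux-limit form are assumptions.

```python
import numpy as np

# Toy 1/V_max estimator: each detected event i contributes 1 / V_max,i, where
# V_max,i is the volume within which that event would still have been detected.
# Assumes Euclidean space and a hard flux limit; illustration only.
def vmax_rate(luminosities, flux_limit, survey_time, sky_fraction):
    d_max = np.sqrt(luminosities / (4. * np.pi * flux_limit))  # max distance
    v_max = sky_fraction * (4. / 3.) * np.pi * d_max ** 3
    return np.sum(1. / v_max) / survey_time   # events per volume per time

# Five invented bursts (luminosities in erg/s), a flux limit in erg/s/cm^2,
# ten years of observing, and half-sky coverage:
L = np.array([1e50, 3e50, 5e49, 2e51, 8e49])
rate = vmax_rate(L, flux_limit=1e-7, survey_time=10., sky_fraction=0.5)
print("%.3e events per cm^3 per year" % rate)
```

All the hard parts we discussed today live in what this toy hides: the selection function is not a hard flux limit, and the mapping from gamma-ray detectability to optical and radio detectability depends on the jet–observer angle.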

Our five-person (Bedell, Hogg, Queloz, Winn, Zhao) exoplanet meeting continued today, with Winn (Princeton) working out the elements needed to produce a simulator for a long-term EPRV monitoring program with simple observing rules. He is interested in working out under what circumstances such a program can be informative about exoplanets in regimes that neither Kepler nor existing EPRV programs have strongly constrained, like near-Earth-mass planets on near-Earth orbits around near-Sun stars. And indeed we must choose a metric or metrics for success. His list of what's needed, software-wise, is non-trivial, but we worked out that every part of it would be a publishable contribution to the literature, so it could be a great set of projects. And a very useful set of tools.
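
To fix ideas, here is a minimal sketch of what the innermost loop of such a simulator might look like. The observing rule (attempt one visit per night, weather permitting), the white-noise RV error, and the detection metric below are all my assumptions, not items from Winn's actual list.

```python
import numpy as np

# Minimal EPRV survey-simulator sketch (assumed rules, not Winn's list):
# observe one star nightly when the weather cooperates, then ask whether an
# injected Earth-like signal is recovered at a useful signal-to-noise.
rng = np.random.default_rng(42)

def simulate_survey(n_nights=3650, p_clear=0.7, sigma_rv=0.3,
                    K=0.09, period=365.25):
    """K in m/s (Earth around the Sun is ~0.09); sigma_rv per-night error."""
    nights = np.arange(n_nights)
    observed = nights[rng.random(n_nights) < p_clear]   # weather losses
    rv = K * np.sin(2. * np.pi * observed / period)
    rv += rng.normal(0., sigma_rv, observed.size)
    # Fit the amplitude at the known period (a best-case detection metric):
    A = np.stack([np.sin(2. * np.pi * observed / period),
                  np.cos(2. * np.pi * observed / period)], axis=1)
    coeffs, *_ = np.linalg.lstsq(A, rv, rcond=None)
    K_hat = np.hypot(coeffs[0], coeffs[1])
    K_err = sigma_rv * np.sqrt(2. / observed.size)      # approximate
    return K_hat / K_err   # amplitude signal-to-noise

print("detection S/N over 10 years: %.1f" % simulate_survey())
```

Even this stripped-down version makes the design questions concrete: changing the observing rules, the noise model, or the metric changes the answer, and each of those components deserves careful, publishable work.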

Zhao (Yale) showed me two-dimensional calibration data from the EXPRES instrument illuminated by their laser-frequency comb. It is astounding. The images are beautiful, and every single line in each image is at a perfectly known (from physics!) absolute wavelength. This might be the beginning of a very new world. The instrument is also beautifully designed so that all the slit (fiber, really, but it is a rectangular fiber) images are almost perfectly aligned with one of the CCD directions, even in all four corners of the image. Not like the spectrographs I'm used to!

Today Lily Zhao (Yale) visualized for me some of the calibration data they have for the EXPRES spectrograph at Yale. What she showed is that the wavelength calibration does vary from exposure to exposure, that the variations are detected at very high signal-to-noise, and that they are systematic or smooth. That is, the instrument varies only a tiny, tiny bit, but it does so very smoothly, and the smooth variations are measured incredibly precisely. This suggests that it should be possible to pool data from many calibration exposures to build a better calibration model for every exposure than we could get if we treated the exposures all independently.
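
A toy version of the pooling idea (my sketch; the real EXPRES analysis may be structured quite differently): stack many calibration exposures, find the few smooth modes along which the instrument actually varies, and denoise each exposure by projecting onto those modes. The fake drift mode and noise levels below are invented.

```python
import numpy as np

# Pooling sketch: the instrument varies along a few smooth modes, so a
# low-rank (SVD) model fit to many exposures beats treating them separately.
rng = np.random.default_rng(0)
n_exp, n_lines = 200, 500
x = np.linspace(0., 1., n_lines)

# Fake calibration data: mean solution + one smooth drift mode + noise.
mean_cal = 5000. + 10. * x                       # nominal wavelengths (toy)
drift_mode = np.sin(np.pi * x)                   # smooth instrument mode
amps = rng.normal(0., 1e-3, n_exp)               # per-exposure drift amplitude
truth = mean_cal + amps[:, None] * drift_mode
data = truth + rng.normal(0., 1e-3, (n_exp, n_lines))

# Pooled model: rank-1 SVD of the mean-subtracted calibration residuals.
mean_est = data.mean(axis=0)
U, S, VT = np.linalg.svd(data - mean_est, full_matrices=False)
k = 1
pooled = mean_est + (U[:, :k] * S[:k]) @ VT[:k]

print("rms calibration error, independent: %.2e" % np.std(data - truth))
print("rms calibration error, pooled:      %.2e" % np.std(pooled - truth))
```

The improvement comes from exactly the property Zhao demonstrated: the variations are smooth and low-dimensional, so every exposure informs the calibration of every other exposure.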

Late in the day, we drew a graphical model for the calibration and worked through a possible structure. As my loyal reader knows, I want to go to full two-dimensional modeling of spectrographs! But we are going to start with measurements made on one-dimensional extractions. That's easier for the community to accept right now, anyway!

Josh Winn (Princeton) and Lily Zhao (Yale) both came in to Flatiron today for a couple of days of work with Megan Bedell (Flatiron), Didier Queloz (Cambridge), and me. So we had a bit of a themed Stars and Exoplanets Meeting today at Flatiron. Winn talked about various ways to measure stellar obliquities (that is, angles between stellar-rotation angular-momentum vectors and planetary-system angular-momentum vectors). He has some six ways to do it! He also talked about statistical differences between vsini measurements for stars with and without transiting systems.

Zhao and Queloz talked about their respective big EPRV programs to find Earth analogs in radial-velocity data. Both projects need to get much more precise measurements, and observe fewer stars (yes, fewer) for longer times. That's the direction the field is going, at least where it concerns discovery space. Queloz argued that these are going to be big projects that require patience and commitment, and that it is important for new projects to control facilities, not just to apply for observing time each semester! And that's what he has with the Terra Hunting Experiment, in which Bedell, Winn, and I are also partners.

Related to all that, Zhao talked about how to make an observing program adaptive (to increase efficiency) without making it hard to understand (for statistical inferences at the end). I'm very interested in this problem! And it relates to the Queloz point, because if a time allocation committee is involved every semester, any statistical inferences about what was discovered would have to model not just the exoplanet population but also the behavior of the various TACs!

At lunchtime today I had a great conversation with Iain Murray (Edinburgh) about two things. One was new ideas in probabilistic machine learning, and the other was this exoplanet transit spectroscopy challenge. On the former, he got me excited about normalizing flows, which use machine-learning methods (like deep learning) and a tractable likelihood function to build probabilistic generative models for high-dimensional data. These could be useful for astronomical applications; we discussed some. On the latter, we discussed how transits work and how sunspots cause trouble for them. And how the effects might be low-dimensional, and thus how a good machine-learning method should be able to deal with or capture them.
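
Since I had to remind myself how these work: the core of a normalizing flow is an invertible map plus the change-of-variables formula, which gives an exact log-likelihood. Here is a minimal single-coupling-layer sketch with fixed (untrained) toy functions standing in for the deep networks; nothing here is from Murray's own work.

```python
import numpy as np

# Normalizing-flow skeleton: an invertible map f takes data x to a base
# variable z, and the exact density follows from the change of variables:
#   log p(x) = log p_base(f(x)) + log |det df/dx|.
# In a real flow, s and t below are learned deep networks; here they are
# fixed toy functions, just to show the likelihood machinery.

def s_and_t(x1):
    """Toy conditioner: log-scale s and shift t for x2, given x1."""
    return np.tanh(x1), 0.5 * x1

def forward(x):
    """One affine coupling layer (invertible by construction): x -> z."""
    x1, x2 = x[:, 0], x[:, 1]
    s, t = s_and_t(x1)
    z2 = x2 * np.exp(s) + t
    log_det = s                     # because dz2/dx2 = exp(s)
    return np.stack([x1, z2], axis=1), log_det

def log_prob(x):
    """Exact log-density under a standard-normal base distribution (2-d)."""
    z, log_det = forward(x)
    log_base = -0.5 * np.sum(z ** 2, axis=1) - np.log(2. * np.pi)
    return log_base + log_det

x = np.random.default_rng(1).normal(size=(5, 2))
print(log_prob(x))                  # exact likelihoods, no approximation
```

Stacking many such layers, with learned conditioners, is essentially all that separates this toy from the real thing; the exact likelihood is what makes these attractive for principled inference.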

In the afternoon I spent a short session with Rodrigo Luger (Flatiron) talking about the information, encoded in a photometric light curve, about a stellar surface or an exoplanet surface. The information can come from rotation, or from transits, or both, and it is different (there is more information), oddly, if there is limb darkening! We talked about the main points such a paper should make, and some details of information theory. The problem is nice in part because if you transform the stellar surface map to spherical harmonics, a bunch of the calculations lead to beautiful trigonometric forms, and the degeneracy or eigenvector structure of the information tensor becomes very clear.
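
Here is a toy of the information-tensor point (my sketch; Luger's machinery does the real spherical-harmonic version). Because the light curve is linear in the surface-map coefficients, the Fisher information is just J^T J / sigma^2 for design matrix J, and its eigenstructure exposes the degeneracies. The harmonic-of-rotational-phase columns below are a stand-in for the projected spherical harmonics.

```python
import numpy as np

# Toy information tensor for a rigidly rotating star: the light curve is
# linear in the surface-map coefficients, so the Fisher matrix is
# J^T J / sigma^2. Null eigenvectors are map modes the photometry cannot see.
omega = 2. * np.pi             # rotation frequency (period = 1, assumed)
t = np.linspace(0., 10., 1000) # ten rotations of photometry
sigma = 1e-3                   # per-cadence photometric error (assumed)

# Stand-in basis: harmonics of rotational phase for m = 0, 1, 2. The second
# column is another axisymmetric mode: same time dependence as the first,
# hence an exact degeneracy (a null direction of the information tensor).
J = np.stack([np.ones_like(t),
              0.7 * np.ones_like(t),
              np.cos(omega * t), np.sin(omega * t),
              np.cos(2. * omega * t), np.sin(2. * omega * t)], axis=1)

F = J.T @ J / sigma ** 2       # Fisher information tensor
evals = np.linalg.eigvalsh(F)
print("information eigenvalues:", evals)   # one (near-)zero: the null space
```

Transits and limb darkening break some of these degeneracies by reweighting the surface differently, which is exactly why they add information.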

I had a good conversation today with Laura Chang (Princeton), who is interested in doing some work in the area of binary stars. We discussed the point that many of the very challenging things people have done with the Kepler data in the study of exoplanets (exoplanet detection, completeness modeling, population inferences) are very much easier in the study of eclipsing binary stars. And the numbers are very large: The total number of eclipsing binary systems found in the Kepler data is comparable to the total number of exoplanets found. And there are also K2 and TESS binaries! So there are a lot of neat projects to think about for constraining the short-period binary population with these data. We decided to start by figuring out what's been done already.

My only research events today were conversations with Eilers, Leistedt, and Pope about short-term strategies.

I spent a bit of research time today writing up my ideas about what we might do with The Snail (the local phase spiral in the vertical dynamics, discovered in Gaia data) to infer the gravitational potential (or force law, or density) in the Milky Way disk. The idea is to model it as an out-of-equilibrium disturbance winding up towards equilibrium. My strong intuition (which could be wrong) is that this is going to be amazingly constraining on the gravitational dynamics. I'm hoping it will be better (in both accuracy and precision) than equilibrium methods, like virial-theorem and Jeans models. I sent my hand-written notes to Hans-Walter Rix (MPIA) for comments.
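
To illustrate the winding intuition (a toy, not the contents of my notes): in an anharmonic vertical potential the oscillation frequency depends on amplitude, so an initially displaced clump of stars shears into a spiral in the (z, v_z) plane, and the winding rate encodes the potential. The force law and all numbers below are invented, and the units are arbitrary.

```python
import numpy as np

# Toy phase-spiral winding: leapfrog-integrate a displaced clump of stars in
# an assumed anharmonic vertical potential. Because the frequency falls with
# amplitude, the clump shears into a spiral in (z, v_z) whose pitch depends
# on the potential parameters nu and z0.
rng = np.random.default_rng(3)
n = 20000
z = rng.normal(1.0, 0.2, n)          # clump displaced from the midplane
vz = rng.normal(0.0, 0.2, n)

def accel(z, nu=1.0, z0=1.5):
    """Assumed vertical force: harmonic near the plane, softer beyond z0."""
    return -nu ** 2 * z / np.sqrt(1. + (z / z0) ** 2)

dt, n_steps = 0.05, 2000             # roughly 16 vertical periods of mixing
for _ in range(n_steps):
    vz += 0.5 * dt * accel(z)        # kick
    z += dt * vz                     # drift
    vz += 0.5 * dt * accel(z)        # kick

# A 2-d histogram of (z, v_z) now shows the spiral; fitting its pitch as a
# function of time would be the potential measurement.
H, zedges, vedges = np.histogram2d(z, vz, bins=64)
print("phase-space histogram built; spiral pitch encodes nu and z0")
```

The hoped-for advantage over equilibrium methods is that the spiral's pitch evolves deterministically under the potential, so it constrains the force law without assuming the disturbance has fully phase-mixed.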

I spent the day at Pheno 2019, where I gave a plenary about Gaia and dark matter. It was a fun day, and I learned a lot. For example, I learned that when you have a dark photon, you naturally get tiny couplings between the dark matter and the photon, as if the dark matter has a tiny charge. And there are good experiments looking for milli-charged particles. I learned that deep learning methods applied to LHC events are starting to approach information-theoretic bounds for classifying jets. That's interesting, because in the absence of a likelihood function, how do you saturate bounds? I learned that the Swampland (tm) is the set of effective field theories that can't be represented in any string theory. That's interesting: If we could show that there are many EFTs that are incompatible with string theory, then string theory has strong phenomenological content!

In the last talk of the day, Mangano (CERN) talked about the future of accelerators. He made a very interesting point, which I have kind-of known for a long time, but haven't seen articulated explicitly before: If you are doing a huge project to accomplish a huge goal (like build the LHC to find the Higgs), you need to design it such that you know you will produce lots and lots of interesting science along the way. That's an important idea, and it is a great design principle for scientific research.
