
Agenda Oct 2, 2015

White Paper

A unified story line - including perhaps a bank of slides to pass around.

A. Todd can report on his talk (I sent him some slides)
B. How did it go, and how can we exploit this opportunity?
C. More participants?
D. List of where the talks are and who is doing what.
  • Oct 2 Todd: Heliophysics Subcommittee
  • Oct 6 Keivan: Planetary Science Subcommittee (Rall) ⇒ NAC Science Committee
  • Oct 8-10 Mid-decadal (Lang) no report
  • Oct 14-15 Todd: Committee on Solar and Space Physics (CSSP)
  • Oct 23 Prisca: Astrophysics Subcommittee (APS)
  • Nov 2 Prisca call-in NAS Science Committee
  • Nov 3-5 Space Studies Board (Maloney, NAS) delayed to April
  • Nov 13 Prisca: AAAC

Submit a new survey

Drill down and fill in the gaps on Agency Statistics

  • Michael Cooke added to the DOE Cosmic Frontier Resource Page
  • Prisca and Michael will continue to work together on pulling together the relevant data. I remind you that DOE provides a counterexample to success rates - see the spreadsheet at that link. Other issues:
    • Demographic data (gender, race, age) are not requested (no database); they might be inferred from the comparative review notes. Early Career awards require < 10 yrs from PhD.
    • Data exist on whether a proposal is “new” to the HEP program vs. a “renewal”. A PI moving between research thrusts (aka “frontiers”) would be considered a “renewal” in this context. Is there data on someone who resubmits the same proposal the next year after it was rejected the first time?
    • Successful awards have public information on the institution, the PI, and the total amount of the award given by HEP.
    • They do NOT include the number of PIs on a grant, the total funding requested in the original proposal, or the breakdown of funding by frontier. DOE is considering how to capture that.
    • Limited in how far back you can go: HEP began relying on the comparative review process for proposals submitted to the FY 2012 funding cycle. Some data exist from before 2012, but they are less detailed and there are concerns about their accuracy.
    • Agency impact: the comparative review is an improvement over the previous mail-in-review-only process. The outcomes that we viewed were fair (per the COV).
    • Agency impact: successful at getting reviewers, particularly new reviewers: 153 reviewers participated in the FY 2015 comparative review process, in which 687 reviews were completed, an average of 4.9 reviews per proposal.
  • NASA: Linda, Hasan, and Daniel are willing to help, but have little time. What is the best use of their time? Can we get better/more merit data to fill in the gaps in years, and for Astro, Helio, and Planetary separately?
  • Helio and Planetary - we need a point person who will review what data already exist (see our long report) and what else we need. These data inform pre-proposal models; we need the latest data to update what we have.
  • NSF is short-handed for this work, but can be tapped to mine the data if we have a very specific question to ask. I would suggest the number of unique proposers per 2-year, 4-year, and 5-year window, to complement the existing 1-year and 3-year counts. Can we fit for the number of repeat proposals? Can we get this data for other agencies?
  • Put your ideas into the Agency stat link.
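The unique-proposer counts suggested above amount to a rolling-window tally over proposal records. The sketch below is purely illustrative, not an agency tool; the (year, PI) record format is an assumption about how the mined data might be delivered.

```python
# Hypothetical sketch: count distinct proposers in each rolling window
# of N consecutive fiscal years. The (year, PI) tuple format is an
# assumption, not an actual agency data schema.

from collections import defaultdict

def unique_proposers_per_window(records, window_years):
    """For each window of `window_years` consecutive fiscal years,
    count the distinct PIs who submitted at least one proposal."""
    by_year = defaultdict(set)
    for year, pi in records:
        by_year[year].add(pi)
    counts = {}
    for start in sorted(by_year):
        # Union the PI sets across the window; missing years contribute nothing.
        pis = set().union(*(by_year.get(y, set())
                            for y in range(start, start + window_years)))
        counts[(start, start + window_years - 1)] = len(pis)
    return counts

# Toy example: three PIs, one of whom proposes in consecutive years.
records = [(2012, "A"), (2012, "B"), (2013, "A"), (2014, "C")]
print(unique_proposers_per_window(records, 2))
# → {(2012, 2013): 2, (2013, 2014): 2, (2014, 2015): 1}
```

Comparing the 1-year counts with the wider windows would show how much of the proposer pool consists of repeat submitters, which is the quantity the fit question above is after.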
aaac/oct2.txt · Last modified: 2015/10/02 14:05 by prisca