At the September 2022 MIT incoming graduate student resources fair, ORCD made its first in-person event appearance, complete with live orchids.
Two orchids were offered as prizes for participants in our short survey – https://forms.gle/nKhn58UzXxzKafki6 . The orchid winners were
Bonnie Akhavan – MIT ORC
In conversation and in the survey we asked the graduate students what their research interests were. Interests were very broad, but there were some common themes. A number of students expressed interest in artificial intelligence, machine learning, optimization, natural language processing, and data science. Some students expressed interest in the software used in these areas, as well as medical and public health applications. There was interest in computational topics, such as distributed algorithms, numerical methods for PDEs, high performance computing, and scientific visualization. Students also listed interest in neuroscience, computational neuroscience, nuclear fusion, rheology, metrology, instrumentation, social and economic networks, dynamical systems, augmented/virtual reality, and cryptography.
When asked (qualitatively!) how much computing and data analysis their field does, students mainly responded that their field does a great deal of both.
To gauge the experience students had coming in (the main audience for the fair being new graduate students), we asked about their computing experience, broken down by system: desktop/laptop, cloud, cluster, or high performance computing/supercomputing. The results are below.
In terms of computing and data skills, students are interested in learning about cloud computing, supercomputing/high performance computing, parallel computing, and data analysis. More specifically, some students expressed interest in Tableau, parallelization of PDE solvers, parallel I/O, and artificial intelligence.
Finally, we asked students what computing or data resources they would like to get access to. A number of students mentioned a cluster or supercomputer, with specific requests for GPU accelerators and large-memory machines. They also mentioned cloud resources and access to Google Colab Pro.