e-Science Paradigm for Astroparticle Physics at KISTI
  • CC BY-NC (non-commercial license)
ABSTRACT
e-Science Paradigm for Astroparticle Physics at KISTI
KEYWORD
e-Science, astroparticle physics, dark matter
    1. INTRODUCTION

    Current research can be analyzed through big data within the framework of the e-Science paradigm. The e-Science paradigm unifies experiments, theories, and computing simulations around big data (Lin & Yen 2009). Hey explained that a few thousand years ago, science was described by experiments (Hey 2006). In the last few hundred years, science was described by theories, and in the last few decades, by computing simulations (Hey 2006). Today, science is described by big data through the unification of experiments, theories, and computing simulations (Cho et al. 2011).

    We introduce the e-Science paradigm in the search for new physics beyond the Standard Model, as shown in Fig. 1. It is not a mere collection of experiments, theories, and computing, but an efficient method of unifying these research efforts. In this paper, we show an application of the e-Science paradigm to astroparticle physics.

    Dark matter is one of the three principal constituents of the universe. Precision measurements in flavor physics have confirmed the Cabibbo-Kobayashi-Maskawa (CKM) theory (Kobayashi & Maskawa 1973). However, the Standard Model leaves many questions in particle physics unanswered, such as the origin of generations and masses, the pattern of mixing, and the abundance of antimatter. Astrophysical evidence indicates the existence of dark matter (Bertone et al. 2005). It has been established that the universe today consists of 26% dark matter and only 4% Standard Model particles, as shown in Fig. 2. Therefore, there are efforts to search for dark matter candidates using direct detection, indirect detection, and collider detection, as these could hint at new physics beyond the Standard Model. The e-Science paradigm for astroparticle physics is introduced to facilitate such research.

    2. METHODS

       2.1 Experiment-Computing (Cyber-Laboratory)

    For experiment-computing, we constructed the so-called “cyber-laboratory” e-Science research environment (Cho & Kim 2009). In collider experiments, there are three integral parts: data production, data processing, and data analysis. First, for data production, online shifts can be taken anywhere, not only in the on-site main control room but also in an off-site remote control room. Second, data processing is performed using a data handling system, through which offline shifts can also be taken; the goal is to handle big data across user communities (Cho 2007). Third, data analysis is organized so that collaborators can work together as if they were on-site; one example is the Enabling Virtual Organization (EVO) system. In collider physics, we have applied the cyber-laboratory to the CDF experiment at Fermilab in the United States and the Belle experiment at KEK in Japan.

       2.2 Computing-Theory

    Computing simulation is an important complement to experiment-computing. The Large Hadron Collider (LHC) experiments show that computing simulations require more Central Processing Unit (CPU) power than the reconstruction of experimental data. Since the cross sections of particles in new physics (e.g., dark matter candidates) are much smaller (< 10⁻⁶ nb) than that of the Higgs boson (10⁻³–10⁻¹ nb), at least a thousand times more data are needed than for the discovery of the Higgs boson. Therefore, the development of simulation toolkits is an urgent issue. Big data from Monte Carlo (MC) simulation is also needed to relate the experimentally measured variables at the LHC experiments to the parameters of the underlying theories (Papucci & Hoeche 2012). MC simulation is well suited to a distributed computing architecture because of its parallel nature.
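
    Because each simulated event is statistically independent, event generation can be split across many workers with different random seeds and the outputs merged afterwards. The following sketch illustrates this embarrassingly parallel pattern in Python with a toy generator; the function and the "event" content are hypothetical stand-ins, not part of any production chain.

        # Minimal sketch of embarrassingly parallel MC event generation.
        # "generate_events" is a toy stand-in for a real generator; only the
        # parallelization pattern is the point here.
        import random
        from multiprocessing import Pool

        def generate_events(args):
            seed, n_events = args
            rng = random.Random(seed)  # independent random stream per worker
            # A toy "event" is a single smeared energy value; a real generator
            # would return a full particle record.
            return [rng.gauss(10.0, 1.0) for _ in range(n_events)]

        if __name__ == "__main__":
            n_workers, n_total = 8, 1_000_000
            chunks = [(seed, n_total // n_workers) for seed in range(n_workers)]
            with Pool(n_workers) as pool:
                events = [e for chunk in pool.map(generate_events, chunks) for e in chunk]
            print(f"generated {len(events)} toy events")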

    The flowchart of a simulation for an astroparticle physics model is shown in Fig. 3 (Cho et al. 2015). From the astroparticle physics model, the Feynman diagrams are generated by MadGraph/MadDM (Alwall et al. 2014) and the events are generated by PYTHIA (Sjöstrand et al. 2014). Then, the detector simulation is performed using Geant4 (Agostinelli et al. 2003). In the last step, the output is analyzed with MadAnalysis (Conte et al. 2013) and ROOT (Antcheva et al. 2009). Astroparticle theory groups use supercomputers for MC simulations of physics beyond the Standard Model. The need for more CPU power demands a solution for cost-effective computing (Cho et al. 2015). For this purpose, we need to study simulation toolkits. Table 1 shows a comparison of the general features of several MC simulation toolkits (Cho 2012).
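
    As a rough illustration of how the stages in Fig. 3 can be chained, the sketch below drives the chain as a sequence of external commands. The executable names, options, and file names are placeholders only; they are not the actual command-line interfaces of MadGraph/MadDM, PYTHIA, Geant4, MadAnalysis, or ROOT, each of which has its own run cards, macros, and APIs.

        # Hypothetical driver for the simulation chain of Fig. 3.
        # Every command and file name below is a placeholder for illustration.
        import subprocess

        PIPELINE = [
            ["run_madgraph", "model_card.dat", "parton_events.lhe"],       # hard process
            ["run_pythia", "parton_events.lhe", "hadron_events.hepmc"],    # shower/hadronization
            ["run_geant4", "hadron_events.hepmc", "detector_hits.root"],   # detector simulation
            ["run_madanalysis", "detector_hits.root", "analysis.root"],    # analysis
        ]

        def run_chain():
            for cmd in PIPELINE:
                print("running:", " ".join(cmd))
                subprocess.run(cmd, check=True)  # stop the chain on the first failure

        if __name__ == "__main__":
            run_chain()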

    [Table 1.] Comparison of general features of several MC simulation toolkits (Cho 2012)
    Geant4 is a toolkit that simulates the interaction of particles with matter. It can be applied not only to accelerator physics, nuclear physics, high energy physics, and medical physics, but also to cosmic ray research and space science (Agostinelli et al. 2003). Geant4 is more accurate than other simulation toolkits with similar capabilities; however, it consumes much more CPU time. Therefore, we need to study the parallelization and optimization of the Geant4 toolkit.
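
    A simple way to exploit parallelism without modifying the toolkit itself is to launch several independent Geant4 jobs with different random seeds and merge their outputs afterwards (Geant4 also provides built-in event-level multithreading, which is not shown here). In the sketch below, the executable name and its options are hypothetical placeholders.

        # Sketch of run-level parallelism: independent simulation jobs with
        # distinct random seeds, executed concurrently.
        # "my_g4_app" and its options are hypothetical placeholders.
        import subprocess
        from concurrent.futures import ProcessPoolExecutor

        def run_job(seed):
            out = f"hits_{seed}.root"
            cmd = ["my_g4_app", "--macro", "run.mac", "--seed", str(seed), "--output", out]
            subprocess.run(cmd, check=True)
            return out

        if __name__ == "__main__":
            with ProcessPoolExecutor(max_workers=4) as pool:
                outputs = list(pool.map(run_job, range(4)))
            print("outputs to merge:", outputs)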

       2.3 Theory-Experiment

    For theory-experiment, tools for both theoretical and experimental analysis were developed. Experimental results constrain theoretical models, and theoretical models in turn provide predictions that can be tested against experimental results. As already done in particle physics, we applied these tools to astroparticle physics.

    3. RESULTS

    Using the cyber-laboratory for experiment-computing, dark matter production at an e⁺e⁻ collider was studied. Fig. 4 shows the production mechanism for dark matter in the Belle experiment (Essig et al. 2013). Fig. 5 shows a model in which the dark photon V can decay to Standard Model particles (Batell et al. 2009). The dark photon is a hypothetical particle in recently proposed dark sector models. Based on this model, a search for the dark photon, assumed to originate from the prompt vertex, was performed in the Belle experiment (Jaegle et al. 2015). Since no signal was observed, an upper limit on the dark photon production cross section can be set (Jaegle et al. 2015).
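
    For a counting experiment with zero observed events and negligible background, the 90% confidence-level upper limit on the Poisson mean is -ln(1 - 0.90) ≈ 2.3 events, and dividing by the signal efficiency and the integrated luminosity converts it into a cross-section limit. The sketch below shows this standard conversion; the numerical inputs are illustrative placeholders, not Belle values.

        # Toy cross-section upper limit for zero observed events and
        # negligible background (classical Poisson limit).
        # All numerical inputs are illustrative placeholders.
        import math

        cl = 0.90
        n_up = -math.log(1.0 - cl)       # 90% CL limit on the Poisson mean (about 2.30)

        efficiency = 0.25                # hypothetical signal efficiency
        luminosity_invfb = 100.0         # hypothetical integrated luminosity [fb^-1]

        sigma_up_fb = n_up / (efficiency * luminosity_invfb)   # N = sigma * eff * L
        print(f"N_up = {n_up:.2f} events -> sigma < {sigma_up_fb:.3f} fb at 90% CL")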

    For computing-theory, we show an example of Graphical Processing Unit (GPU) machines used in lattice Quantum ChromoDynamics (QCD). Lattice QCD, a non-perturbative approach to solving QCD (Wilson 1974), is well suited to GPU machines: compared to a CPU, a GPU has hundreds of optimized cores that process information simultaneously. We applied simulation toolkits to study the finite volume effects on B_K in lattice QCD (Kim & Cho 2015). The Charge-Parity (CP) violation parameter B_K (Bae et al. 2012) appears in physics beyond the Standard Model, and to reduce the systematic errors, the finite volume effects must be taken into account. Obtaining the fit results is difficult because of the large computing resources required (Kim et al. 2012). We used the e-Science paradigm for computing resources and developed a new algorithm to reduce CPU time. The result shows that the ratio of GPU performance to peak performance reaches 38% (Kim & Cho 2015), which indicates that the GPU was used intensively in the calculation. This method reduced the processing time significantly (Kim et al. 2011).
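
    The quoted 38% is a ratio of sustained to peak performance; such a figure is typically obtained from the floating-point operation count of a kernel and its measured run time, as in the short sketch below. The numbers are illustrative placeholders, not the measurements of Kim & Cho (2015).

        # Sustained-to-peak performance ratio of a GPU kernel.
        # Operation count, run time, and peak rating are placeholders.
        flops_executed  = 4.0e12     # floating-point operations in the kernel
        elapsed_seconds = 2.0        # measured wall-clock time of the kernel
        peak_gflops     = 5_000.0    # vendor peak rating of the GPU [GFLOP/s]

        sustained_gflops = flops_executed / elapsed_seconds / 1.0e9
        ratio = sustained_gflops / peak_gflops
        print(f"sustained: {sustained_gflops:.0f} GFLOP/s ({100 * ratio:.0f}% of peak)")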

    We also studied the optimization of the Geant4 simulation toolkit by comparing the CPU time consumed by various physics models. We generated one million events for each physics model and measured the CPU time. The beam was 1 GeV uranium and the target was liquid hydrogen. Geant4 version 10.2, released on December 4, 2015, was tested on the Tachyon II supercomputer at KISTI. Fig. 6 shows the CPU time for the various physics models. We then compared the simulation results for each physics model with the experimental data to determine the optimal physics model.
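
    A minimal way to reproduce this kind of comparison is to run the same job once per physics list and record the CPU time of each run. In the sketch below, the executable and its options are hypothetical placeholders; the list names (FTFP_BERT, QGSP_BERT, QGSP_BIC) are standard Geant4 reference physics lists used only as examples, not necessarily the models of Fig. 6.

        # Sketch of a CPU-time comparison across Geant4 reference physics lists.
        # "my_g4_app" and its options are hypothetical placeholders (Unix only,
        # since the resource module is used to read child-process CPU times).
        import resource
        import subprocess

        PHYSICS_LISTS = ["FTFP_BERT", "QGSP_BERT", "QGSP_BIC"]

        def cpu_time_of(physics_list):
            before = resource.getrusage(resource.RUSAGE_CHILDREN)
            subprocess.run(["my_g4_app", "--physics-list", physics_list,
                            "--macro", "run.mac"], check=True)
            after = resource.getrusage(resource.RUSAGE_CHILDREN)
            return (after.ru_utime - before.ru_utime) + (after.ru_stime - before.ru_stime)

        if __name__ == "__main__":
            for pl in PHYSICS_LISTS:
                print(f"{pl}: {cpu_time_of(pl):.1f} s of CPU time")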

    For theory-experiment, we applied to astroparticle physics the method already applied to particle physics. In the particle physics application, we compared the results of collider experiments with those of the left-right model (Cho & Nam 2013). Using the effective Hamiltonian approach, the B decay amplitude was obtained. The contribution of the right-handed current to the CP asymmetries in B decays was investigated and the allowed region was suggested (Cho & Nam 2013).
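
    For reference, the direct CP asymmetry compared between theory and experiment in such an analysis is conventionally defined as the normalized rate difference between a decay and its CP conjugate; in LaTeX notation (a standard textbook definition, not a formula quoted from Cho & Nam 2013):

        A_{CP}(B \to K\pi) = \frac{\Gamma(\bar{B} \to \bar{K}\pi) - \Gamma(B \to K\pi)}
                                  {\Gamma(\bar{B} \to \bar{K}\pi) + \Gamma(B \to K\pi)}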

    This method can also be applied to astroparticle physics. In a recent paper, a new light U(1) gauge boson related to the dark photon is proposed and constraints from stellar objects are derived (Jeong et al. 2015). The small mass of the new boson can turn various kinds of low energy experiments into new discovery machines, depending on the coupling of the new gauge boson to the Standard Model particles. It is therefore insightful to understand the properties of each type of new gauge boson and the current constraints on its mass. Using constraints from stellar objects, phenomenological models of particle physics have been tested (Jeong et al. 2015). The new gauge boson Z' can be produced in stars, including the Sun (Jeong et al. 2015). In supernovae, the process e⁺e⁻ → Z', with the Z' subsequently decaying to neutrinos, may need to be considered. In the case that the produced neutrinos cannot escape from the supernova, a lower limit exists (Jeong et al. 2015). Based on the results from a collection of astroparticle experiments, constraints on the U(1) gauge boson have been suggested (Jeong et al. 2015). This may provide a good example of theory-experiment in the e-Science paradigm.

    4. CONCLUSION

    As in particle physics, the e-Science paradigm that unifies experiment, theory, and computing has been introduced in astroparticle physics. For experiment-computing, we use a cyber-laboratory to study the problem of dark matter anytime and anywhere. For computing-theory, we study theoretical models using computing simulations and improve simulation toolkits to reduce CPU time. For theory-experiment, theoretical models are constructed to describe the experimental results.

    The research on dark matter and the dark photon serves as an example of physics beyond the Standard Model and of the application of the e-Science paradigm to astroparticle physics.

REFERENCES
  • 1. Agostinelli S, Allison J, Amako K, Apostolakis J, Araujo H (2003) Geant4―a simulation toolkit [Nucl. Instr. Meth. Phys. Res.] Vol.A506 P.250-303
  • 2. Alwall J, Frederix R, Frixione S, Hirschi V, Maltoni F (2014) The automated computation of tree-level and next-to-leading order differential cross sections, and their matching to parton shower simulations [J. High Energy Phys.] Vol.2014 P.79
  • 3. Antcheva I, Ballintjin M, Bellenot B, Biskup M, Brun R (2009) ROOT ― A C++ framework for petabyte data storage, statistical analysis and visualization [Comput. Phys. Commun.] Vol.180 P.2499-2512
  • 4. Bae T, Jang YC, Jung C, Kim HJ, Kim J (2012) Kaon B Parameter from Improved Staggered Fermions in Nf = 2+1 QCD [Phys. Rev. Lett.] Vol.109 P.041601
  • 5. Batell B, Pospelov M, Ritz A (2009) Probing a secluded U(1) at B factories [Phys. Rev. D] Vol.79 P.115008
  • 6. Bertone G, Hooper D, Silk J (2005) Particle dark matter: evidence, candidates and constraints [Phys. Rep.] Vol.405 P.279
  • 7. Cho K (2007) A test of the interoperability of grid middleware for the Korean High Energy Physics Data Grid system [Int. J. Comput. Sci. Netw. Secur.] Vol.7 P.49-54
  • 8. Cho S (21 Sep 2012) Monte Carlo Introduction [The 2nd Geant4 Tutorial]
  • 9. Cho K, Kim H (2009) Heavy Flavor Physics through e-Science [J. Korean Phys. Soc.] Vol.55 P.2045
  • 10. Cho K, Nam S (2013) Right-handed current contributions in B→Kπ decays [Phys. Rev. D] Vol.88 P.035012
  • 11. Cho K, Kim J, Nam S (2011) Collider physics based on e-Science paradigm of experiment-computing-theory [Comput. Phys. Commun.] Vol.182 P.1756
  • 12. Cho K, Kim J, Kim J (2015) Research and development of the evolving architecture for beyond the Standard Model [21st International Conference on Computing in High Energy and Nuclear Physics]
  • 13. Conte E, Fuks B, Serret G (2013) MadAnalysis 5, a user-friendly framework for collider phenomenology [Comput. Phys. Commun.] Vol.184 P.222-256
  • 14. Essig R, Mardon J, Papucci M, Volansky T, Zhong YM (2013) Constraining Light Dark Matter with Low-Energy e+e- Colliders [J. High Energy Phys.] Vol.2013 P.167
  • 15. Hey T (22-25 Oct 2006) e-Science and Cyberinfrastructure [The 20th International CODATA Conference]
  • 16. Jaegle I, Adachi I, Aihara H, Al Said S, Asner DM (2015) Search for the Dark Photon and the Dark Higgs Boson at Belle [Phys. Rev. Lett.] Vol.114 P.211801
  • 17. Jeong YS, Kim CS, Lee HS (2015) Constraints on the U(1)_L gauge boson in a wide mass range
  • 18. Kim J, Cho K (2015) A study on the optimization of finite volume effects of B_K in lattice QCD by using the CUDA [J. Korean Phys. Soc.] Vol.67 P.307-310
  • 19. Kim J, Kim HJ, Lee W, Jung C, Sharpe SR (10-16 July 2011) Finite Volume Errors in B_K [The 29th International Symposium on Lattice Field Theory]
  • 20. Kim J, Jung C, Kim HJ, Lee W, Sharpe SR (2012) Finite volume effects in B_K with improved staggered fermions [Phys. Rev. D] Vol.83 P.117501
  • 21. Kobayashi M, Maskawa T (1973) CP-Violation in the Renormalizable Theory of Weak Interaction [Prog. Theor. Phys.] Vol.49 P.652-657
  • 22. Lin SC, Yen E (2009) e-Science for High Energy Physics in Taiwan and Asia [J. Korean Phys. Soc.] Vol.55 P.2035
  • 23. Papucci M, Hoeche S (27-28 Nov 2012) Present and Future Computing Requirements for Theoretical Particle Physics [Large Scale Production Computing and Storage Requirements for High Energy Physics: Target 2017]
  • 24. Sjöstrand T, Ask S, Christiansen JR, Corke R, Desai N (2014) An Introduction to PYTHIA 8.2 [Comput. Phys. Commun.] Vol.191 P.159-177
  • 25. Wilson KG (1974) Confinement of quarks [Phys. Rev. D] Vol.10 P.2445
IMAGES / TABLES
  • [ Fig. 1. ]  Paradigm of e-Science in astroparticle physics represented as a unification of experiment, theory, and computing.
  • [ Fig. 2. ]  The universe today consists of 26% of dark matter and 4% of the Standard Model particles.
  • [ Fig. 3. ]  Flowchart of a simulation for an astroparticle physics model (Cho et al. 2015).
  • [ Table 1. ]  Comparison of general features of several MC simulation toolkits (Cho 2012)
  • [ Fig. 4. ]  Production mechanism of dark matter particles in e⁺e⁻ collision (Essig et al. 2013).
  • [ Fig. 5. ]  Branching ratios for dark photon decay to the Standard Model particles V → e⁺e⁻ (dashed), V → μ⁺μ⁻ (dotted), V → τ⁺τ⁻ (dotted-dashed), and V → hadrons (solid) (Batell et al. 2009).
  • [ Fig. 6. ]  CPU time of Geant4 simulations for various physics models.