2015, Vol. 30, Issue (3): 639-654. DOI: 10.1007/s11390-015-1550-1

Special Issue: Computer Architecture and Systems; Software Systems

• Regular Paper •

An Early Evaluation and Comparison of Three Private Cloud Computing Software Platforms

Farrukh Nadeem1, Rizwan Qaiser2   

  1. Department of Information Systems, Faculty of Computing and Information Technology, King Abdulaziz University, Jeddah 21589, Saudi Arabia;
    2. Department of Computer Science, National University of Computer & Emerging Sciences, Lahore 54500, Pakistan
  • Received: 2014-06-18; Revised: 2015-01-19; Online: 2015-05-05; Published: 2015-05-05
  • About author: Farrukh Nadeem received his Ph.D. degree in computer science in 2009 from the University of Innsbruck, Austria. Currently, he is an assistant professor at King Abdulaziz University, Jeddah. His main research interests include performance modeling and prediction, and scheduling scientific workflows in distributed systems, particularly the Grid and the Cloud. He has been involved in several Austrian and Saudi research and development projects. Farrukh has authored more than 22 papers, including four book chapters.

Cloud computing, after its success as a commercial infrastructure, is now emerging as a private infrastructure. The software platforms available for building private cloud computing infrastructure vary both in their performance for managing cloud resources and in their utilization of local physical resources. Organizations and individuals seeking to reap the benefits of private cloud computing need to understand which software platform would provide efficient services and optimal utilization of cloud resources for their target applications. In this paper, we present our initial study on the performance evaluation and comparison of three cloud computing software platforms from the perspective of common cloud users who intend to build their private clouds. We compare the performance of the selected software platforms in several respects, describing their suitability for applications from different domains. Our results highlight the critical parameters for the performance evaluation of a software platform, and identify the best software platform for different application domains.
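The kind of measurement such an evaluation rests on can be illustrated with a small timing harness. This is only a sketch: `provision_vm` below is a hypothetical stand-in for a platform operation (e.g. virtual machine instantiation), not code from the paper; a real evaluation would invoke the platform's management API or CLI at that point.

```python
import statistics
import time


def time_operation(operation, repetitions=5):
    """Run `operation` several times; return (mean, stdev) of wall-clock seconds."""
    samples = []
    for _ in range(repetitions):
        start = time.perf_counter()
        operation()
        samples.append(time.perf_counter() - start)
    return statistics.mean(samples), statistics.stdev(samples)


def provision_vm():
    # Hypothetical stand-in for a cloud-platform management operation,
    # such as instantiating a VM through the platform's API.
    time.sleep(0.01)


mean_s, stdev_s = time_operation(provision_vm)
print(f"mean={mean_s:.4f}s stdev={stdev_s:.4f}s")
```

Repeating each operation and reporting both mean and spread, rather than a single run, is what makes cross-platform comparisons of management overhead meaningful.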

[1] Vaquero L M, Rodero-Merino L, Caceres J, Lindner M. A break in the Clouds: Towards a cloud definition. ACM SIGCOMM Comput. Commun. Rev., 2009, 39(1): 50–55.

[2] Shafer J. I/O virtualization bottlenecks in cloud computing today. In Proc. the 2nd Workshop on I/O Virtualization, March 2010, pp.5–12. http://li46-224.members.linode.com/publications/papers/shafer-wiov2010.pdf, Dec. 2014.

[3] Wardley S, Goyer E, Barcet N. Ubuntu enterprise Cloud architecture. Technical White Paper, Ubuntu, 2009. http://www.ubuntu.com/sites/default/files/active/White-%20Paper%20Ubuntu%20Enterprise%20Cloud%20Architecture%20v1.pdf, Apr. 2015.

[4] Nurmi D, Wolski R, Grzegorczyk C, Obertelli G, Soman S, Youseff L, Zagorodnov D. Eucalyptus: An open-source cloud computing infrastructure. Journal of Physics: Conference Series, 2009, 180(1): 012051.

[5] Saavedra R H, Smith A J. Analysis of benchmark characteristics and benchmark performance prediction. ACM Trans. Comput. Syst., 1996, 14(4): 344–384.

[6] Pu X, Liu L, Mei Y, Sivathanu S, Koh Y, Pu C. Understanding performance interference of I/O workload in virtualized cloud environments. In Proc. the 3rd IEEE CLOUD, July 2010, pp.51–58.

[7] Mucci P J, London K, Thurman J. The CacheBench report. Technical Report, ut-cs-98-394, University of Tennessee, 1998. http://icl.cs.utk.edu/projects/llcbench/cachebench.pdf, Jan. 2015.

[8] Schüller F. Grid computing with standard test cases for a meteorological limited area model [Master's Thesis]. Institute of Meteorology and Geophysics, University of Innsbruck, 2007. http://imgi.uibk.ac.at/file/157/download?token=KQL58LB2, Jan. 2015.

[9] Deshane T, Shepherd Z, Matthews J, Ben-Yehuda M, Shah A, Rao B. Quantitative comparison of Xen and KVM. In Proc. Xen Summit, June 2008. https://www.researchgate.net/publication/228772586_Quantitative_comparison_of_Xen_and_KVM/links/004635229734fe42f6000000, Jan. 2015.

[10] Reddy V V, Rajamani L. Evaluation of different hypervisors performance in the private cloud with SIGAR framework. International Journal of Advanced Computer Science and Applications, 2014, 5(2): 60–66.

[11] Palankar M R, Iamnitchi A, Ripeanu M, Garfinkel S. Amazon S3 for science grids: A viable solution? In Proc. the 2008 International Workshop on Data-Aware Distributed Computing, June 2008, pp.55–64.

[12] Stantchev V. Performance evaluation of cloud computing offerings. In Proc. the 3rd International Conference on Advanced Engineering Computing and Applications in Sciences, Oct. 2009, pp.187–192.

[13] Khurshid A, Al-Nayeem A, Gupta I. Performance evaluation of the Illinois cloud computing testbed. Technical Report, Department of Computer Science, University of Illinois at Urbana-Champaign, 2009.

[14] Iosup A, Ostermann S, Yigitbasi M N, Prodan R, Fahringer T, Epema D H. Performance analysis of cloud computing services for many-tasks scientific computing. IEEE Transactions on Parallel and Distributed Systems, 2011, 22(6): 931–945.

[15] Garfinkel S. An evaluation of Amazon's Grid computing services: EC2, S3 and SQS. Technical Report, TR-08-07, School for Engineering and Applied Sciences, Harvard University, Cambridge, MA, 2007.

[16] Juve G, Deelman E, Vahi K, Mehta G, Berriman B, Berman B P, Maechling P. Data sharing options for scientific workflows on Amazon EC2. In Proc. the 2010 ACM/IEEE International Conference for High Performance Computing, Networking, Storage and Analysis, Nov. 2010.

[17] Kobayashi K, Mikami S, Kimura H, Tatebe O. The Gfarm file system on compute Clouds. In Proc. IEEE International Symposium on Parallel and Distributed Processing Workshops and PhD Forum, May 2011, pp.1034–1041.

[18] Wang L, Zhan J, Shi W, Liang Y. In Cloud, can scientific communities benefit from the economies of scale? IEEE Transactions on Parallel and Distributed Systems, 2012, 23(2): 296–303.

[19] Walker E. Benchmarking Amazon EC2 for high-performance scientific computing. ;login:, 2008, 33(5): 18–23.

[20] Ostermann S, Iosup A, Yigitbasi N, Prodan R, Fahringer T, Epema D. A performance analysis of EC2 cloud computing services for scientific computing. In Proc. the 1st International Conference on Cloud Computing, Oct. 2009, pp.115–131.

[21] Jackson K, Ramakrishnan L, Muriki K, Canon S, Cholia S, Shalf J, Wasserman H J, Wright N. Performance analysis of high performance computing applications on the Amazon web services cloud. In Proc. the 2nd IEEE International Conference on Cloud Computing Technology and Science (CloudCom), Nov.30–Dec.3, 2010, pp.159–168.

[22] Ye X, Lv A, Zhao L. Research of high performance computing with clouds. In Proc. the 3rd International Symposium on Computer Science and Computational Technology, Aug. 2010, pp.289–293.

[23] Ahuja S P, Man S. The state of high performance computing in the cloud. Journal of Emerging Trends in Computing and Information Sciences, 2012, 3(2): 262–266.

[24] Nadeem F, Fahringer T. Optimizing execution time predictions of scientific workflow applications in the Grid through evolutionary programming. Future Generation Computer Systems, 2013, 29(4): 926–935.

[25] Gupta A, Milojicic D. Evaluation of HPC applications on cloud. In Proc. the 6th Open Cirrus Summit (OCS), Oct. 2011, pp.22–26.

[26] Juve G, Deelman E, Berriman G, Berman B, Maechling P. An evaluation of the cost and performance of scientific workflows on Amazon EC2. Journal of Grid Computing, 2012, 10(1): 5–21.

[27] Juve G, Deelman E, Vahi K, Mehta G, Berriman B, Berman B, Maechling P. Scientific workflow applications on Amazon EC2. In Proc. the 5th IEEE International Conference on E-Science Workshops, Dec. 2009, pp.59–66.

[28] Vecchiola C, Pandey S, Buyya R. High performance cloud computing: A view of scientific applications. In Proc. the 10th International Symposium on Pervasive Systems, Algorithms, and Networks, Dec. 2009, pp.4–16.

[29] Evangelinos C, Hill C N. Cloud computing for parallel scientific HPC applications: Feasibility of running coupled atmosphere-ocean climate models on Amazon's EC2. In Proc. the 1st CCA, Oct. 2008.

[30] Li A, Yang X, Kandula S, Zhang M. CloudCmp: Comparing public cloud providers. In Proc. the 10th ACM SIGCOMM Conference on Internet Measurement, Nov. 2010, pp.1–14.

[31] Ward J S. A performance comparison of Clouds: Amazon EC2 and Ubuntu enterprise Cloud. Technical Report, Cloud Computing Co-laboratory. University of St Andrews, SICSA DemoFEST, 2009.

[32] Tudoran R, Costan A, Antoniu G et al. A performance evaluation of Azure and Nimbus clouds for scientific applications. In Proc. the 2nd International Workshop on Cloud Computing Platforms, Apr. 2012, Article No. 4.

[33] Voras I, Orlić M, Mihaljević B. An early comparison of commercial and open-source cloud platforms for scientific environments. In Proc. the 6th KES International Conference on Agent and MultiAgent Systems: Technologies and Applications, June 2012, pp.164–173.

[34] Popović O, Jovanović Z, Jovanović N, Popović R. A comparison and security analysis of the cloud computing software platforms. In Proc. the 10th International Conference on Telecommunication in Modern Satellite Cable and Broadcasting Services (TELSIKS), Vol.2, Oct. 2011, pp.632–634.

[35] Garg S K, Versteeg S, Buyya R. A framework for ranking of Cloud computing services. Future Generation Computer Systems, 2013, 29(4): 1012–1023.
ISSN 1000-9000 (Print)
ISSN 1860-4749 (Online)
CN 11-2296/TP

Journal of Computer Science and Technology
Institute of Computing Technology, Chinese Academy of Sciences
P.O. Box 2704, Beijing 100190 P.R. China
Tel.:86-10-62610746
E-mail: jcst@ict.ac.cn
  Copyright ©2015 JCST, All Rights Reserved