Statistical Methods for Computational Markets: Proportional Share Market Prediction and Admission Control
KTH, School of Information and Communication Technology (ICT), Computer and Systems Sciences, DSV.
2008 (English). Doctoral thesis, comprehensive summary (Other scientific)
Abstract [en]

We design, implement and evaluate statistical methods for managing uncertainty when consuming and provisioning resources in a federated computational market. To enable efficient allocation of resources in this environment, providers need to know consumers' risk preferences and the expected future demand. The guarantee levels to offer thus depend on techniques to forecast future usage and to accurately capture and model uncertainties. Our main contribution in this thesis is threefold: first, we evaluate a set of techniques to forecast demand in computational markets; second, we design a scalable method which captures a succinct summary of usage statistics and allows consumers to express risk preferences; and finally, we propose a method for providers to set resource prices and determine guarantee levels to offer. The methods employed are based on fundamental concepts in probability theory, and are thus easy to implement, as well as to analyze and evaluate. The key component of our solution is a predictor that dynamically constructs approximations of the price probability density and quantile functions for arbitrary resources in a computational market. Because highly fluctuating and skewed demand is common in these markets, it is difficult to accurately and automatically construct representations of arbitrary demand distributions. We discovered that a technique based on the Chebyshev inequality and empirical prediction bounds, which estimates worst-case bounds on deviations from the mean given a variance, provided the most reliable forecasts for a set of representative high-performance and shared cluster workload traces. We further show how these forecasts can help consumers determine how much to spend given a risk preference, and how providers can offer admission control services with different guarantee levels given a recent history of resource prices.
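
As a rough sketch of the kind of Chebyshev-based worst-case bound described above (the function name, the confidence parameter `delta`, and the example prices are illustrative assumptions, not the thesis implementation), an upper price bound can be computed from only the sample mean and variance:

```python
import statistics

def chebyshev_upper_bound(prices, delta=0.05):
    """Worst-case upper bound on the next price: by the (two-sided)
    Chebyshev inequality, the probability of exceeding it is at most
    `delta`, given only the sample mean and standard deviation."""
    mu = statistics.mean(prices)
    sigma = statistics.stdev(prices)
    k = (1.0 / delta) ** 0.5          # P(|X - mu| >= k*sigma) <= 1/k**2 = delta
    return mu + k * sigma

# Hypothetical recent spot prices for one resource (arbitrary numbers).
recent_prices = [0.8, 1.1, 0.9, 2.5, 1.0, 0.7, 3.2, 1.2]
print(chebyshev_upper_bound(recent_prices, delta=0.05))
```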

Place, publisher, year, edition, pages
Stockholm: KTH, 2008. xii, 75 p.
Series
Report series / DSV, ISSN 1101-8526 ; 08-006
Keyword [en]
Distributed Systems, Grid Computing, Performance Analysis, Workload Modeling, Middleware, Quality of Service, Prediction, Admission Control
National Category
Information Science
Identifiers
URN: urn:nbn:se:kth:diva-4738, ISBN: 978-91-7178-924-2 (print), OAI: oai:DiVA.org:kth-4738, DiVA: diva2:13707
Public defence
2008-05-26, Hall C, KTH-Forum, Isafjordsgatan 39, Stockholm, 13:00
Note
QC 20100909. Available from: 2008-05-09. Created: 2008-05-09. Last updated: 2010-09-09. Bibliographically approved.
List of papers
1. Market-Based Resource Allocation using Price Prediction in a high performance computing Grid for scientific applications
2006 (English). In: Proceedings of the 15th IEEE International Symposium on High Performance Distributed Computing (HPDC 2006), 2006, pp. 132-143. Conference paper, Published paper (Refereed)
Abstract [en]

We present the implementation and analysis of a market-based resource allocation system for computational Grids. Although Grids provide a way to share resources and take advantage of statistical multiplexing, a variety of challenges remain. One is the economically efficient allocation of resources to users from disparate organizations who have their own and sometimes conflicting requirements for both the quantity and quality of services. Another is secure and scalable authorization despite rapidly changing allocations.

Our solution to both of these challenges is to use a market-based resource allocation system. This system allows users to express diverse quantity- and quality-of-service requirements, yet prevents them from denying service to other users. It does this by providing tools that let the user predict and trade off risk and expected return in the computational market. In addition, the system enables secure and scalable authorization by using signed money-transfer tokens instead of identity-based authorization. This removes the overhead of maintaining and updating access control lists, while restricting usage based on the amount of money transferred. We examine the performance of the system by running a bioinformatics application on a fully operational implementation of an integrated Grid market.

Keyword
Computational methods, Marketing, Multiplexing, Quality of service, Resource allocation, Statistical methods, Computational market, Computing grid, Market based resource allocation system, Quantity of service
National Category
Industrial Biotechnology; Information Science
Identifiers
urn:nbn:se:kth:diva-7799 (URN), 10.1109/HPDC.2006.1652144 (DOI), 000239086500011 (), 2-s2.0-33845897137 (Scopus ID)
Conference
The IEEE International Symposium on High Performance Distributed Computing 2006
Note
QC 20100622. Available from: 2007-12-10. Created: 2007-12-10. Last updated: 2010-11-16. Bibliographically approved.
2. Evaluating Demand Prediction Techniques for Computational Markets
2006 (English). In: GECON '06: Proceedings of the 3rd International Workshop on Grid Economics and Business Models, 2006. Conference paper, Published paper (Refereed)
National Category
Information Science
Identifiers
urn:nbn:se:kth:diva-8393 (URN), 10.1142/9789812773470_0001 (DOI)
Note
QC 20100908. Available from: 2008-05-09. Created: 2008-05-09. Last updated: 2010-09-08. Bibliographically approved.
3. A Statistical Approach to Risk Mitigation in Computational Markets
2007 (English). In: Proceedings of the 16th International Symposium on High Performance Distributed Computing (HPDC '07), 2007, pp. 85-96. Conference paper, Published paper (Refereed)
Abstract [en]

We study stochastic models to mitigate the risk of poor Quality-of-Service (QoS) in computational markets. Consumers who purchase services expect both price and performance guarantees. They need to predict future demand to budget for sustained performance despite price fluctuations. Conversely, providers need to estimate demand to price future usage. The skewed and bursty nature of demand in large-scale computer networks challenges the common statistical assumptions of symmetry, independence, and stationarity. This discrepancy leads to underestimation of investment risk. We confirm this non-normal distribution behavior in our study of demand in computational markets. The high agility of a dynamic resource market requires flexible, efficient, and adaptable predictions. Computational needs are typically expressed using performance levels, hence we estimate worst-case bounds of price distributions to mitigate the risk of missing execution deadlines. To meet these needs, we use moving time windows of statistics to approximate price percentile functions. We use snapshots of summary statistics to calculate prediction intervals and estimate model uncertainty. Our approach is model-agnostic, distribution-free both in prices and prediction errors, and does not require extensive sampling or manual parameter tuning. Our simulations and experiments show that a Chebyshev inequality model generates accurate predictions with minimal sample data requirements. We also show that this approach mitigates the risk of dropped service-level performance when selecting hosts to run a bag-of-tasks Grid application simulation in a live computational market cluster.
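
A minimal sketch of the moving-window idea, assuming pandas and an invented window length and tail probability (this is not the paper's code), could approximate an upper price percentile from rolling summary statistics like this:

```python
import pandas as pd

# Hypothetical price series; window length and names are illustrative only.
prices = pd.Series([1.0, 1.2, 0.9, 3.0, 1.1, 0.8, 2.4, 1.3, 1.0, 4.1])

def rolling_chebyshev_bound(prices, window=5, delta=0.1):
    """Approximate an upper price percentile from a moving window of
    summary statistics (mean and standard deviation) using the
    Chebyshev inequality, i.e. without assuming any price distribution."""
    mu = prices.rolling(window).mean()
    sigma = prices.rolling(window).std()
    k = (1.0 / delta) ** 0.5
    return mu + k * sigma             # exceeded with probability <= delta

print(rolling_chebyshev_bound(prices))
```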

Keyword
QoS, Resource allocation, Service level management
National Category
Information Science
Identifiers
urn:nbn:se:kth:diva-8394 (URN), 10.1145/1272366.1272378 (DOI), 2-s2.0-34548089128 (Scopus ID)
Note
QC 20100908. Available from: 2008-05-09. Created: 2008-05-09. Last updated: 2010-09-08. Bibliographically approved.
4. Prediction-Based Enforcement of Performance Contracts
2007 (English). In: Grid Economics and Business Models / [ed] Altmann, J.; Veit, D. J., 2007, Vol. 4685, pp. 71-82. Conference paper, Published paper (Refereed)
Abstract [en]

Grid computing platforms require automated and distributed resource allocation with controllable quality-of-service (QoS). Market-based allocation provides these features using the complementary abstractions of proportional shares and reservations. This paper analyzes a hybrid resource allocation system using both proportional shares and reservations. We also examine the use of price prediction to provide statistical QoS guarantees and to set admission control prices.
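
Purely as an illustration of the proportional-share abstraction (the function, the reservation handling, and the example bids are hypothetical, not the system evaluated in the paper), a bidder's share can be computed as its bid divided by the sum of all bids over the non-reserved capacity:

```python
def proportional_shares(bids, capacity=1.0, reserved=0.0):
    """Illustrative proportional-share allocation: each bidder receives a
    fraction of the non-reserved capacity equal to its bid divided by the
    sum of all bids."""
    pool = capacity - reserved        # capacity left after reservations
    total = sum(bids.values())
    return {user: pool * bid / total for user, bid in bids.items()}

# Hypothetical bids in an arbitrary currency.
print(proportional_shares({"alice": 3.0, "bob": 1.0}, reserved=0.2))
```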

Series
Lecture Notes in Computer Science, ISSN 0302-9743; 4685
Keyword
admission control, proportional share, computational market
National Category
Computer and Information Science
Identifiers
urn:nbn:se:kth:diva-8395 (URN), 10.1007/978-3-540-74430-6_6 (DOI), 000249580900006 (), 2-s2.0-38049079767 (Scopus ID), 978-3-540-74428-3 (ISBN)
Conference
4th International Workshop on Grid Economics and Business Models (GECON 2007), Rennes, France, August 28, 2007
Available from: 2008-05-09. Created: 2008-05-09. Last updated: 2011-09-27. Bibliographically approved.
5. Autoregressive Time Series Forecasting of Computational Demand
2007 (English). Report (Other academic)
Abstract [en]

We study the predictive power of autoregressive moving average models when forecasting demand in two shared computational networks, PlanetLab and Tycoon. Demand in these networks is highly volatile, and predictive techniques for planning usage in advance can drastically improve the performance obtained.

Our key finding is that a random walk predictor performs best for one-step-ahead forecasts, whereas ARIMA(1,1,0) and adaptive exponential smoothing models perform better for two- and three-step-ahead forecasts. A Monte Carlo bootstrap test is proposed to evaluate the continuous prediction performance of different models with arbitrary confidence and statistical significance levels. Although the prediction results differ between the Tycoon and PlanetLab networks, we observe very similar overall statistical properties, such as volatility dynamics.
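
For illustration only (the demand series below is synthetic and the use of statsmodels is an assumption, not the report's setup), the two predictors can be compared roughly as follows:

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Synthetic demand series standing in for a real workload trace.
rng = np.random.default_rng(0)
demand = np.cumsum(rng.normal(size=200)) + 50

# Random-walk predictor: the one-step-ahead forecast is simply the last value.
rw_forecast = demand[-1]

# ARIMA(1,1,0) forecasts for one, two and three steps ahead.
model = ARIMA(demand, order=(1, 1, 0)).fit()
arima_forecast = model.forecast(steps=3)

print(rw_forecast, arima_forecast)
```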

Series
Technical Report
National Category
Information Science
Identifiers
urn:nbn:se:kth:diva-8396 (URN)
Note
QC 20100909. Available from: 2008-05-09. Created: 2008-05-09. Last updated: 2010-09-09. Bibliographically approved.
6. Admission Control in a Computational Market
2008 (English). In: Proceedings CCGRID 2008 - 8th IEEE International Symposium on Cluster Computing and the Grid / [ed] Priol, T.; Lefevre, L.; Buyya, R., 2008, pp. 277-286. Conference paper, Published paper (Refereed)
Abstract [en]

We propose, implement and evaluate three admission models for computational Grids. The models take the expected demand into account and offer a specific performance guarantee. The main issue addressed is how users and providers should make the tradeoff between a best-effort (low-guarantee) spot market and an admission-controlled (high-guarantee) reservation market. Using a realistically modeled high-performance computing workload and utility models of user preferences, we run experiments highlighting the conditions under which different markets and admission models are efficient. The experimental results show that providers can make large efficiency gains if the admission model is chosen dynamically based on the current load; likewise, we show that users can optimize their job performance by carefully picking the right market based on the state of the system and the characteristics of the application to be run. Finally, we provide simple functional expressions that can guide both users and providers when making decisions about guarantee levels to request or offer.
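
As a loosely hedged illustration of a load-dependent choice between the two markets (the rule, threshold, and names below are invented and are not the functional expressions derived in the paper):

```python
def choose_market(current_load, deadline_slack, load_threshold=0.8):
    """Purely illustrative decision rule: prefer the admission-controlled
    reservation market when the system is heavily loaded or the job has
    little slack before its deadline; otherwise use the cheaper
    best-effort spot market."""
    if current_load > load_threshold or deadline_slack < 0.2:
        return "reservation"          # pay for a high guarantee level
    return "spot"                     # accept price/performance risk

print(choose_market(current_load=0.9, deadline_slack=0.5))
```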

Keyword
Chlorine compounds, Computer networks, Computer systems, Decision making, Technical presentations, Cluster computing, High performance computing, International symposium
National Category
Information Science
Identifiers
urn:nbn:se:kth:diva-8397 (URN), 10.1109/CCGRID.2008.82 (DOI), 000270502300035 (), 2-s2.0-50649124960 (Scopus ID), 978-0-7695-3156-4 (ISBN)
Note
QC 20100909. Available from: 2008-05-09. Created: 2008-05-09. Last updated: 2010-09-09. Bibliographically approved.

By author/editor
Sandholm, Thomas
By organisation
Computer and Systems Sciences, DSV