  • 101. Doorn, N.
    et al.
    Hansson, Sven Ove
    KTH, School of Architecture and the Built Environment (ABE), Philosophy and History of Technology, Philosophy.
    Design for the value of safety (2015). In: Handbook of Ethics, Values, and Technological Design: Sources, Theory, Values and Application Domains, Springer Netherlands, 2015, p. 491-511. Chapter in book (Other academic)
    Abstract [en]

    Two major methods for achieving safety in engineering design are compared: safety engineering and probabilistic risk analysis. Safety engineering employs simple design principles or rules of thumb, such as inherent safety, multiple barriers, and numerical safety margins, to reduce the risk of accidents. Probabilistic risk analysis combines the probabilities of individual events in event chains leading to accidents in order to identify design elements in need of improvement, and often also to optimize the use of resources. It is proposed that the two methodologies should be seen as complementary rather than as competitors. Probabilistic risk analysis is at its best when meaningful probability estimates are available for most of the major events that may contribute to an accident. Safety engineering principles are more suitable for dealing with uncertainties that defy quantification. In many design tasks, the combined use of both methodologies is preferable.
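
    As a toy illustration of the event-chain combination step that probabilistic risk analysis performs, the following sketch combines made-up per-event probabilities along independent chains; all event names and numbers are hypothetical and not from the chapter:

```python
# Minimal sketch of probabilistic risk analysis: combine per-event
# probabilities along independent event chains leading to an accident.
# All event names and probabilities below are hypothetical.

from math import prod

# Each chain is a sequence of events that must all occur for the accident.
event_chains = {
    "valve failure -> alarm missed -> overpressure": [1e-3, 1e-2, 5e-2],
    "power loss -> backup fails": [1e-4, 1e-2],
}

# Probability that a single chain completes (events assumed independent).
chain_prob = {name: prod(p) for name, p in event_chains.items()}

# Probability that at least one chain leads to an accident.
p_accident = 1.0 - prod(1.0 - p for p in chain_prob.values())

for name, p in sorted(chain_prob.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {p:.2e}")   # ranks design elements needing improvement
print(f"overall accident probability: {p_accident:.2e}")
```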

  • 102. Douc, Randal
    et al.
    Moulines, Eric
    Olsson, Jimmy
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.).
    Long-term stability of sequential Monte Carlo methods under verifiable conditions (2014). In: The Annals of Applied Probability, ISSN 1050-5164, E-ISSN 2168-8737, Vol. 24, no 5, p. 1767-1802. Article in journal (Refereed)
    Abstract [en]

    This paper discusses particle filtering in general hidden Markov models (HMMs) and presents novel theoretical results on the long-term stability of bootstrap-type particle filters. More specifically, we establish that the asymptotic variance of the Monte Carlo estimates produced by the bootstrap filter is uniformly bounded in time. In contrast to most previous results of this type, which in general presuppose that the state space of the hidden state process is compact (an assumption that is rarely satisfied in practice), our very mild assumptions are satisfied for a large class of HMMs with possibly non-compact state space. In addition, we derive a similar time-uniform bound on the asymptotic L-p error. Importantly, our results hold for misspecified models; that is, we do not assume that the data entering the particle filter originate from the model governing the dynamics of the particles, or even from an HMM.
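
    A minimal bootstrap particle filter of the kind the paper analyzes, here run on a toy linear-Gaussian state-space model (illustrative only; the paper's setting is far more general):

```python
# Bootstrap particle filter on a toy scalar HMM:
#   X_t = 0.9 X_{t-1} + V_t,   Y_t = X_t + W_t,   V, W ~ N(0, 1).
import numpy as np

rng = np.random.default_rng(0)
T, N = 200, 1000                        # time steps, particles

x = np.zeros(T)
for t in range(1, T):
    x[t] = 0.9 * x[t-1] + rng.normal()
y = x + rng.normal(size=T)              # noisy observations

particles = rng.normal(size=N)
est = np.zeros(T)
for t in range(T):
    particles = 0.9 * particles + rng.normal(size=N)   # mutate (prior kernel)
    logw = -0.5 * (y[t] - particles) ** 2              # Gaussian likelihood
    w = np.exp(logw - logw.max()); w /= w.sum()
    est[t] = w @ particles                             # filter mean estimate
    particles = rng.choice(particles, size=N, p=w)     # multinomial resampling

print("mean abs filtering error:", np.mean(np.abs(est - x)))
```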

  • 103. Douc, Randal
    et al.
    Moulines, Eric
    Rydén, Tobias
    Lund University.
    Asymptotic properties of the maximum likelihood estimator in autoregressive models with Markov regime (2004). In: Annals of Statistics, ISSN 0090-5364, E-ISSN 2168-8966, Vol. 32, no 5, p. 2254-2304. Article in journal (Refereed)
    Abstract [en]

    An autoregressive process with Markov regime is an autoregressive process for which the regression function at each time point is given by a nonobservable Markov chain. In this paper we consider the asymptotic properties of the maximum likelihood estimator in a possibly nonstationary process of this kind for which the hidden state space is compact but not necessarily finite. Consistency and asymptotic normality are shown to follow from uniform exponential forgetting of the initial distribution for the hidden Markov chain conditional on the observations.

  • 104.
    Drugge, Daniel
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Allocation Methods for Alternative Risk Premia Strategies (2014). Independent thesis, Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis
    Abstract [en]

    We use regime switching and regression tree methods to evaluate performance in the risk premia strategies provided by Deutsche Bank and constructed from U.S. research data from the Fama French library. The regime switching method uses the Baum-Welch algorithm at its core and splits return data into a normal and a turbulent regime. Each regime is independently evaluated for risk, and the estimates are then weighted together according to the expected value of the upcoming regime. The regression tree methods identify macro-economic states in which the risk premia perform well or poorly and use these results to allocate between risk premia strategies. The regime switching method proves to be mostly unimpressive on its own, but its results are improved by investing less in risky assets as the probability of an upcoming turbulent regime becomes larger. This proves to be highly effective for all time periods and for both data sources. The regression tree method proves the most effective under the assumption that all macro-economic data are known in the same month for which they are valid. Since this is an unrealistic assumption, the best method seems to be to evaluate the performance of the risk premia strategy using macro-economic data from the previous quarter.
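
    A sketch of the regime-splitting step, using hmmlearn's Baum-Welch (EM) fitting as a stand-in for the thesis's own implementation; the synthetic returns and the volatility blending are illustrative assumptions:

```python
# Two-regime split of returns via a Gaussian HMM (requires `pip install hmmlearn`).
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(42)
# Synthetic daily returns: a calm regime followed by a turbulent regime.
returns = np.concatenate([rng.normal(0.0005, 0.005, 750),
                          rng.normal(-0.001, 0.02, 250)]).reshape(-1, 1)

model = GaussianHMM(n_components=2, covariance_type="full", n_iter=200)
model.fit(returns)                                  # Baum-Welch (EM)

# Per-regime risk estimate (here simply volatility), one per hidden state.
vols = np.sqrt(model.covars_.ravel())

# Probability of each regime one step ahead, given the data so far.
state_probs = model.predict_proba(returns)[-1]
next_probs = state_probs @ model.transmat_

# Blend the regime risks by the expected upcoming regime (illustrative).
blended_vol = next_probs @ vols
print("regime vols:", vols, "blended vol:", blended_vol)
```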

  • 105.
    Dufour Partanen, Bianca
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    On the Valuation of Contingent Convertibles (CoCos): Analytically Tractable First Passage Time Model for Pricing AT1 CoCos (2016). Independent thesis, Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis
    Abstract [en]

    Contingent Convertibles (CoCos) are a new type of hybrid debt instrument characterized by forced equity conversion or write-down under a specified trigger event, usually indicating a state of near non-viability. CoCos in the Additional Tier 1 capital category have additional features such as possible coupon cancellation. In this thesis, the structure of CoCos is presented and different pricing approaches are introduced. A special focus is put on structural models with the Analytically Tractable First Passage Time (AT1P) model and its extensions. Two models are applied to the write-down CoCo issued by Svenska Handelsbanken, starting with the equity derivative model and followed by the AT1P model.

  • 106.
    Edberg, Erik
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Prediktering av VD-löner i svenska onoterade aktiebolag [Prediction of CEO salaries in Swedish unlisted limited companies] (2015). Independent thesis, Basic level (degree of Bachelor), 10 credits / 15 HE credits. Student thesis
    Abstract [en]

    In contrast to unionized labour, the CEO's remuneration is set individually and independently of union agreements. The company board determines the remuneration. It is based on an estimated valuation of variables such as job characteristics, personal qualities of the CEO, the market valuation of similar tasks, and the availability of possible candidates.

    The purpose of this thesis is to create a model to predict the market remuneration for a current or forthcoming CEO. Further, the compensation structure will be examined, aiming to find the compensation structure that maximizes the CEO’s performance.

    This thesis shows that it is possible to predict the remuneration of employed CEOs in unlisted corporations with an explanation rate of 64 percent. The variance is explained by six covariates: four covariates representing job characteristics and two related to company performance. The highest explanation rate is given by the covariate turnover, which explains just below 40 percent of the remuneration variance.

    This study shows that the optimal compensation structure differs between companies. Further, recommendations are given for what the variable remuneration should be based on in order to maximize the CEO's performance.

  • 107.
    Ekeberg, Magnus
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Detecting contacts in protein folds by solving the inverse Potts problem - a pseudolikelihood approach (2012). Independent thesis, Advanced level (degree of Master (One Year)), 20 credits / 30 HE credits. Student thesis
    Abstract [en]

    Spatially proximate amino acid positions in a protein tend to co-evolve, so a protein's 3D-structure leaves an echo of correlations in the evolutionary record. Reverse engineering 3D-structures from such correlations is an open problem in structural biology, pursued with increasing vigor as new protein sequences continue to fill the data banks. Within this task lies a statistical stumbling block, rooted in the following: correlation between two amino acid positions can arise from firsthand interaction, but also be network-propagated via intermediate positions; observed correlation is not enough to guarantee proximity. The remedy, and the focus of this thesis, is to mathematically untangle the crisscross of correlations and extract direct interactions, which enables a clean depiction of co-evolution among the positions.

    Recently, analysts have used maximum-entropy modeling to recast this cause-and-effect puzzle as parameter learning in a Potts model (a kind of Markov random field). Unfortunately, a computationally expensive partition function puts this out of reach of straightforward maximum-likelihood estimation. Mean-field approximations have been used, but an arsenal of other approximate schemes exists. In this work, we re-implement an existing contact-detection procedure and replace its mean-field calculations with pseudo-likelihood maximization. We then feed both routines real protein data and highlight differences between their respective outputs. Our new program seems to offer a systematic boost in detection accuracy.
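
    A sketch of the pseudolikelihood idea: replace the intractable full likelihood with the product of conditionals P(s_i | s_rest). For binary (Ising) spins this reduces to one logistic regression per position; the 21-letter Potts alphabet of the thesis is the multinomial analogue. The random data here carry no real contact signal:

```python
# Nodewise logistic regression as Ising-model pseudolikelihood maximization.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_seq, n_pos = 500, 12
spins = rng.choice([-1, 1], size=(n_seq, n_pos))   # stand-in "alignment"

couplings = np.zeros((n_pos, n_pos))
for i in range(n_pos):
    X = np.delete(spins, i, axis=1)                # all other positions
    clf = LogisticRegression(C=1.0).fit(X, spins[:, i])
    couplings[i, np.arange(n_pos) != i] = clf.coef_[0]

# Symmetrize (each pair is estimated twice) and rank pairs by strength,
# mimicking the contact-detection scoring step. With random data these
# scores are pure noise; real alignments would show structure.
J = 0.5 * (couplings + couplings.T)
i, j = np.unravel_index(np.argmax(np.abs(J)), J.shape)
print(f"strongest inferred interaction: positions {i} and {j}")
```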

  • 108.
    El Menouni, Zakaria
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Pricing Interest Rate Derivatives in the Multi-Curve Framework with a Stochastic Basis (2015). Independent thesis, Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis
    Abstract [en]

    The financial crisis of 2007/2008 has brought about a lot of changes in the interest rate market in particular, as it has forced practitioners to review and modify the former pricing procedures and methodologies. As a consequence, the Multi-Curve framework has been adopted to deal with the inconsistencies of the framework used so far, namely the single-curve method.

    We propose to study this new framework in detail by focusing on a set of interest rate derivatives such as deposits, swaps and caplets. We then explore a stochastic approach to model the Libor-OIS basis spread, which has appeared since the beginning of the crisis and is now a quantity of interest to which many researchers dedicate their work (F. Mercurio, M. Bianchetti and others).

    A discussion follows this study to shed light on the challenges and difficulties related to the modeling of the basis spread.

  • 109.
    Eliasson, Daniel
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Game contingent claims (2012). Independent thesis, Advanced level (degree of Master (One Year)), 20 credits / 30 HE credits. Student thesis
    Abstract [en]

    Game contingent claims (GCCs), as introduced by Kifer (2000), are a generalization of American contingent claims where the writer has the opportunity to terminate the contract, and must then pay the intrinsic option value plus a penalty. In complete markets, GCCs are priced using no-arbitrage arguments as the value of a zero-sum stochastic game of the type described in Dynkin (1969). In incomplete markets, the neutral pricing approach of Kallsen and Kühn (2004) can be used.

    In Part I of this thesis, we introduce GCCs and their pricing, and also cover some basics of mathematical finance. In Part II, we present a new algorithm for valuing game contingent claims. This algorithm generalises the least-squares Monte-Carlo method for pricing American options of Longstaff and Schwartz (2001). Convergence proofs are obtained, and the algorithm is tested against certain GCCs. A more efficient algorithm is derived from the first one using the computational complexity analysis technique of Chen and Shen (2003).

    The algorithms were found to give good results with reasonable time requirements. Reference implementations of both algorithms are available for download from the author's Github page: https://github.com/del/Game-option-valuation-library

  • 110.
    Engsner, Hampus
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    A PIT-Based Approach to Validation of Electricity Spot Price Models (2015). Independent thesis, Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis
    Abstract [en]

    The modeling of electricity spot prices is still in its early stages, with various different competing models being proposed by different researchers. This makes model evaluation and comparison research an important area, for practitioners and researchers alike. However, there is a distinct lack in the literature of consensus regarding model evaluation tools to assess model validity, with different researchers using different methods of varying suitability as validation methods. In this thesis the current landscape of electricity spot price models and how they are currently evaluated is mapped out. Then, as the main contribution this research aims to make, a general and flexible framework for model validation is proposed, based on the Probability Integral Transform (PIT). The probability integral transform, which can be seen as a generalization of analyzing residuals in simple time series and regression models, transforms the realizations of a time series into independent and identically distributed U(0,1) variables using the conditional distributions of the time series. Testing model validity is with this method reduced to testing if the PIT values are independent and identically distributed U(0,1) variables. The thesis is concluded by testing spot price models of varying validity according to previous research using this framework against actual spot price data. These empirical tests suggest that PIT-based model testing does indeed point us toward the more suitable models, with especially unsuitable models being rejected by a large margin.
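
    A minimal sketch of the PIT transformation and a uniformity check (the full framework also tests independence, omitted here); the AR(1) stand-in for a spot price model is an assumption for illustration:

```python
# PIT-based validation: push each observation through the model's conditional
# CDF and test the resulting values for U(0,1)-ness.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
T = 500
y = np.zeros(T)
for t in range(1, T):                  # data generated by an AR(1), phi = 0.7
    y[t] = 0.7 * y[t-1] + rng.normal()

def pit_values(y, phi):
    """Conditional CDF of y[t] given y[t-1] under an AR(1) with unit noise."""
    return stats.norm.cdf(y[1:], loc=phi * y[:-1], scale=1.0)

for phi in (0.7, 0.0):                 # correct vs. misspecified model
    u = pit_values(y, phi)
    ks = stats.kstest(u, "uniform")
    print(f"phi={phi}: KS p-value = {ks.pvalue:.3f}")
```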

  • 111.
    Eriksson, André
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Anomaly Detection in Machine-Generated Data: A Structured Approach (2013). Independent thesis, Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis
    Abstract [en]

    Anomaly detection is an important issue in data mining and analysis, with applications in almost every area in science, technology and business that involves data collection. The development of general anomaly detection techniques can therefore have a large impact on data analysis across many domains. In spite of this, little work has been done to consolidate the different approaches to the subject.

    In this report, this deficiency is addressed in the target domain of temporal machine-generated data. To this end, new theory for comparing and reasoning about anomaly detection tasks and methods is introduced, which facilitates a problem-oriented rather than a method-oriented approach to the subject. Using this theory as a basis, the possible approaches to anomaly detection in the target domain are discussed, and a set of interesting anomaly detection tasks is highlighted.

    One of these tasks is selected for further study: the detection of subsequences that are anomalous with regard to their context within long univariate real-valued sequences. A framework for relating methods derived from this task is developed and is used to derive new methods and an algorithm for solving a large class of derived problems. Finally, a software implementation of this framework, along with a set of evaluation utilities, is discussed and demonstrated.

  • 112. Eriksson, Kimmo
    et al.
    Jansson, Fredrik
    Sjöstrand, Jonas
    Stockholms universitet.
    Bentley's conjecture on popularity toplist turnover under random copying (2010). In: The Ramanujan Journal, ISSN 1382-4090, E-ISSN 1572-9303, Vol. 23, p. 371-396. Article in journal (Refereed)
    Abstract [en]

    Bentley et al. studied the turnover rate in popularity toplists in a 'random copying' model of cultural evolution. Based on simulations of a model with population size N, list length ℓ and invention rate μ, they conjectured a remarkably simple formula for the turnover rate: ℓ√μ. Here we study an overlapping generations version of the random copying model, which can be interpreted as a random walk on the integer partitions of the population size. In this model we show that the conjectured formula, after a slight correction, holds asymptotically.
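
    A toy simulation of the overlapping-generations random copying model, comparing the observed toplist turnover with the conjectured ℓ√μ (the paper's corrected formula differs slightly):

```python
# Random copying with invention; track turnover of the top-ell list.
import numpy as np
from collections import Counter

rng = np.random.default_rng(0)
N, ell, mu = 1000, 10, 0.01
pop = list(range(N))                   # everyone starts with its own variant
next_label = N

def toplist(population, ell):
    return set(v for v, _ in Counter(population).most_common(ell))

prev = toplist(pop, ell)
turnovers = []
for gen in range(200):
    for _ in range(N):                 # one overlapping "generation"
        i = rng.integers(N)
        if rng.random() < mu:
            pop[i] = next_label        # invention of a new variant
            next_label += 1
        else:
            pop[i] = pop[rng.integers(N)]   # copy a random individual
    cur = toplist(pop, ell)
    turnovers.append(len(cur - prev))
    prev = cur

print("simulated turnover per generation:", np.mean(turnovers[50:]))
print("conjectured ell * sqrt(mu):       ", ell * np.sqrt(mu))
```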

  • 113.
    Eriksson, Kimmo
    et al.
    Mälardalen University, School of Education, Culture and Communication.
    Sjöstrand, Jonas
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.).
    Limiting shapes of birth-and-death processes on Young diagrams (2012). In: Advances in Applied Mathematics, ISSN 0196-8858, E-ISSN 1090-2074, Vol. 48, no 4, p. 575-602. Article in journal (Refereed)
    Abstract [en]

    We consider a family of birth processes and birth-and-death processes on Young diagrams of integer partitions of n. This family incorporates three famous models from very different fields: Rost's totally asymmetric particle model (in discrete time), Simon's urban growth model, and Moran's infinite alleles model. We study stationary distributions and limit shapes as n tends to infinity, and present a number of results and conjectures.

  • 114. Ezquiaga, J. M.
    et al.
    Zumalacárregui, Miguel
    KTH, Centres, Nordic Institute for Theoretical Physics NORDITA.
    Dark Energy after GW170817: Dead Ends and the Road Ahead (2017). In: Physical Review Letters, ISSN 0031-9007, E-ISSN 1079-7114, Vol. 119, no 25, article id 251304. Article in journal (Refereed)
    Abstract [en]

    Multimessenger gravitational-wave (GW) astronomy has commenced with the detection of the binary neutron star merger GW170817 and its associated electromagnetic counterparts. The almost coincident observation of both signals places an exquisite bound on the GW speed, |c_g/c - 1| ≤ 5×10^-16. We use this result to probe the nature of dark energy (DE), showing that a large class of scalar-tensor theories and DE models are highly disfavored. As an example we consider the covariant Galileon, a cosmologically viable, well motivated gravity theory which predicts a variable GW speed at low redshift. Our results eliminate any late-universe application of these models, as well as their Horndeski and most of their beyond-Horndeski generalizations. Three alternatives (and their combinations) emerge as the only possible scalar-tensor DE models: (1) restricting Horndeski's action to its simplest terms, (2) applying a conformal transformation which preserves the causal structure, and (3) compensating the different terms that modify the GW speed (to be robust, the compensation has to be independent of the background on which GWs propagate). Our conclusions extend to any other gravity theory predicting varying c_g, such as Einstein-Aether, Hořava gravity, Generalized Proca, tensor-vector-scalar gravity (TeVeS), and other MOND-like gravities.

  • 115.
    Forsman, Mikael
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    A Model Implementation of Incremental Risk Charge (2012). Independent thesis, Advanced level (degree of Master (One Year)), 20 credits / 30 HE credits. Student thesis
    Abstract [en]

    In 2009 the Basel Committee on Banking Supervision released the final guidelines for computing capital for the Incremental Risk Charge, which is a complement to the traditional Value at Risk intended to measure the migration risk and the default risk in the trading book. Before Basel III, banks will have to develop their own Incremental Risk Charge model following these guidelines. The development of such a model, which computes the capital charge for a portfolio of corporate bonds, is described in this thesis. Essential input parameters like the credit ratings of the underlying issuers, credit spreads, recovery rates at default, liquidity horizons and correlations among the positions in the portfolio are discussed. Also required in the model is the transition matrix with probabilities of migrating between different credit states, which is estimated from historical data from Moody's rating institute. Several sensitivity analyses and stress tests are then made by generating different scenarios and running them in the model, and the results of these tests are compared to a base case. As it turns out, the default risk accounts for the largest part of the Incremental Risk Charge.

  • 116. Fosgerau, M.
    et al.
    Lindberg, P. O.
    Mattsson, Lars-Göran
    KTH, School of Architecture and the Built Environment (ABE), Transport Science. KTH, School of Architecture and the Built Environment (ABE), Centres, Centre for Transport Studies, CTS.
    Weibull, Jörgen
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.). Stockholm School of Economics, Sweden.
    A note on the invariance of the distribution of the maximum (2018). In: Journal of Mathematical Economics, ISSN 0304-4068, E-ISSN 1873-1538, Vol. 74, p. 56-61. Article in journal (Refereed)
    Abstract [en]

    Many models in economics involve discrete choices where a decision-maker selects the best alternative from a finite set. Viewing the array of values of the alternatives as a random vector, the decision-maker draws a realization and chooses the alternative with the highest value. The analyst is then interested in the choice probabilities and in the value of the best alternative. The random vector has the invariance property if the distribution of the value of a specific alternative, conditional on that alternative being chosen, is the same, regardless of which alternative is considered. This note shows that the invariance property holds if and only if the marginal distributions of the random components are positive powers of each other, even when allowing for quite general statistical dependence among the random components. We illustrate the analytical power of the invariance property by way of examples.
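
    A quick Monte Carlo check of the invariance property in the classical Gumbel (logit) case, where the marginals are positive powers of each other; the parameters are illustrative:

```python
# With additive i.i.d. Gumbel noise, the distribution of the maximum,
# conditional on which alternative achieves it, is the same for all
# alternatives, even though the choice probabilities differ.
import numpy as np

rng = np.random.default_rng(0)
n, draws = 3, 200_000
means = np.array([0.0, 0.5, 1.0])        # alternative-specific values
u = means + rng.gumbel(size=(draws, n))  # random utilities

best = u.argmax(axis=1)
vmax = u.max(axis=1)

for i in range(n):
    sel = best == i
    print(f"alt {i}: chosen {sel.mean():.3f} of the time, "
          f"E[max | chosen] = {vmax[sel].mean():.3f}")   # should coincide
```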

  • 117.
    Fransson, Viktor
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Graphical lasso for covariance structure learning in the high dimensional setting (2015). Independent thesis, Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis
    Abstract [en]

    This thesis considers the estimation of undirected Gaussian graphical models especially in the high dimensional setting where the true observations are assumed to be non-Gaussian distributed.

    The first aim is to present and compare the performance of existing Gaussian graphical model estimation methods. Furthermore, since the models rely heavily on the normality assumption, various methods for relaxing the normal assumption are presented. In addition to the existing methods, a modified version of the joint graphical lasso method is introduced which capitalizes on the strengths of the community Bayes method. The community Bayes method is used to partition the features (or variables) of datasets consisting of several classes into several communities which are estimated to be mutually independent within each class, which allows the calculations performed by the joint graphical lasso method to be split into several smaller parts. The method is also inspired by the cluster graphical lasso and is applicable to both Gaussian and non-Gaussian data, provided that the normal assumption is relaxed.

    Results show that the introduced cluster joint graphical lasso method outperforms competing methods, producing graphical models which are easier to comprehend due to the added information obtained from the clustering step of the method. The cluster joint graphical lasso is applied to a real dataset consisting of p = 12582 features, which resulted in a computational gain of a factor of 35 compared to the competing method; this is very significant when analysing large datasets. The method also allows for parallelization, where computations can be spread across several computers, greatly increasing the computational efficiency.
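
    A minimal graphical-lasso sketch with scikit-learn on synthetic Gaussian data; the nonparanormal relaxation and the community/clustering steps of the thesis are omitted:

```python
# Sparse precision-matrix estimation with the graphical lasso.
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)
p = 5
# Ground truth: a chain graph, i.e. a tridiagonal precision matrix.
prec = np.eye(p) + 0.4 * (np.eye(p, k=1) + np.eye(p, k=-1))
cov = np.linalg.inv(prec)
X = rng.multivariate_normal(np.zeros(p), cov, size=400)

model = GraphicalLasso(alpha=0.05).fit(X)
# Nonzero off-diagonal entries of the estimated precision = inferred edges.
edges = (np.abs(model.precision_) > 1e-3) & ~np.eye(p, dtype=bool)
print("estimated adjacency:\n", edges.astype(int))
```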

  • 118.
    Fröling, Anton
    et al.
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Lahdo, Sandy
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    A smoother and more up-to-date development of the income pension (2016). Independent thesis, Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis
    Abstract [en]

    For an apparatus as big as the pension system, the financial stability is essential. An important feature in the existing pension system is the balance mechanism, which secures the stability of the system. The balance ratio is obtained by dividing the assets by the liabilities. When this ratio drops below 1.0000, it triggers the so-called automatic balancing.

    While the existing pension system has achieved its goal of being financially stable, it has become clear that the indexation of the pensions during balancing periods has properties that are not optimal. In a short-term perspective the income pension system is exposed to the risk of reacting with a lag, or reacting unnecessarily strongly. This gave rise to a new legislative proposal, issued by the government. The goal of the proposal is to obtain a smoother and more up-to-date development of the income pension, i.e. a shorter lag period, without jeopardizing the financial stability. In addition to this, it is also desirable to simplify and improve the existing calculation methods. In order to compare the existing calculation methods in the pension system with the new legislative proposal, a simplified model of the existing pension system and a modified version of it are created.

    The results of this study show that the new legislative proposal decreases the volatility in the pensions and that it avoids the deepest valleys in the balance ratio. The development of the pension disbursements in the new system has a higher correlation with the development of the average pension-qualifying income than in the current system. Moreover, the results show that the new system has a shorter lag period, which makes the income pension system more up-to-date with the current economic and demographic situation.

    Financial stability is still maintained, and the new system also handles variations in the inflation better than the current system.

  • 119.
    Gallais, Arnaud
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    CPPI Structures on Funds Derivatives (2011). Independent thesis, Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis
    Abstract [en]

    With the ever-increasing complexity of financial markets and financial products, many investors now choose to benefit from a manager's expertise by investing in a fund. This has fueled a rapid growth of the fund industry over the past decades, and the recent emergence of complex derivatives products written on underlying funds. The diversity (hedge funds, mutual funds, funds of funds, managed accounts…) and the particularities (liquidity, specific risks) of funds call for adapted models and suited risk management. This thesis aims at understanding the issues and difficulties met when dealing with such products. In particular, we will deal to a great extent with CPPI (Constant Proportion Portfolio Insurance) structures written on funds, which combine the specificities of funds with the particularities of such structures. Correctly assessing the corresponding market risks is a challenging issue, and is the subject of many investigations.

  • 120.
    Georgelis, Nikos
    et al.
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Nyberg, Mikael
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    A Scenario Based Allocation Model Using Entropy Pooling for Computing the Scenario Probabilities (2013). Independent thesis, Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis
    Abstract [en]

    We introduce a scenario based allocation model (SBAM) that uses entropy pooling for computing scenario probabilities. Compared to most other models that allow the investor to blend historical data with subjective views about the future, the SBAM does not require the investor to quantify a level of confidence in the subjective views.

     A quantitative test is performed on a simulated systematic fund offered by the fund company Informed Portfolio Management in Stockholm, Sweden. The simulated fund under study consists of four individual systematic trading strategies and the test is simulated on a monthly basis during the years 1986-2010.

    We study how the selection of views might affect the SBAM portfolios, creating three systematic views and combining them in different variations to create seven SBAM portfolios. We also compare how the size of the sample data affects the results.

     Furthermore, the SBAM is compared to more common allocation methods, namely an equally weighted portfolio and a portfolio optimization based only on historical data.

     We find that the SBAM portfolios produced higher annual returns and information ratio than the equally weighted portfolio or the portfolio optimized only on historical data.
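
    A minimal entropy-pooling sketch: scenario probabilities are tilted away from a uniform prior just enough (in relative entropy) to match a single subjective view on the mean, with no confidence level required, as noted above; the scenario data and the view are invented:

```python
# Entropy pooling via its one-dimensional dual: the posterior is an
# exponential tilt p ∝ q * exp(lam * r), with lam chosen to match the view.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
scenarios = rng.normal(0.01, 0.05, size=1000)    # historical return scenarios
q = np.full(scenarios.size, 1.0 / scenarios.size)
view_mean = 0.00                                 # view: zero expected return

def dual(lam):
    # Log-partition minus lam * b; its minimizer enforces E_p[r] = view_mean.
    w = q * np.exp(lam * scenarios)
    return np.log(w.sum()) - lam * view_mean

res = minimize(lambda l: dual(l[0]), x0=[0.0])
lam = res.x[0]
p = q * np.exp(lam * scenarios); p /= p.sum()

print("posterior mean:", p @ scenarios)          # ~ view_mean
print("effective sample size:", 1.0 / np.sum(p ** 2))
```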

  • 121.
    Giertz Jonsson, Fredrik
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Analysis and Optimization of a Portfolio of Catastrophe Bonds (2014). Independent thesis, Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis
    Abstract [en]

    This Master's thesis in mathematical statistics has two major purposes: (i) to model and measure the risk associated with a special type of reinsurance contract, the catastrophe bond, and (ii) to analyze and develop methods of portfolio optimization suitable for a portfolio of catastrophe bonds. Two pathways for modeling potential catastrophe bond losses are analyzed: one method directly modeling potential contract losses, and one method modeling the variables governing the contract losses. The first method is simple in its structure but has the disadvantage that a dependence structure between the losses of different contracts cannot be introduced in a simple and flexible way. The second modeling method uses a representation with a stochastic number of stochastic events, connected into a multivariate dependence structure using the theory of copulas.

    Results show that the choice of risk measure is of great importance when analyzing catastrophe bonds and their related risks. As an example, the measure Value at Risk often fails to capture the essence of catastrophe bond risk, which in turn means that portfolio optimization with respect to it might lead to a systematic obscuring of risk. Two coherent risk measures were investigated, the spectral risk measure and the Expected Shortfall measure. Both measures provide a good representation of the risk of a portfolio consisting of catastrophe bonds.

    This thesis extends and applies a well-known optimization method for Conditional Value at Risk to obtain a method of optimization for spectral risk measures. The optimization results show that Expected Shortfall optimization leads to portfolios that are advantageous at the specific point at which they are optimized but whose characteristics may be disadvantageous at other parts of the loss distribution. Portfolios optimized for the spectral risk measure were shown to possess good characteristics across the entire loss distribution. Optimization results were also compared to the popular mean-variance portfolio optimization approach. The comparison shows that the mean-variance approach handles the special distribution of catastrophe bond losses in an over-simplistic way, and that it severely lacks flexibility in focusing on different aspects of risk. The spectral risk measure optimization procedure was demonstrated to be the most flexible, and possibly the most appropriate, way to optimize a portfolio of catastrophe bonds.
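
    A sketch of the two coherent risk measures discussed above, computed on a simulated heavy-tailed loss sample; the exponential weight spectrum is one common choice, not necessarily the thesis's:

```python
# Expected Shortfall and a spectral risk measure on a toy loss sample.
import numpy as np

rng = np.random.default_rng(0)
losses = rng.pareto(2.5, size=100_000)        # heavy-tailed toy CAT losses

def expected_shortfall(losses, alpha=0.99):
    var = np.quantile(losses, alpha)
    return losses[losses >= var].mean()       # mean loss beyond VaR

def spectral_risk(losses, k=20.0):
    # Weighted average of sorted losses with an increasing, normalized
    # weight function phi(u) = k * exp(k*(u-1)) / (1 - exp(-k)).
    x = np.sort(losses)
    u = (np.arange(x.size) + 0.5) / x.size
    phi = k * np.exp(k * (u - 1.0)) / (1.0 - np.exp(-k))
    return np.mean(phi * x)                   # Riemann sum of the spectral integral

print("Expected Shortfall 99%:", expected_shortfall(losses))
print("spectral risk (k=20):  ", spectral_risk(losses))
```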

  • 122.
    Gobeljic, Persa
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Classification of Probability of Default and Rating Philosophies (2012). Independent thesis, Advanced level (degree of Master (One Year)), 20 credits / 30 HE credits. Student thesis
    Abstract [en]

    Basel II consists of international recommendations on banking regulations, mainly concerning how much capital banks and other financial institutions should be made to set aside in order to protect themselves from various types of risks. Implementing Basel II involves estimating risks; one of the main measurements is Probability of Default. Firm-specific and macroeconomic risks cause obligors to default. The two risk factors are separated in order to determine which of them affects the Probability of Default through the years. The aim of this thesis is to enable a separation of the risk variables in the structure of Probability of Default in order to classify the rating philosophy.

  • 123.
    Grandell, Jan
    et al.
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Schmidli, Hanspeter
    Ruin probabilities in a diffusion environment (2011). In: Journal of Applied Probability, ISSN 0021-9002, E-ISSN 1475-6072, Vol. 48A, p. 39-50. Article in journal (Refereed)
    Abstract [en]

    We consider an insurance model where the underlying point process is a Cox process. Using a martingale approach applied to diffusion processes, finite-time Lundberg inequalities are obtained. By change-of-measure techniques, Cramér-Lundberg approximations are derived.
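
    A Monte Carlo sketch of a finite-time ruin probability. For simplicity it uses the classical Cramér-Lundberg model with constant claim intensity and Exp(1) claims, not the Cox (diffusion-driven) intensity of the paper; the Lundberg bound is printed for comparison:

```python
# Finite-time ruin probability for a compound Poisson surplus process.
import numpy as np

rng = np.random.default_rng(0)

def ruin_probability(u, c, lam, horizon, n_paths=20_000):
    """P(surplus drops below 0 before `horizon`); Exp(1) claim sizes."""
    ruined = 0
    for _ in range(n_paths):
        t, x = 0.0, u
        while True:
            w = rng.exponential(1.0 / lam)       # next claim inter-arrival
            if t + w > horizon:
                break
            t += w
            x += c * w - rng.exponential(1.0)    # premiums in, claim out
            if x < 0:                            # ruin can only occur at claims
                ruined += 1
                break
    return ruined / n_paths

u, c, lam = 5.0, 1.2, 1.0
print("MC finite-time ruin prob:", ruin_probability(u, c, lam, horizon=100.0))
# Lundberg bound exp(-R*u); for Exp(1) claims R = 1 - lam/c.
print("Lundberg bound:", np.exp(-(1.0 - lam / c) * u))
```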

  • 124.
    Gruselius, Hanna
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Generative Models and Feature Extraction on Patient Images and Structure Data in Radiation Therapy (2018). Independent thesis, Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis
    Abstract [en]

    This Master's thesis focuses on generative models for medical patient data for radiation therapy. The objective of the project is to implement and investigate the characteristics of a Variational Autoencoder (VAE) applied to this diverse and versatile data. The questions this thesis aims to answer are: (i) whether the VAE can capture salient features of medical image data, (ii) whether these features can be used to compare similarity between patients, (iii) whether the VAE network can successfully reconstruct its input, and lastly (iv) whether the VAE can generate artificial data having a reasonable anatomical appearance. The experiments carried out conveyed that the VAE is a promising method for feature extraction, since it appeared to ascertain similarity between patient images. Moreover, the reconstruction of training inputs demonstrated that the method is capable of identifying and preserving anatomical details. Regarding the generative abilities, the artificial samples generally conveyed fairly realistic anatomical structures. Future work could be to investigate the VAE's ability to generalize, with respect to both the amount of data and probabilistic considerations and assumptions.

  • 125.
    Grönberg, Fredrik
    et al.
    KTH, School of Engineering Sciences (SCI), Physics, Physics of Medical Imaging.
    Danielsson, Mats
    KTH, School of Engineering Sciences (SCI), Physics, Physics of Medical Imaging.
    Sjölin, Martin
    KTH, School of Engineering Sciences (SCI), Physics, Physics of Medical Imaging.
    Count statistics of nonparalyzable photon-counting detectors with nonzero pulse length (2018). In: Medical physics (Lancaster), ISSN 0094-2405, Vol. 45, no 8, p. 3800-3811. Article in journal (Refereed)
    Abstract [en]

    Purpose: Photon-counting detectors are expected to be the next big step in the development of medical computed tomography (CT). Accurate modeling of the behavior of photon-counting detectors in both low and high count rate regimes is important for accurate image reconstruction and detector performance evaluations. The commonly used ideal nonparalyzable (delta pulse) model is built on crude assumptions that make it unsuitable for predicting the behavior of photon-counting detectors at high count rates. The aim of this work is to present an analytical count statistics model that better describes the behavior of photon-counting detectors with nonzero pulse length. Methods: An analytical statistical count distribution model for nonparalyzable detectors with nonzero pulse length is derived using tools from statistical analysis. To validate the model, a nonparalyzable photon-counting detector is simulated using Monte Carlo methods and compared against. Image performance metrics are computed using the Fisher information metric, and a comparison between the proposed model, approximations of the proposed model, and those made by the ideal nonparalyzable model is presented and analyzed. Results: It is shown that the presented model agrees well with the results from the Monte Carlo simulation and is stable for varying x-ray beam qualities. It is also shown that a simple Gaussian approximation of the distribution can be used to accurately model the behavior and performance of nonparalyzable detectors with nonzero pulse length. Furthermore, the comparison of performance metrics shows that the proposed model predicts a very different behavior than the ideal nonparalyzable detector model, suggesting that the proposed model can fill an important gap in the understanding of pileup effects. Conclusions: An analytical model for the count statistics of a nonparalyzable photon-counting detector with nonzero pulse length is presented. The model agrees well with results obtained from Monte Carlo simulations and can be used to improve, speed up and simplify modeling of photon-counting detectors.
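
    A sketch of the mean behaviour the paper refines: photons arriving while a nonparalyzable counter is busy (within the pulse length tau of an accepted count) are lost, and the registered rate approaches the classical formula m = n/(1 + n*tau); all rates here are made up:

```python
# Monte Carlo simulation of a nonparalyzable (fixed dead time) counter.
import numpy as np

rng = np.random.default_rng(0)
n_rate, tau, T = 5e6, 50e-9, 1e-2      # true rate (cps), pulse length (s), time (s)

arrivals = np.cumsum(rng.exponential(1.0 / n_rate, size=int(2 * n_rate * T)))
arrivals = arrivals[arrivals < T]       # Poisson arrivals on [0, T)

counted, dead_until = 0, -np.inf
for t in arrivals:
    if t >= dead_until:                 # detector idle: register the photon
        counted += 1
        dead_until = t + tau            # nonparalyzable: busy for a fixed tau
print("simulated registered rate:", counted / T)
print("formula n/(1 + n*tau):    ", n_rate / (1.0 + n_rate * tau))
```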

  • 126.
    Gudmundsson, Thorbjörn
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Markov chain Monte Carlo for rare-event simulation in heavy-tailed settings (2013). Licentiate thesis, monograph (Other academic)
  • 127.
    Gudmundsson, Thorbjörn
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Rare-event simulation with Markov chain Monte Carlo (2015). Doctoral thesis, comprehensive summary (Other academic)
    Abstract [en]

    Stochastic simulation is a popular method for computing probabilities or expectations where analytical answers are difficult to derive. It is well known that standard methods of simulation are inefficient for computing rare-event probabilities, and therefore more advanced methods are needed for such problems.

    This thesis presents a new method based on a Markov chain Monte Carlo (MCMC) algorithm to effectively compute the probability of a rare event. The conditional distribution of the underlying process given that the rare event occurs has the probability of the rare event as its normalising constant. Using the MCMC methodology a Markov chain is simulated, with that conditional distribution as its invariant distribution, and information about the normalising constant is extracted from its trajectory.

    In the first two papers of the thesis, the algorithm is described in full generality and applied to four problems of computing rare-event probabilities in the context of heavy-tailed distributions. The assumption of heavy tails allows us to propose distributions which approximate the conditional distribution conditioned on the rare event. The first problem considers a random walk Y1 + · · · + Yn exceeding a high threshold, where the increments Y are independent and identically distributed and heavy-tailed. The second problem is an extension of the first one to a heavy-tailed random sum Y1 + · · · + YN exceeding a high threshold, where the number of increments N is random and independent of Y1, Y2, . . .. The third problem considers the solution Xm to a stochastic recurrence equation, Xm = AmXm−1 + Bm, exceeding a high threshold, where the innovations B are independent and identically distributed and heavy-tailed and the multipliers A satisfy a moment condition. The fourth problem is closely related to the third and considers the ruin probability for an insurance company with risky investments.

    In the last two papers of this thesis, the algorithm is extended to the context of light-tailed distributions and applied to four problems. The light-tail assumption ensures the existence of a large deviation principle or Laplace principle, which in turn allows us to propose distributions which approximate the conditional distribution conditioned on the rare event. The first problem considers a random walk Y1 + · · · + Yn exceeding a high threshold, where the increments Y are independent and identically distributed and light-tailed. The second problem considers discrete-time Markov chains and the computation of general expectations of their sample paths related to rare events. The third problem extends the discrete-time setting to Markov chains in continuous time. The fourth problem is closely related to the third and considers a birth-and-death process with spatial intensities and the computation of first passage probabilities.

    An unbiased estimator of the reciprocal probability for each corresponding problem is constructed, with efficient rare-event properties. The algorithms are illustrated numerically and compared to existing importance sampling algorithms.
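
    A small illustration of the inefficiency of standard simulation that motivates the thesis: for a heavy-tailed (Pareto) random walk, the relative error of the plain Monte Carlo estimator of P(Y1 + · · · + Yn > c) grows as c grows. The thesis's MCMC estimator itself is not reproduced here:

```python
# Plain Monte Carlo for a heavy-tailed exceedance probability.
import numpy as np

rng = np.random.default_rng(0)
n, samples = 5, 200_000
Y = rng.pareto(1.5, size=(samples, n)) + 1.0   # Pareto(alpha=1.5), x_m = 1
S = Y.sum(axis=1)

for c in (50.0, 500.0, 5000.0):
    hits = S > c
    p_hat = hits.mean()
    # Relative error of the plain estimator: std(indicator)/(sqrt(m)*p).
    rel = hits.std() / (np.sqrt(samples) * p_hat) if p_hat > 0 else np.inf
    print(f"c = {c:>6.0f}: p_hat = {p_hat:.2e}, relative error = {rel:.2f}")
```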

  • 128.
    Gudmundsson, Thorbjörn
    et al.
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Hult, Henrik
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Markov chain Monte Carlo for computing rare-event probabilities for a heavy-tailed random walk (2014). In: Journal of Applied Probability, ISSN 0021-9002, E-ISSN 1475-6072, Vol. 51, no 2, p. 359-376. Article in journal (Refereed)
    Abstract [en]

    In this paper a method based on a Markov chain Monte Carlo (MCMC) algorithm is proposed to compute the probability of a rare event. The conditional distribution of the underlying process given that the rare event occurs has the probability of the rare event as its normalizing constant. Using the MCMC methodology, a Markov chain is simulated, with the aforementioned conditional distribution as its invariant distribution, and information about the normalizing constant is extracted from its trajectory. The algorithm is described in full generality and applied to the problem of computing the probability that a heavy-tailed random walk exceeds a high threshold. An unbiased estimator of the reciprocal probability is constructed whose normalized variance vanishes asymptotically. The algorithm is extended to random sums and its performance is illustrated numerically and compared to existing importance sampling algorithms.

  • 129.
    Gudmundsson, Thorbjörn
    et al.
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Hult, Henrik
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Markov chain Monte Carlo for rare-event simulation for light-tailed random walk. Manuscript (preprint) (Other academic)
  • 130.
    Gudmundsson, Thorbjörn
    et al.
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Hult, Henrik
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Markov chain Monte Carlo for rare-event simulation for Markov chains. Manuscript (preprint) (Other academic)
  • 131.
    Gudmundsson, Thorbjörn
    et al.
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Hult, Henrik
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Markov chain Monte Carlo for rare-event simulation for stochastic recurrence equations with heavy-tailed innovations. Manuscript (preprint) (Other academic)
  • 132.
    Gunnarsson, Simon
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Curve Building and Swap Pricing in the Presence of Collateral and Basis Spreads (2013). Independent thesis, Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis
    Abstract [en]

    The eruption of the financial crisis in 2008 caused an immense widening of both domestic and cross-currency basis spreads. Also, as a majority of all fixed income contracts are now collateralized, the funding cost of a financial institution may deviate substantially from the domestic Libor. In this thesis, a framework for pricing collateralized interest rate derivatives that accounts for the existence of non-negligible basis spreads is implemented. It is found that losses corresponding to several percent of the outstanding notional may arise as a consequence of not adapting to the new market conditions.
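
    A small illustration of the discounting effect described above: the same fixed cash-flow leg valued with single-curve (Libor) discounting versus OIS discounting for a collateralized trade; the flat curves and the 25 bp basis are invented:

```python
# Single-curve vs OIS discounting of a fixed swap leg (toy flat curves).
import numpy as np

notional, fixed_rate = 100e6, 0.03
times = np.arange(1, 11, dtype=float)       # annual payments on a 10y leg
libor, ois = 0.0300, 0.0275                 # flat rates; 25 bp basis (made up)

coupon = notional * fixed_rate              # fixed cash flow per period
pv_libor = coupon * np.exp(-libor * times).sum()
pv_ois = coupon * np.exp(-ois * times).sum()

print(f"PV, single-curve (Libor) discounting: {pv_libor:,.0f}")
print(f"PV, OIS (collateral) discounting:     {pv_ois:,.0f}")
print(f"difference, % of notional: {100 * (pv_ois - pv_libor) / notional:.2f}%")
```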

  • 133.
    Gunnvald, Patrik
    et al.
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Joelsson, Viktor
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Analysis of Hedging Strategies for Hydro Power on the Nordic Power Market (2015). Independent thesis, Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis
    Abstract [en]

    Hydro power is the largest source of electricity generation in the Nordic region today. This production is heavily dependent on the weather, since the weather dictates the availability and the amount of power to be produced. Vattenfall as a company has an incentive to avoid volatile revenue streams, as this facilitates economic planning and has a positive effect on its credit rating, and thus on its bottom line. Vattenfall is a large producer of hydro power with the ability to move the power market, which adds further complexity to the problem. In this thesis the authors develop new hedging strategies that hedge more efficiently. By efficiency is meant the same risk, or standard deviation, at a lower cost, or alternatively a lower risk for the same cost. In order to enable comparison and make claims about efficiency, a reference solution is developed that reflects Vattenfall's current hedging strategy. To achieve higher efficiency we focus on finding dynamic hedging strategies. First a prototype model is suggested to facilitate the construction of the solution methods and to determine whether a further investigation is worthwhile. As the results of this initial prototype model showed substantial room for efficiency improvement, a larger main model with parameters estimated from data is constructed, which captures the real-world scenario much better. Four different solution methods are developed and applied to this main model setup. The results are then compared to the reference strategy. We find that even though the efficiency gain was smaller than first expected from the prototype model results, using these new hedging strategies could reduce costs by 1.5-5%. Although the final choice of hedging strategy is down to the end user, we suggest the strategy called BW to reduce costs and improve efficiency. The paper also discusses, among other things, the solution methods and hedging strategies, the term optimality, and the impact of parameters in the model.

  • 134.
    Gustafsson, Alexander
    et al.
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Wogenius, Sebastian
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Modelling Apartment Prices with the Multiple Linear Regression Model (2014). Independent thesis, Basic level (degree of Bachelor), 10 credits / 15 HE credits. Student thesis
    Abstract [en]

    This thesis examines factors that are of most statistical significance for the sales prices of apartments in the Stockholm City Centre. Factors examined are address, area, balcony, construction year, elevator, fireplace, floor number, maisonette, monthly fee, penthouse and number of rooms. On the basis of this examination, a model for predicting prices of apartments is constructed. In order to evaluate how the factors influence the price, this thesis analyses sales statistics, and the mathematical method used is the multiple linear regression model. In a minor case study and literature review, included in this thesis, the relationship between proximity to public transport and the prices of apartments in Stockholm is examined.

    The result of this thesis states that it is possible to construct a model, from the factors analysed, which can predict the prices of apartments in Stockholm City Centre with a degree of explanation of 91% and a 95% confidence interval of two million SEK. Furthermore, a conclusion can be drawn that the model predicts lower-priced apartments more accurately. In the case study and literature review, the result indicates support for the hypothesis that proximity to public transport is positive for the price of an apartment. However, such a variable should be regarded with caution due to the purpose of the modelling, which differs between an individual application and a socio-economic application.
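
    A minimal multiple-linear-regression sketch in the spirit of the thesis, on synthetic data; the covariate names mirror the thesis but all coefficients and data are invented:

```python
# Multiple linear regression for apartment prices (synthetic example).
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 500
area = rng.uniform(20, 120, n)               # m^2
floor = rng.integers(0, 8, n)
balcony = rng.integers(0, 2, n)
monthly_fee = rng.uniform(1000, 6000, n)     # SEK

# Hypothetical price process with additive noise.
price = (90_000 * area + 50_000 * floor + 150_000 * balcony
         - 400 * monthly_fee + rng.normal(0, 300_000, n))

X = np.column_stack([area, floor, balcony, monthly_fee])
model = LinearRegression().fit(X, price)
print("R^2 (degree of explanation):", round(model.score(X, price), 3))
print("coefficients:", dict(zip(["area", "floor", "balcony", "fee"],
                                model.coef_.round(0))))
```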

  • 135. Gustavsson, J
    et al.
    Näsman, P
    KTH, School of Architecture and the Built Environment (ABE), Centres, Centre for Transport Studies, CTS. KTH, School of Architecture and the Built Environment (ABE), Transport Science, Transport and Location Analysis.
    Some information about the activities at the Department of Statistics, Stockholm University, Sweden (1990). Conference paper (Other academic)
  • 136. Haimi, Antti
    et al.
    Wennman, Aron
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematics (Div.).
    A central limit theorem for fluctuations in polyanalytic Ginibre ensembles (2017). In: International Mathematics Research Notices, ISSN 1073-7928, E-ISSN 1687-0247, Vol. rnx147. Article in journal (Refereed)
    Abstract [en]

    We study fluctuations of linear statistics in polyanalytic Ginibre ensembles, a family of point processes describing planar free fermions in a uniform magnetic field at higher Landau levels. Our main result is asymptotic normality of fluctuations, extending a result of Rider and Virág. As in the analytic case, the variance is composed of independent terms from the bulk and the boundary. Our methods rely on a structural formula for polyanalytic polynomial Bergman kernels which separates out the different pure q-analytic kernels corresponding to different Landau levels. The fluctuations with respect to these pure q-analytic Ginibre ensembles are also studied, and a central limit theorem is proved. The results suggest a stabilizing effect on the variance when the different Landau levels are combined together.

  • 137.
    Halberg, Oscar
    et al.
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Wärmlös Helmrich, Mattias
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Operational Risk Modeling: Addressing the Reporting Threshold Problem (2015). Independent thesis, Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis
    Abstract [en]

    External loss data are typically left-truncated at a reporting threshold. Ignoring this truncation level leads to biased capital charge estimations. This thesis addresses the challenges of recreating the truncated part of the distribution. By predicting the continuation of a probability density function, the unobserved body of an external operational risk loss distribution is estimated. The prediction is based on internally collected losses and the tail of the external loss distribution. Using a semiparametric approach to generate sets of internal losses and applying the Best Linear Unbiased Predictor results in an enriched external dataset that shares resemblance with the internal dataset. By avoiding any parametric assumptions, this study proposes a new and unique way to address the reporting threshold problem. Financial institutions will benefit from these findings, as they permit the use of the semiparametric approach developed by Bolancé et al. (2012) and thereby eliminate the well-known difficulty of determining the breaking point beyond which the tail domain is defined when using the Loss Distribution Approach. The main conclusion of this thesis is that predicting the continuation of a function using the Best Linear Unbiased Predictor can be successfully applied in an operational risk setting. This thesis has predicted the continuation of a probability density function, resulting in a full external loss distribution.
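
    A sketch of the reporting-threshold problem itself: fitting a severity distribution to left-truncated losses while ignoring the threshold biases the parameters. The truncated-likelihood fit below is a standard remedy shown for contrast; the thesis's BLUP-based reconstruction is not reproduced:

```python
# Left-truncation bias in operational loss severity fitting (toy lognormal).
import numpy as np
from scipy import stats
from scipy.optimize import minimize

rng = np.random.default_rng(0)
mu_true, sigma_true, H = 10.0, 2.0, 25_000.0      # H: reporting threshold
losses = rng.lognormal(mu_true, sigma_true, 20_000)
reported = losses[losses >= H]                     # what external data shows

naive_mu = np.log(reported).mean()                 # ignores the truncation

def negloglik(theta):
    mu, sigma = theta
    if sigma <= 0:
        return np.inf
    # Lognormal density renormalized to the observable region [H, inf).
    logpdf = stats.lognorm.logpdf(reported, s=sigma, scale=np.exp(mu))
    logtail = stats.lognorm.logsf(H, s=sigma, scale=np.exp(mu))
    return -(logpdf - logtail).sum()

fit = minimize(negloglik, x0=[naive_mu, 1.0], method="Nelder-Mead")
print("true mu:", mu_true)
print("naive fit (truncation ignored):", round(naive_mu, 2))
print("truncation-aware MLE (mu, sigma):", [round(v, 2) for v in fit.x])
```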

  • 138.
    Hallgren, Fredrik
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.).
    On Prediction and Filtering of Stock Index Returns (2011). Independent thesis, Advanced level (degree of Master (One Year)), 20 credits / 30 HE credits. Student thesis
  • 139.
    Hallgren, Jonas
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Forecasting Ranking in Harness Racing using Probabilities Induced by Expected Positions. Manuscript (preprint) (Other academic)
    Abstract [en]

    Ranked events are pivotal in many important AI applications such as Question Answering and recommendation systems. This paper studies ranked events in the setting of harness racing.

    For each horse there exists a probability distribution over its possible rankings. In the paper it is shown that a set of expected positions (and more generally, higher moments) for the horses induces this probability distribution.

    The main contribution of the paper is a method which extracts this induced probability distribution from a set of expected positions. An algorithm is proposed where the extraction of the induced distribution is given by the estimated expectations. MATLAB code is provided for the methodology.

    This approach gives freedom to model the horses in many different ways without the restrictions imposed by, for instance, logistic regression. To illustrate this point, we employ a neural network and ordinary ridge regression.

    The method is applied to predicting the distribution of the finishing positions for horses in harness racing. It outperforms both multinomial logistic regression and the market odds.

    The ease of use combined with fine results from the suggested approach constitutes a relevant addition to the increasingly important field of ranked events.

  • 140.
    Hallgren, Jonas
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Inference in Temporal Graphical Models2016Doctoral thesis, comprehensive summary (Other academic)
    Abstract [en]

    This thesis develops mathematical tools used to model and forecast different economic phenomena. The primary starting point is the temporal graphical model. Four main topics, all with applications in finance, are studied.

    The first two papers develop inference methods for networks of continuous time Markov processes, so-called continuous time Bayesian networks. Methodology for learning the structure of the network and for doing inference and simulation is developed. Further, models are developed for high-frequency foreign exchange data.

    The third paper models growth of gross domestic product (GDP), which is observed at a very low frequency. This application is special and poses several difficulties, which are dealt with in a novel way using a framework developed in the paper. The framework is motivated using a temporal graphical model. The method is evaluated on US GDP growth, with good results.

    The fourth paper studies inference in dynamic Bayesian networks using Monte Carlo methods. A new method for sampling random variables is proposed: the sample space is divided into subspaces, which allows the sampling to be carried out in parallel, with independent and distinct sampling methods on each subspace. The methodology is demonstrated on a volatility model for stock prices and on some toy examples, with promising results.

    The fifth paper develops an algorithm for learning the full distribution in a harness race, a ranked event. It is demonstrated that the proposed methodology outperforms logistic regression, its main competitor. It also outperforms the market odds in terms of accuracy.
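
    The summary does not spell out the fourth paper's sampling construction, so the following sketch is only a classical stand-in for the idea of dividing the sample space: the real line is split into two subspaces that are sampled independently (and could be sampled in parallel), then recombined with the subspace probabilities as weights.

        # Hypothetical sketch of sampling by partitioning the sample space, in the
        # spirit of the fourth paper's summary (the thesis's actual construction
        # is not reproduced here).
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        target = stats.norm()          # stand-in target distribution
        c = 1.0                        # split point defining the two subspaces
        w_left, w_right = target.cdf(c), target.sf(c)

        def sample_left(n):
            """Sample X | X <= c by inverse CDF restricted to the left subspace."""
            u = rng.uniform(0.0, target.cdf(c), size=n)
            return target.ppf(u)

        def sample_right(n):
            """Sample X | X > c by inverse CDF restricted to the right subspace."""
            u = rng.uniform(target.cdf(c), 1.0, size=n)
            return target.ppf(u)

        # The two samplers are independent and could run in parallel.
        n = 100_000
        xs_l, xs_r = sample_left(n), sample_right(n)

        # Stratified estimate of E[f(X)] recombines the subspace estimates.
        f = lambda x: x ** 2
        estimate = w_left * f(xs_l).mean() + w_right * f(xs_r).mean()
        print(estimate)   # close to E[X^2] = 1 for a standard normal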

  • 141.
    Hallgren, Jonas
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Nowcasting with dynamic maskingManuscript (preprint) (Other academic)
    Abstract [en]

    Nowcasting consists of tracking GDP by focusing on the data flow of timely available releases. The essential features of a nowcasting data set are differing sampling frequencies and the ragged edge: the differing patterns of missing observations at the end of the sample due to the non-synchronicity of data publications. Various econometric frameworks have been proposed to deal with these characteristics. Over a sequence of subsequent nowcast occasions, the models are traditionally re-estimated, or updated, on an expanding dataset as more and more data become available. This paper proposes to take the ragged-edge structure into account when estimating a nowcast model: instead of using all available historical data, the historical data are first masked so as to reflect the pattern of data availability at the specific nowcast occasion. Since each nowcast occasion exhibits its own ragged-edge structure, the model is re-estimated, or recalibrated, at each juncture using the accompanying mask, hence dynamic masking. Dynamic masking thus tailors the model to the specific nowcast occasion. It is shown how this tailoring improves precision by running ridge regressions with and without masking in a real-time nowcasting back-test.

    The masking approach is motivated by theory and demonstrated on real data, where it surpasses the dynamic factor model in back-tests. Dynamic masking offers ease of implementation, a solid theoretical foundation, flexibility in modeling, and encouraging results; we therefore consider it a relevant addition to the nowcasting methodology.
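
    A minimal sketch of the masking idea follows (illustrative names and synthetic data; the paper's estimator is not reproduced, and the ragged edge is simplified here to dropping whole unpublished series rather than masking recent observations within series):

        # Hedged sketch of dynamic masking: before estimating the nowcast model,
        # historical data are masked to mimic the availability pattern of the
        # current nowcast occasion, so the model is tailored to that occasion.
        import numpy as np

        def ridge_fit(X, y, lam):
            """Closed-form ridge regression coefficients."""
            k = X.shape[1]
            return np.linalg.solve(X.T @ X + lam * np.eye(k), X.T @ y)

        def dynamic_masking_nowcast(X_hist, y_hist, x_now, available, lam=1.0):
            """Nowcast y given only the indicators that are published right now.

            X_hist:    (T, k) complete historical indicator panel
            y_hist:    (T,)   historical GDP growth
            x_now:     (k,)   current-occasion indicators (NaN where unpublished)
            available: (k,)   boolean mask of which indicators are published
            """
            # Mask history so it reflects today's ragged edge, then re-estimate.
            X_masked = X_hist[:, available]
            beta = ridge_fit(X_masked, y_hist, lam)
            return x_now[available] @ beta

        # Tiny synthetic example.
        rng = np.random.default_rng(3)
        T, k = 120, 6
        X = rng.normal(size=(T, k))
        y = X @ rng.normal(size=k) + 0.1 * rng.normal(size=T)
        mask = np.array([True, True, False, True, False, True])  # two series not yet out
        x_now = np.where(mask, rng.normal(size=k), np.nan)
        print(dynamic_masking_nowcast(X, y, x_now, mask))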

  • 142.
    Hallgren, Jonas
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Structure Learning and Mixed Radix representation in Continuous Time Bayesian NetworksManuscript (preprint) (Other academic)
    Abstract [en]

    Continuous time Bayesian networks (CTBNs) are graphical representations of the dependence structure between continuous time random processes with finite state spaces. We propose a method for learning the structure of a CTBN using a causality measure based on the Kullback-Leibler divergence. We introduce the causality matrix, which can be seen as a generalized version of the covariance matrix. We also give a mixed-radix representation of the process that greatly facilitates learning and simulation. A new graphical model for tick-by-tick financial data is proposed and estimated. Our approach shows encouraging results both on the tick data and on a simulated example.
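
    As a toy illustration of a mixed-radix representation (not the thesis's construction), the joint state of several finite-state processes can be flattened to a single integer and recovered again, which is what makes tabulating and simulating the joint process convenient:

        # Mixed-radix encoding of a joint state (s_1, ..., s_d) with component
        # state-space sizes (n_1, ..., n_d), and its inverse.
        import numpy as np

        def encode(states, radices):
            """Map a tuple of component states to one mixed-radix integer."""
            code = 0
            for s, n in zip(states, radices):
                code = code * n + s
            return code

        def decode(code, radices):
            """Invert `encode`: recover the component states."""
            states = []
            for n in reversed(radices):
                code, s = divmod(code, n)
                states.append(s)
            return tuple(reversed(states))

        radices = (2, 3, 4)            # three processes with 2, 3 and 4 states
        for joint in [(0, 0, 0), (1, 2, 3), (0, 1, 2)]:
            c = encode(joint, radices)
            assert decode(c, radices) == joint
            # numpy's ravel_multi_index performs the same encoding
            assert c == np.ravel_multi_index(joint, radices)
        print("all joint states round-trip correctly")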

  • 143.
    Hamdi, Ali
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    On the Snell envelope approach to optimal switching and pricing Bermudan options2011Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    This thesis consists of two papers related to systems of Snell envelopes. The first paper uses a system of Snell envelopes to formulate the problem of two-modes optimal switching for the full balance sheet in finite horizon. This means that the switching problem is formulated in terms of trade-off strategies between expected profit and cost yields, which act as obstacles to each other. Existence of a minimal solution of this system is obtained by using an approximation scheme. Furthermore, the optimal switching strategies are fully characterized.

    The second paper uses the Snell envelope to formulate the fair price of Bermudan options. To evaluate this formulation of the price, the optimal stopping strategy for such a contract must be estimated, which may be done recursively if some method of estimating conditional expectations is available. The paper focuses on nonparametric estimation of such expectations, using regularization of a least-squares minimization with a Tikhonov-type smoothing imposed via the partial differential equation which characterizes the underlying price processes. This approach can hence be viewed as a combination of the Monte Carlo method and the PDE method for the estimation of conditional expectations. The estimation method turns out to be robust with regard to the size of the smoothing parameter.

  • 144.
    Hamdi, Ali
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    PDE-regularization for pricing multi-dimensional Bermudan options with Monte Carlo simulationManuscript (preprint) (Other academic)
    Abstract [en]

    This paper considers the problem of pricing multi-dimensional Bermudan derivatives using Monte Carlo simulation. A new method for computing conditional expectations is proposed, which, combined with the dynamic programming principle, provides a way of pricing the derivatives.

    The method is a non-parametric projection with regularization, where the regularization penalizes deviations from the PDE that the true conditional expectation satisfies. The point is that computing the norm of the PDE residual is less costly than solving the PDE, thus avoiding the curse of dimensionality.

    The method is shown to produce accurate numerical results in multi-dimensional settings, given a good choice of the regularization parameter. It is illustrated with the multi-dimensional Black-Scholes model and compared to the Longstaff-Schwartz approach.
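
    A minimal one-dimensional sketch of the structure of such an estimator, under simplifying assumptions: nodal values on a grid are fitted to noisy simulated responses while a discrete differential operator applied to them is penalized. A plain second-difference matrix stands in for the pricing PDE operator, so this shows only the shape of the computation, not the paper's method.

        # Hedged sketch of a PDE-regularized least-squares projection in 1D:
        # minimize ||A f - y||^2 + lam * ||L f||^2 over nodal values f, where A
        # interpolates the grid at the samples and L is a stand-in penalty operator.
        import numpy as np

        rng = np.random.default_rng(4)
        grid = np.linspace(0.0, 1.0, 101)            # grid for the nodal values
        x = rng.uniform(0.0, 1.0, size=2_000)        # simulated state samples
        y = np.sin(2 * np.pi * x) + 0.3 * rng.normal(size=x.size)   # noisy "payoffs"

        # Projection matrix A: linear interpolation of nodal values at the samples.
        idx = np.clip(np.searchsorted(grid, x) - 1, 0, grid.size - 2)
        w = (x - grid[idx]) / (grid[1] - grid[0])
        A = np.zeros((x.size, grid.size))
        A[np.arange(x.size), idx] = 1.0 - w
        A[np.arange(x.size), idx + 1] = w

        # Second-difference operator as a stand-in for the PDE penalty ||L f||^2.
        L = (np.diag(np.ones(grid.size - 1), 1) - 2 * np.eye(grid.size)
             + np.diag(np.ones(grid.size - 1), -1))[1:-1]

        lam = 1.0                                    # regularization parameter
        f = np.linalg.solve(A.T @ A + lam * L.T @ L, A.T @ y)
        print(np.abs(f - np.sin(2 * np.pi * grid)).max())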

  • 145.
    Hamdi, Ali
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Some aspects of optimal switching and pricing Bermudan options2013Doctoral thesis, comprehensive summary (Other academic)
    Abstract [en]

    This thesis consists of four papers that are all related to the Snell envelope. In the first paper, the Snell envelope is used as a formulation of a two-modes optimal switching problem. The obstacles are interconnected, take both profit and cost yields into account, and switching is based on both sides of the balance sheet. The main result is a proof of existence of a continuous minimal solution to a system of Snell envelopes, which fully characterizes the optimal switching strategy. A counter-example is provided to show that uniqueness does not hold.

    The second paper considers the problem of having a large number of production lines with two modes of production, high production and low production. As in the first paper, we consider both expected profit and cost yields, and switching is based on both sides of the balance sheet. The production lines are assumed to be interconnected through a coupling term, namely the average of the optimal expected yields. The corresponding system of Snell envelopes is highly complex, so we consider the aggregated yields, where a mean-field approximation is used for the coupling term. The main result is a proof of existence of a continuous minimal solution to a system of Snell envelopes, which fully characterizes the optimal switching strategy. Furthermore, existence and uniqueness are proven for the mean-field reflected backward stochastic differential equations (MF-RBSDEs) we consider, and a comparison theorem and a uniform bound for the MF-RBSDEs are provided.

    The third paper concerns the pricing of Bermudan-type options. The Snell envelope is used as a representation of the price, which is determined using Monte Carlo simulation combined with the dynamic programming principle. For this approach, it is necessary to estimate the conditional expectation of the future optimally exercised payoff. We formulate a projection on a grid, which is ill-posed due to overfitting, and regularize it with the PDE that characterizes the underlying process. The method is illustrated with numerical examples, where accurate results are demonstrated in one dimension.

    In the fourth paper, the idea of the third paper is extended to the multi-dimensional setting. This is necessary because in one dimension it is more efficient to solve the PDE than to use Monte Carlo simulation. We relax the use of a grid in the projection, and add local weights for stability. Using the multi-dimensional Black-Scholes model, the method is illustrated in settings ranging from one to 30 dimensions. The method is shown to produce accurate results in all examples, given a good choice of the regularization parameter.

  • 146.
    Hamdi, Ali
    et al.
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Marcus, Mårten
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Pricing Bermudan options: A nonparametric estimation approachManuscript (preprint) (Other academic)
    Abstract [en]

    A nonparametric alternative to the Longstaff-Schwartz estimation of conditional expectations is suggested for the pricing of Bermudan options. The method is based on regularization of a least-squares minimization, with a Tikhonov-type smoothing imposed via the partial differential equation which characterizes the underlying price processes. This approach can hence be viewed as a combination of the Monte Carlo method and the PDE method for the estimation of conditional expectations. The estimation method turns out to be robust with regard to the size of the smoothing parameter.

  • 147.
    Hansson, Fredrik
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    A pricing and performance study on auto-callable structured products2012Independent thesis Advanced level (degree of Master (One Year)), 20 credits / 30 HE creditsStudent thesis
    Abstract [en]

    We propose an algorithm to price and analyze the performance of auto-callable structured financial products. The algorithm uses Monte Carlo simulation to reproduce, as plausibly as possible, a future product. The model is then compared to other, previously presented models. A study of the different input parameters, together with a time-dependency study, is then performed to evaluate what one might expect when investing in these products. Numerical results show that the risks taken by the investor closely reflect the potential return of each product. When constructing these products for the near future, one must carefully evaluate the demand from investors, i.e., the level of risk that investors are willing to take.
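
    A hedged sketch of the Monte Carlo core of pricing an auto-callable note follows (the product terms, parameters, and algorithm below are illustrative, not the thesis's): simulate the underlying at the observation dates, autocall with an accrued coupon when the trigger is hit, and otherwise settle at maturity with downside exposure below a barrier.

        # Toy auto-callable note under geometric Brownian motion, priced by
        # Monte Carlo; all terms are made up for illustration.
        import numpy as np

        rng = np.random.default_rng(5)
        s0, r, sigma = 100.0, 0.02, 0.25      # spot, risk-free rate, volatility
        coupon, barrier = 0.08, 0.6           # annual autocall coupon, final barrier
        obs_times = np.array([1.0, 2.0, 3.0]) # yearly observation dates (maturity 3y)
        n_paths = 200_000

        # Simulate the underlying at the observation dates only.
        dt = np.diff(np.concatenate(([0.0], obs_times)))
        z = rng.standard_normal((n_paths, obs_times.size))
        log_s = np.log(s0) + np.cumsum((r - 0.5 * sigma**2) * dt
                                       + sigma * np.sqrt(dt) * z, axis=1)
        s = np.exp(log_s)

        payoff = np.zeros(n_paths)
        alive = np.ones(n_paths, dtype=bool)
        for j, t in enumerate(obs_times[:-1]):
            called = alive & (s[:, j] >= s0)             # autocall trigger
            payoff[called] = (1.0 + coupon * (j + 1)) * np.exp(-r * t)
            alive &= ~called

        # At maturity: coupon if above start, par if above barrier, else downside.
        sT = s[alive, -1]
        final = np.where(sT >= s0, 1.0 + coupon * obs_times.size,
                         np.where(sT >= barrier * s0, 1.0, sT / s0))
        payoff[alive] = final * np.exp(-r * obs_times[-1])
        print("price per unit notional:", payoff.mean())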

  • 148.
    Hansén, Rasmus
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Allocation of Risk Capital to Contracts in Catastrophe Reinsurance2013Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE creditsStudent thesis
    Abstract [en]

    This thesis is the result of a project aimed at developing a tool for allocation of risk capital in catastrophe excess-of-loss reinsurance. Allocation of risk capital is an important tool for measuring portfolio performance and optimizing the capital requirement. Here, two allocation rules are described and analyzed: Euler allocation and capital layer allocation. The rules are applied to two different portfolios. The main conclusion is that the two methods can be used together to get a better picture of how the dependence structure between the contracts affects the portfolio result. It is also illustrated how the RORAC of one of the portfolios can be increased by 1 % using the outcome of the analyses.
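
    For the Euler rule specifically, a short simulation sketch may help (synthetic losses, not the thesis's portfolios): under expected shortfall, the Euler contribution of contract i is E[X_i | S >= VaR_alpha(S)], and the contributions sum to the portfolio risk, i.e. the allocation is full.

        # Hedged sketch of Euler allocation under expected shortfall (ES), with
        # synthetic correlated losses standing in for reinsurance contracts.
        import numpy as np

        rng = np.random.default_rng(6)
        n_sim, n_contracts, alpha = 500_000, 4, 0.99

        # Correlated losses via a common shock plus idiosyncratic parts.
        common = rng.exponential(1.0, size=n_sim)[:, None]
        losses = 0.5 * common + rng.exponential(1.0, size=(n_sim, n_contracts))
        S = losses.sum(axis=1)                    # portfolio loss

        var = np.quantile(S, alpha)
        tail = S >= var
        es = S[tail].mean()                       # portfolio expected shortfall
        contrib = losses[tail].mean(axis=0)       # Euler contributions per contract

        # The contributions are a "full allocation": they sum to the portfolio ES.
        print("ES:", es, " allocations:", contrib, " sum:", contrib.sum())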

  • 149.
    Hededal Klincov, Lazar
    et al.
    KTH, School of Technology and Health (STH), Medical Engineering, Computer and Electronic Engineering.
    Symeri, Ali
    KTH, School of Technology and Health (STH), Medical Engineering, Computer and Electronic Engineering.
    Devising a Trend-break-detection Algorithm of stored Key Performance Indicators for Telecom Equipment2017Independent thesis Basic level (university diploma), 10 credits / 15 HE creditsStudent thesis
    Abstract [en]

    A problem prevalent among testers at Ericsson is that performance test results are continuously generated but not analyzed: the time between the occurrence of a problem and information about that occurrence is long and variable, because the manual analysis of log files is time-consuming and tedious. The requested solution is automation, with an algorithm that analyzes the performance data and issues a notification when problems occur. A binary classifier algorithm, based on statistical methods, was developed and evaluated as a solution to the stated problem. Evaluated on simulated data, the algorithm detected trend breaks with an accuracy of 97.54 %. Furthermore, a correlation analysis was carried out between performance and hardware to gain insight into how hardware configurations affect test runs.
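
    The abstract does not specify the classifier, so the following is only a classical statistical stand-in for trend-break detection on a KPI series: a two-sided CUSUM that alarms when the cumulative standardized drift exceeds a threshold.

        # Illustrative trend-break detector (not necessarily the thesis's
        # algorithm): two-sided CUSUM on standardized KPI values.
        import numpy as np

        def cusum_breaks(x, k=0.5, h=5.0):
            """Return indices where a two-sided CUSUM alarm fires.

            k: allowance (drift ignored below this size, in std units)
            h: decision threshold (in std units)
            """
            z = (x - x[:50].mean()) / x[:50].std()   # standardize on a burn-in window
            g_pos = g_neg = 0.0
            alarms = []
            for i, zi in enumerate(z):
                g_pos = max(0.0, g_pos + zi - k)
                g_neg = max(0.0, g_neg - zi - k)
                if g_pos > h or g_neg > h:
                    alarms.append(i)
                    g_pos = g_neg = 0.0              # restart after an alarm
            return alarms

        rng = np.random.default_rng(7)
        kpi = np.concatenate([rng.normal(0, 1, 300),      # stable regime
                              rng.normal(1.5, 1, 200)])   # level shift at t = 300
        print(cusum_breaks(kpi))   # first alarm shortly after the break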

  • 150.
    Heimbürger, Hjalmar
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Modelling of Stochastic Volatility using Partially Observed Markov Models2016Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE creditsStudent thesis
    Abstract [en]

    In this thesis, the calibration of stochastic volatility models that allow correlation between the volatility and the returns is considered. To achieve this, the dynamics are modelled as an extension of hidden Markov models and a special case of partially observed Markov models. The thesis shows that such models can be calibrated using sequential Monte Carlo methods, and that a model with correlation provides a better fit to the observed data. However, the results are not conclusive, and more research is needed to confirm this for other data sets and models.
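
    A minimal sketch of the kind of calibration machinery involved (a bootstrap-type particle filter for a toy stochastic volatility model with leverage; the model and parameters are illustrative, not the thesis's): particles are weighted by the observation density and propagated conditionally on the realized return shock, which is where the return-volatility correlation enters.

        # Bootstrap-type particle filter for a toy SV model with leverage:
        #   x_t = mu + phi * (x_{t-1} - mu) + sig * eta_{t-1}
        #   y_t = exp(x_t / 2) * eps_t,   corr(eps_t, eta_t) = rho
        import numpy as np

        rng = np.random.default_rng(8)
        mu, phi, sig, rho = -1.0, 0.97, 0.15, -0.6
        T, N = 500, 2_000                     # time steps, particles

        # Simulate synthetic returns from the model.
        eps = rng.standard_normal(T)
        eta = rho * eps + np.sqrt(1 - rho**2) * rng.standard_normal(T)
        x = np.empty(T)
        x[0] = mu
        for t in range(1, T):
            x[t] = mu + phi * (x[t - 1] - mu) + sig * eta[t - 1]
        y = np.exp(x / 2) * eps

        # Filter: weight by the observation density y_t | x_t ~ N(0, e^{x_t}),
        # resample, then propagate conditionally on the implied return shock.
        particles = mu + sig / np.sqrt(1 - phi**2) * rng.standard_normal(N)
        loglik = 0.0
        x_filt = np.empty(T)
        for t in range(T):
            logw = -0.5 * (np.log(2 * np.pi) + particles
                           + y[t] ** 2 * np.exp(-particles))
            w = np.exp(logw - logw.max())
            loglik += np.log(w.mean()) + logw.max()
            w /= w.sum()
            x_filt[t] = w @ particles
            idx = rng.choice(N, size=N, p=w)                  # multinomial resampling
            eps_t = y[t] * np.exp(-particles[idx] / 2)        # implied return shock
            eta_t = rho * eps_t + np.sqrt(1 - rho**2) * rng.standard_normal(N)
            particles = mu + phi * (particles[idx] - mu) + sig * eta_t

        print("log-likelihood:", loglik,
              " mean abs error:", np.abs(x_filt - x).mean())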
