151-200 of 464
  • 151.
    Forsman, Mikael
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    A Model Implementation of Incremental Risk Charge (2012). Independent thesis, Advanced level (degree of Master (One Year)), 20 credits / 30 HE credits. Student thesis.
    Abstract [en]

    In 2009 the Basel Committee on Banking Supervision released the final guidelines for computing capital for the Incremental Risk Charge, which is a complement to the traditional Value at Risk intended to measure the migration risk and the default risk in the trading book. Before Basel III comes into effect, banks will have to develop their own Incremental Risk Charge models following these guidelines. This thesis describes the development of such a model, which computes the capital charge for a portfolio of corporate bonds. Essential input parameters such as the credit ratings of the underlying issuers, credit spreads, recovery rates at default, liquidity horizons and correlations among the positions in the portfolio are discussed. Also required in the model is the transition matrix with probabilities of migrating between different credit states, which is estimated from historical data from the rating agency Moody's. Several sensitivity analyses and stress tests are then made by generating different scenarios, running them in the model and comparing the results to a base case. As it turns out, the default risk accounts for the largest part of the Incremental Risk Charge.
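The migration-matrix mechanism described in this abstract can be sketched in a few lines. The matrix, rating states and exposure figures below are hypothetical illustrations, not the thesis's calibrated inputs; the sketch only shows how one-period defaults and losses follow from a transition matrix.

```python
import numpy as np

# Hypothetical one-year transition matrix over states (A, B, Default).
# Rows sum to 1; Default is absorbing.
P = np.array([
    [0.90, 0.08, 0.02],   # from A
    [0.05, 0.85, 0.10],   # from B
    [0.00, 0.00, 1.00],   # from Default
])

rng = np.random.default_rng(7)
n_issuers = 10_000
start = np.zeros(n_issuers, dtype=int)           # all issuers start in state A

# Draw each issuer's state after one period from its matrix row.
next_state = np.array([rng.choice(3, p=P[s]) for s in start])

exposure, recovery = 1.0, 0.4                    # hypothetical per-issuer figures
defaults = next_state == 2
loss = exposure * (1.0 - recovery) * defaults.sum()

print(f"default fraction: {defaults.mean():.4f}, portfolio loss: {loss:.0f}")
```

With all issuers starting in state A, the simulated default fraction converges to the matrix entry P[A, Default] as the portfolio grows.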

  • 152. Fosgerau, M.
    et al.
    Lindberg, P. O.
    Mattsson, Lars-Göran
    KTH, School of Architecture and the Built Environment (ABE), Transport Science. KTH, School of Architecture and the Built Environment (ABE), Centres, Centre for Transport Studies, CTS.
    Weibull, Jörgen
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.). Stockholm School of Economics, Sweden.
    A note on the invariance of the distribution of the maximum (2018). In: Journal of Mathematical Economics, ISSN 0304-4068, E-ISSN 1873-1538, Vol. 74, p. 56-61. Article in journal (Refereed).
    Abstract [en]

    Many models in economics involve discrete choices where a decision-maker selects the best alternative from a finite set. Viewing the array of values of the alternatives as a random vector, the decision-maker draws a realization and chooses the alternative with the highest value. The analyst is then interested in the choice probabilities and in the value of the best alternative. The random vector has the invariance property if the distribution of the value of a specific alternative, conditional on that alternative being chosen, is the same, regardless of which alternative is considered. This note shows that the invariance property holds if and only if the marginal distributions of the random components are positive powers of each other, even when allowing for quite general statistical dependence among the random components. We illustrate the analytical power of the invariance property by way of examples.
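The invariance property can be checked numerically. Below is a small Monte Carlo sketch (not from the paper) using two Gumbel-type marginals that are powers of each other, F and F^a: conditional on either alternative being the maximum, the empirical distribution of the maximum should be the same.

```python
import math
import random

rng = random.Random(0)
a, n = 2.0, 200_000

max_if_1, max_if_2 = [], []
for _ in range(n):
    u1, u2 = rng.random(), rng.random()
    x1 = -math.log(-math.log(u1))        # CDF F(x) = exp(-exp(-x))
    x2 = -math.log(-math.log(u2) / a)    # CDF F(x)**a, sampled by inverse transform
    m = max(x1, x2)
    (max_if_1 if x1 > x2 else max_if_2).append(m)

mean1 = sum(max_if_1) / len(max_if_1)
mean2 = sum(max_if_2) / len(max_if_2)
print(f"E[max | 1 chosen] = {mean1:.3f}, E[max | 2 chosen] = {mean2:.3f}")
```

The two conditional means agree up to Monte Carlo error, and alternative 2 is chosen with probability a/(1+a), consistent with the choice probabilities implied by power-related marginals.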

  • 153.
    Fransson, Viktor
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Graphical lasso for covariance structure learning in the high dimensional setting (2015). Independent thesis, Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis.
    Abstract [en]

    This thesis considers the estimation of undirected Gaussian graphical models especially in the high dimensional setting where the true observations are assumed to be non-Gaussian distributed.

    The first aim is to present and compare the performance of existing Gaussian graphical model estimation methods. Since these models rely heavily on the normality assumption, various methods for relaxing this assumption are also presented. In addition to the existing methods, a modified version of the joint graphical lasso method is introduced which capitalizes on the strengths of the community Bayes method. The community Bayes method is used to partition the features (or variables) of datasets consisting of several classes into communities that are estimated to be mutually independent within each class, which allows the calculations of the joint graphical lasso method to be split into several smaller parts. The method is also inspired by the cluster graphical lasso and is applicable to both Gaussian and non-Gaussian data, provided the normality assumption is relaxed.

    Results show that the introduced cluster joint graphical lasso method outperforms competing methods, producing graphical models which are easier to comprehend thanks to the added information obtained from the clustering step. The cluster joint graphical lasso was applied to a real dataset consisting of p = 12582 features, which resulted in a computational gain of a factor of 35 compared to the competing method; this is very significant when analysing large datasets. The method also allows for parallelization, where computations can be spread across several computers, greatly increasing computational efficiency.
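The splitting idea above — partition features into groups estimated to be mutually independent, then solve each block separately — rests on finding connected components of the thresholded absolute covariance. A minimal sketch of that screening step (hypothetical data, not the thesis's implementation):

```python
import numpy as np

def covariance_blocks(S, thresh):
    """Connected components of the graph with an edge wherever |S_ij| > thresh."""
    p = S.shape[0]
    adj = np.abs(S) > thresh
    seen, blocks = set(), []
    for start in range(p):
        if start in seen:
            continue
        stack, comp = [start], set()
        while stack:                      # depth-first search over features
            i = stack.pop()
            if i in comp:
                continue
            comp.add(i)
            stack.extend(j for j in range(p) if adj[i, j] and j != i)
        seen |= comp
        blocks.append(comp)
    return blocks

# Two independent feature groups produce two blocks.
S = np.array([
    [1.0, 0.8, 0.0, 0.0],
    [0.8, 1.0, 0.0, 0.0],
    [0.0, 0.0, 1.0, 0.5],
    [0.0, 0.0, 0.5, 1.0],
])
blocks = covariance_blocks(S, thresh=0.1)
print(blocks)
```

Each block can then be estimated independently, which is where the factor-35 computational gain reported above comes from.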

  • 154.
    Fredriksson, Gustav
    et al.
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Hellström, Anton
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Restricted Boltzmann Machine as Recommendation Model for Venture Capital (2019). Independent thesis, Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis.
    Abstract [en]

    This study introduces restricted Boltzmann machines (RBMs) as a recommendation model in the context of venture capital. A network of relations is used as a proxy for modelling investors' company preferences. The main focus of the study is to examine how RBMs can be implemented for a dataset consisting of relations between people and companies, and to investigate whether the model can be improved by adding further information.

    The network is created from the board compositions of Swedish companies. For this network, RBMs are implemented both with and without the additional information of the companies' place of origin. Each RBM model is examined by evaluating its learning ability and its ability to recreate manually hidden relations.

    The results show that the RBM models have a deficient ability to recreate removed relations, although good learning ability is noted. By adding place of origin as extra information, the models improve markedly and show good potential as recommendation models, with respect to both learning ability and the ability to recreate hidden relations.

  • 155.
    Fröling, Anton
    et al.
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Lahdo, Sandy
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    A smoother and more up-to-date development of the income pension (2016). Independent thesis, Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis.
    Abstract [en]

    For an apparatus as big as the pension system, financial stability is essential. An important feature of the existing pension system is the balance mechanism, which secures the stability of the system. The balance ratio is obtained by dividing the assets by the liabilities. When this ratio drops below 1.0000, it triggers the so-called automatic balancing.

    While the existing pension system has achieved its goal of being financially stable, it has become clear that the indexation of the pensions during balancing periods has properties that are not optimal. In a short-term perspective, the income pension system is exposed to the risk of reacting with a lag, or of reacting unnecessarily strongly. This gave rise to a new legislative proposal, issued by the government. The goal of the proposal is to obtain a smoother and more up-to-date development of the income pension, i.e. a shorter lag period, without jeopardizing the financial stability. In addition, it is also desirable to simplify and improve the existing calculation methods. In order to compare the existing calculation methods in the pension system with the new legislative proposal, a simplified model of the existing pension system and a modified version of it are created.

    The results of this study show that the new legislative proposal decreases the volatility of the pensions and avoids the deepest valleys in the balance ratio. The development of the pension disbursements in the new system has a higher correlation with the development of the average pension-qualifying income than in the current system. Moreover, the results show that the new system has a shorter lag period, which makes the income pension system more up-to-date with the current economic and demographic situation.

    Financial stability is still maintained, and the new system also handles variations in the inflation better than the current system.
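The trigger described in this abstract — divide assets by liabilities and activate balancing below 1.0000 — can be written down directly. The figures below are made up for illustration; the actual indexation rules are considerably more involved.

```python
def balance_ratio(assets, liabilities):
    """Balance ratio of the pension system: assets divided by liabilities."""
    return assets / liabilities

def indexation_factor(income_index_growth, ratio):
    """During automatic balancing, indexation is scaled down by the balance ratio."""
    if ratio < 1.0:              # automatic balancing triggered
        return income_index_growth * ratio
    return income_index_growth

ratio = balance_ratio(assets=9_100, liabilities=9_300)   # hypothetical figures
factor = indexation_factor(1.03, ratio)
print(f"balance ratio {ratio:.4f}, indexation factor {factor:.4f}")
```

With liabilities exceeding assets, the ratio falls below 1.0000 and the indexation of pensions is dampened accordingly; once the ratio recovers above 1, indexation follows income growth again.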

  • 156.
    Galijasevic, Amar
    et al.
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Tegbaru, Josef
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Can IPO first day returns be predicted? A multiple linear regression analysis (2019). Independent thesis, Basic level (degree of Bachelor), 10 credits / 15 HE credits. Student thesis.
    Abstract [en]

    During the last three years the Swedish stock market has shown a strong upward movement from the lows of 2016. At the same time, IPO activity has been high and many of the offerings have had a positive return during the first day of trading in the market.

    The goal of this study is to analyze whether any particular IPO-specific data correlates with the first day return and whether it can be used to predict the first day return of future IPOs. If any regressors were shown to correlate with the first day return, the goal was also to find a subset of regressors with even higher predictive power, and to classify which regressors show the highest correlation with a large positive return. The method used is a multiple linear regression on IPO data from the period 2017-2018.

    The results of the study imply that none of the chosen regressors show any significant correlation with the first day return. Pricing an IPO is a complicated process which might be difficult to simplify and quantify in a regression model, and further studies are needed to conclude whether there are any other, qualitative factors which correlate with the first day return.
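The mechanics of the method — fit a multiple linear regression by ordinary least squares and inspect the coefficients of candidate regressors — can be sketched as follows. The data here are synthetic stand-ins, not the thesis's IPO dataset.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 60
# Synthetic candidate regressors (hypothetical, e.g. offer size, sector, market return).
X = rng.normal(size=(n, 3))
true_beta = np.array([0.5, 0.0, -0.2])
y = X @ true_beta + rng.normal(scale=0.1, size=n)    # stand-in "first day return"

X_design = np.column_stack([np.ones(n), X])          # add an intercept column
coef, *_ = np.linalg.lstsq(X_design, y, rcond=None)  # OLS estimates
print("intercept and slopes:", np.round(coef, 3))
```

On this synthetic data the estimated slopes recover the true coefficients; on real IPO data, significance tests on the slopes decide whether any regressor carries predictive information.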

  • 157.
    Gallais, Arnaud
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    CPPI Structures on Funds Derivatives (2011). Independent thesis, Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis.
    Abstract [en]

    With the ever-increasing complexity of financial markets and financial products, many investors now choose to benefit from a manager's expertise by investing in a fund. This has fueled a rapid growth of the fund industry over the past decades, and the recent emergence of complex derivative products written on underlying funds. The diversity (hedge funds, mutual funds, funds of funds, managed accounts…) and the particularities (liquidity, specific risks) of funds call for adapted models and suitable risk management. This thesis aims at understanding the issues and difficulties met when dealing with such products. In particular, we deal to a great extent with CPPI (Constant Proportion Portfolio Insurance) structures written on funds, which combine the specificities of funds with the particularities of such structures. Correctly assessing the corresponding market risks is a challenging issue and the subject of many investigations.

  • 158.
    Galushin, Sergey
    et al.
    KTH, School of Engineering Sciences (SCI), Physics, Nuclear Power Safety. KTH, School of Engineering Sciences (SCI), Physics, Nuclear Engineering.
    Ranlöf, L.
    Bäckström, O.
    Adolfsson, Y.
    Grishchenko, Dmitry
    KTH, School of Engineering Sciences (SCI), Physics, Nuclear Power Safety.
    Kudinov, Pavel
    KTH, School of Engineering Sciences (SCI), Physics, Nuclear Power Safety.
    Marklund, A. R.
    Joint application of risk oriented accident analysis methodology and PSA level 2 to severe accident issues in Nordic BWR (2018). In: PSAM 2018 - Probabilistic Safety Assessment and Management, International Association for Probabilistic Safety Assessment and Management (IAPSAM), 2018. Conference paper (Refereed).
    Abstract [en]

    A comprehensive and robust assessment of severe accident management effectiveness in preventing unacceptable releases is a challenge for today's real-life PSA. This is mainly because the major uncertainty is determined by the physical phenomena and the timing of the events. Static PSA is built on choosing scenario parameters to describe the accident progression sequence and typically uses a limited number of simulations in the underlying deterministic analysis. The Risk Oriented Accident Analysis Methodology framework (ROAAM+) is being developed in order to enable consistent and comprehensive treatment of both epistemic and aleatory uncertainties. The framework is based on a set of deterministic models that describe different stages of the accident progression. The results are presented in terms of distributions of conditional containment failure probabilities for given combinations of the scenario parameters. This information is used for enhanced modeling in the PSA-L2. Specifically, it includes improved definitions of the sequences determined by the physical phenomena rather than by stochastic failures of the equipment, improved knowledge of timing in sequences, and estimation of probabilities determined by the uncertainties in the phenomena. In this work we present an example of the application of the dynamic approach in a large-scale PSA model and show that the integration of the ROAAM+ results and the PSA model can potentially lead to a considerable change in PSA Level 2 analysis results.

  • 159.
    Georgelis, Nikos
    et al.
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Nyberg, Mikael
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    A Scenario Based Allocation Model Using Entropy Pooling for Computing the Scenario Probabilities (2013). Independent thesis, Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis.
    Abstract [en]

    We introduce a scenario based allocation model (SBAM) that uses entropy pooling for computing scenario probabilities. Compared to most other models that allow the investor to blend historical data with subjective views about the future, the SBAM does not require the investor to quantify a level of confidence in the subjective views.

    A quantitative test is performed on a simulated systematic fund offered by the fund company Informed Portfolio Management in Stockholm, Sweden. The simulated fund under study consists of four individual systematic trading strategies and the test is simulated on a monthly basis during the years 1986-2010.

    We study how the selection of views might affect the SBAM portfolios, creating three systematic views and combining them in different variations to create seven SBAM portfolios. We also compare how the size of the sample data affects the results.

    Furthermore, the SBAM is compared to more common allocation methods, namely an equally weighted portfolio and a portfolio optimization based only on historical data.

    We find that the SBAM portfolios produced higher annual returns and information ratios than the equally weighted portfolio or the portfolio optimized only on historical data.

  • 160.
    Giertz Jonsson, Fredrik
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Analysis and Optimization of a Portfolio of Catastrophe Bonds (2014). Independent thesis, Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis.
    Abstract [en]

    This Master's thesis in mathematical statistics has two major purposes: (i) to model and measure the risk associated with a special type of reinsurance contract, the catastrophe bond, and (ii) to analyze and develop methods of portfolio optimization suitable for a portfolio of catastrophe bonds. Two pathways of modeling potential catastrophe bond losses are analyzed: one method directly modeling potential contract losses, and one method modeling the variables governing the underlying contract losses. The first method is simple in structure but has the disadvantage that a dependence structure between the losses of different contracts cannot be introduced in a simple and flexible way. The second method uses a representation with a stochastic number of stochastic events, connected into a multivariate dependence structure using the theory of copulas.

    Results show that the choice of risk measure is of great importance when analyzing catastrophe bonds and their related risks. For example, the measure Value at Risk often fails to capture the essence of catastrophe bond risk, which in turn means that portfolio optimization with respect to it might lead to a systematic obscuring of risk. Two coherent risk measures were investigated, the spectral risk measure and the Expected Shortfall measure. Both measures provide a good representation of the risk of a portfolio consisting of catastrophe bonds.

    This thesis extends and applies a well-known optimization method for Conditional Value at Risk to obtain a method of optimization of spectral risk measures. The optimization results show that Expected Shortfall optimization leads to portfolios that are advantageous at the specific point at which they are optimized, but whose characteristics may be disadvantageous at other parts of the loss distribution. Portfolios optimized for the spectral risk measure were shown to possess good characteristics across the entire loss distribution. The optimization results were compared to the popular mean-variance portfolio optimization approach. The comparison shows that the mean-variance approach handles the special distribution of catastrophe bond losses in an over-simplistic way, and that it severely lacks flexibility towards focusing on different aspects of risk. The spectral risk measure optimization procedure was demonstrated to be the most flexible and possibly the most appropriate way to optimize a portfolio of catastrophe bonds.
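The point about Value at Risk versus Expected Shortfall is easy to see on an empirical loss sample: VaR reads off a single quantile, whereas Expected Shortfall averages everything beyond it, so a heavy tail past the quantile moves ES but barely moves VaR. A sketch on a hypothetical loss distribution (not the thesis's catastrophe bond data):

```python
import numpy as np

def var_es(losses, level):
    """Empirical Value at Risk and Expected Shortfall at the given level."""
    var = np.quantile(losses, level)
    es = losses[losses >= var].mean()   # average loss beyond the quantile
    return var, es

rng = np.random.default_rng(3)
# Mostly small losses plus a rare catastrophic layer (hypothetical mixture).
losses = np.concatenate([
    rng.exponential(1.0, size=99_000),
    rng.exponential(25.0, size=1_000),
])
var99, es99 = var_es(losses, 0.99)
print(f"VaR 99%: {var99:.2f}, ES 99%: {es99:.2f}")
```

On this sample ES at the 99% level is several times larger than VaR, because the catastrophic layer lives almost entirely beyond the 99% quantile — exactly the kind of tail risk that VaR-based optimization can systematically obscure.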
  • 161.
    Gobeljic, Persa
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Classification of Probability of Default and Rating Philosophies (2012). Independent thesis, Advanced level (degree of Master (One Year)), 20 credits / 30 HE credits. Student thesis.
    Abstract [en]

    Basel II consists of international recommendations on banking regulations, mainly concerning how much capital banks and other financial institutions should set aside in order to protect themselves from various types of risk. Implementing Basel II involves estimating risks; one of the main measurements is Probability of Default. Both firm-specific and macroeconomic risks cause obligors to default, and the two risk factors must be separated in order to determine which of them affects the Probability of Default over the years. The aim of this thesis is to enable a separation of the risk variables in the structure of Probability of Default in order to classify the rating philosophy.

  • 162. González, R. A.
    et al.
    Valenzuela, P. E.
    Rojas, C. R.
    Rojas, R. A.
    Optimal enforcement of causality in non-parametric transfer function estimation (2017). In: IEEE Control Systems Letters, ISSN 2475-1456, Vol. 1, no 2, p. 268-273. Article in journal (Refereed).
    Abstract [en]

    Traditionally, non-parametric impulse and frequency response functions are estimated by taking the ratio of power spectral density estimates. However, this approach may often lead to non-causal estimates. In this letter, we derive a closed form expression for the impulse response estimator by smoothed empirical transfer function estimate, which allows optimal enforcement of causality on non-parametric estimators based on spectral analysis. The new method is shown to be asymptotically unbiased and of minimum covariance in a positive semidefinite sense among a broad class of linear estimators. Numerical simulations illustrate the performance of the new estimator. 

  • 163.
    Grandell, Jan
    et al.
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Schmidli, Hanspeter
    Ruin probabilities in a diffusion environment (2011). In: Journal of Applied Probability, ISSN 0021-9002, E-ISSN 1475-6072, Vol. 48A, p. 39-50. Article in journal (Refereed).
    Abstract [en]

    We consider an insurance model where the underlying point process is a Cox process. Using a martingale approach applied to diffusion processes, finite-time Lundberg inequalities are obtained. By change-of-measure techniques, Cramér-Lundberg approximations are derived.

  • 164.
    Granström, Daria
    et al.
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Abrahamsson, Johan
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Loan Default Prediction using Supervised Machine Learning Algorithms (2019). Independent thesis, Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis.
    Abstract [en]

    It is essential for a bank to estimate the credit risk it carries and the magnitude of its exposure in case of non-performing customers. Estimation of this kind of risk has been done with statistical methods for decades and, given recent developments in the field of machine learning, there has been interest in investigating whether machine learning techniques can quantify the risk better. The aim of this thesis is to examine which method from a chosen set of machine learning techniques exhibits the best performance in default prediction with regard to chosen model evaluation parameters. The investigated techniques were Logistic Regression, Random Forest, Decision Tree, AdaBoost, XGBoost, Artificial Neural Network and Support Vector Machine. An oversampling technique called SMOTE was implemented in order to treat the imbalance between classes of the response variable. The results showed that XGBoost without SMOTE obtained the best result with respect to the chosen model evaluation metric.
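The SMOTE step mentioned above interpolates new minority-class points between a minority sample and one of its minority-class nearest neighbours. A bare-bones NumPy sketch of the core idea (the thesis presumably used a library implementation; this is only illustrative):

```python
import numpy as np

def smote(X_min, n_new, k=5, seed=0):
    """Generate n_new synthetic minority samples by interpolation (basic SMOTE)."""
    rng = np.random.default_rng(seed)
    synthetic = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))
        # k nearest minority-class neighbours of X_min[i] (excluding itself)
        d = np.linalg.norm(X_min - X_min[i], axis=1)
        neighbours = np.argsort(d)[1:k + 1]
        j = rng.choice(neighbours)
        lam = rng.random()                        # interpolation weight in [0, 1)
        synthetic.append(X_min[i] + lam * (X_min[j] - X_min[i]))
    return np.array(synthetic)

# Hypothetical minority-class points in two features.
X_min = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0],
                  [0.5, 0.5], [0.2, 0.8]])
X_new = smote(X_min, n_new=10)
print(X_new.shape)
```

Each synthetic point is a convex combination of two existing minority samples, so the oversampled class stays inside the convex hull of the original minority data.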

  • 165.
    Grossman, Mikael
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Proposal networks in object detection (2019). Independent thesis, Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis.
    Abstract [en]

    Locating and extracting useful data from images is a task that has been revolutionized in the last decade, as computing power has risen to a level where deep neural networks can be used with success. A type of neural network that uses the convolution operation, the convolutional neural network (CNN), is suited for image-related tasks. Using the convolution operation creates opportunities for the network to learn its own filters, which previously had to be hand-engineered. For locating objects in an image, the state-of-the-art Faster R-CNN model predicts objects in two parts. First, the region proposal network (RPN) extracts regions of the picture where it is likely to find an object. Second, a detector verifies the likelihood of an object being in that region. For this thesis, we review the current literature on artificial neural networks, object detection methods and proposal methods, and present our new way of generating proposals. By replacing the RPN with our network, the multiscale proposal network (MPN), we increase the average precision (AP) by 12% and reduce the computation time per image by 10%.

  • 166.
    Gruselius, Hanna
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Generative Models and Feature Extraction on Patient Images and Structure Data in Radiation Therapy (2018). Independent thesis, Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis.
    Abstract [en]

    This Master's thesis focuses on generative models for medical patient data in radiation therapy. The objective of the project is to implement and investigate the characteristics of a Variational Autoencoder (VAE) applied to this diverse and versatile data. The questions this thesis aims to answer are: (i) whether the VAE can capture salient features of medical image data, and (ii) whether these features can be used to compare similarity between patients; furthermore, (iii) whether the VAE network can successfully reconstruct its input, and lastly (iv) whether the VAE can generate artificial data with a reasonable anatomical appearance. The experiments carried out showed that the VAE is a promising method for feature extraction, since it appeared to ascertain similarity between patient images. Moreover, the reconstruction of training inputs demonstrated that the method is capable of identifying and preserving anatomical details. Regarding the generative abilities, the artificial samples generally conveyed fairly realistic anatomical structures. Future work could investigate the VAE's ability to generalize, with respect to both the amount of data and the probabilistic assumptions made.

  • 167.
    Grönberg, Fredrik
    et al.
    KTH, School of Engineering Sciences (SCI), Physics, Physics of Medical Imaging.
    Danielsson, Mats
    KTH, School of Engineering Sciences (SCI), Physics, Physics of Medical Imaging.
    Sjölin, Martin
    KTH, School of Engineering Sciences (SCI), Physics, Physics of Medical Imaging.
    Count statistics of nonparalyzable photon-counting detectors with nonzero pulse length (2018). In: Medical physics (Lancaster), ISSN 0094-2405, Vol. 45, no 8, p. 3800-3811. Article in journal (Refereed).
    Abstract [en]

    Purpose: Photon-counting detectors are expected to be the next big step in the development of medical computed tomography (CT). Accurate modeling of the behavior of photon-counting detectors in both low and high count rate regimes is important for accurate image reconstruction and detector performance evaluations. The commonly used ideal nonparalyzable (delta pulse) model is built on crude assumptions that make it unsuitable for predicting the behavior of photon-counting detectors at high count rates. The aim of this work is to present an analytical count statistics model that better describes the behavior of photon-counting detectors with nonzero pulse length.

    Methods: An analytical statistical count distribution model for nonparalyzable detectors with nonzero pulse length is derived using tools from statistical analysis. To validate the model, a nonparalyzable photon-counting detector is simulated using Monte Carlo methods and compared against. Image performance metrics are computed using the Fisher information metric, and a comparison between the proposed model, approximations of the proposed model, and those made by the ideal nonparalyzable model is presented and analyzed.

    Results: It is shown that the presented model agrees well with the results from the Monte Carlo simulation and is stable for varying x-ray beam qualities. It is also shown that a simple Gaussian approximation of the distribution can be used to accurately model the behavior and performance of nonparalyzable detectors with nonzero pulse length. Furthermore, the comparison of performance metrics shows that the proposed model predicts a very different behavior than the ideal nonparalyzable detector model, suggesting that the proposed model can fill an important gap in the understanding of pileup effects.

    Conclusions: An analytical model for the count statistics of a nonparalyzable photon-counting detector with nonzero pulse length is presented. The model agrees well with results obtained from Monte Carlo simulations and can be used to improve, speed up and simplify the modeling of photon-counting detectors.
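The ideal nonparalyzable (delta pulse) model that the paper improves on has a well-known closed form: with true rate λ and dead time τ, the registered rate is λ/(1 + λτ). The Monte Carlo sketch below checks that formula for a zero-pulse-length detector; the parameters are arbitrary illustrative choices, not from the paper.

```python
import random

def registered_rate(lam, tau, t_total, seed=0):
    """Simulate a nonparalyzable counter: photons arriving within the dead time are lost."""
    rng = random.Random(seed)
    t, last_registered, count = 0.0, -float("inf"), 0
    while True:
        t += rng.expovariate(lam)            # Poisson photon arrivals
        if t > t_total:
            break
        if t - last_registered >= tau:       # detector is live again
            count += 1
            last_registered = t
    return count / t_total

lam, tau = 5.0, 0.1
measured = registered_rate(lam, tau, t_total=5_000.0)
expected = lam / (1.0 + lam * tau)           # ideal nonparalyzable model
print(f"measured {measured:.3f} vs model {expected:.3f}")
```

The simulated rate matches the closed form closely at this count rate; the paper's contribution is the regime where the pulse has nonzero length and this simple model breaks down.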

  • 168.
    Gudmundsson, Thorbjörn
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Markov chain Monte Carlo for rare-event simulation in heavy-tailed settings (2013). Licentiate thesis, monograph (Other academic).
  • 169.
    Gudmundsson, Thorbjörn
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Rare-event simulation with Markov chain Monte Carlo (2015). Doctoral thesis, comprehensive summary (Other academic).
    Abstract [en]

    Stochastic simulation is a popular method for computing probabilities or expecta- tions where analytical answers are difficult to derive. It is well known that standard methods of simulation are inefficient for computing rare-event probabilities and there- fore more advanced methods are needed to those problems.

    This thesis presents a new method based on Markov chain Monte Carlo (MCMC) algorithm to effectively compute the probability of a rare event. The conditional distri- bution of the underlying process given that the rare event occurs has the probability of the rare event as its normalising constant. Using the MCMC methodology a Markov chain is simulated, with that conditional distribution as its invariant distribution, and information about the normalising constant is extracted from its trajectory.

    In the first two papers of the thesis, the algorithm is described in full generality and applied to four problems of computing rare-event probability in the context of heavy- tailed distributions. The assumption of heavy-tails allows us to propose distributions which approximate the conditional distribution conditioned on the rare event. The first problem considers a random walk Y1 + · · · + Yn exceeding a high threshold, where the increments Y are independent and identically distributed and heavy-tailed. The second problem is an extension of the first one to a heavy-tailed random sum Y1+···+YN exceeding a high threshold,where the number of increments N is random and independent of Y1 , Y2 , . . .. The third problem considers the solution Xm to a stochastic recurrence equation, Xm = AmXm−1 + Bm, exceeding a high threshold, where the innovations B are independent and identically distributed and heavy-tailed and the multipliers A satisfy a moment condition. The fourth problem is closely related to the third and considers the ruin probability for an insurance company with risky investments.

    In the last two papers of the thesis, the algorithm is extended to the context of light-tailed distributions and applied to four problems. The light-tail assumption ensures the existence of a large deviation principle or Laplace principle, which in turn allows us to propose distributions which approximate the conditional distribution given the rare event. The first problem considers a random walk Y1 + · · · + Yn exceeding a high threshold, where the increments Y are independent and identically distributed and light-tailed. The second problem considers discrete-time Markov chains and the computation of general rare-event expectations of their sample paths. The third problem extends the discrete-time setting to Markov chains in continuous time. The fourth problem is closely related to the third and considers a birth-and-death process with spatial intensities and the computation of first passage probabilities.

    For each problem, an unbiased estimator of the reciprocal probability is constructed with efficient rare-event properties. The algorithms are illustrated numerically and compared to existing importance sampling algorithms.
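    The estimator described above can be illustrated with a minimal sketch (not the authors' code; the Pareto increments, the two-step walk n = 2, and the particular auxiliary density used as a weight are assumptions chosen for the demo). A Gibbs sampler targets the conditional law of (Y1, Y2) given Y1 + Y2 > c, and the trajectory average of a known density ratio estimates the reciprocal probability 1/p:

```python
import random

ALPHA = 2.0   # Pareto tail index: survival function (1 + t)^(-ALPHA)
C = 20.0      # threshold defining the rare event {Y1 + Y2 > C}

def survival(t):
    """Survival function of the Pareto(ALPHA) law on (0, infinity)."""
    return (1.0 + max(t, 0.0)) ** (-ALPHA)

def sample_tail(t, rng):
    """Draw Y ~ Pareto(ALPHA) conditioned on Y > max(t, 0), by inversion."""
    t = max(t, 0.0)
    u = 1.0 - rng.random()                  # u in (0, 1]
    return (1.0 + t) * u ** (-1.0 / ALPHA) - 1.0

def mcmc_reciprocal(n_iter, rng, burn=1000):
    """Gibbs sampler whose invariant law is f(y1) f(y2) 1{y1+y2 > C} / p.

    Under that law the weight w = 1 / survival(C - y1) has expectation 1/p,
    so its trajectory average gives a consistent estimate of 1/p."""
    y1, y2 = C, C                           # start inside the rare event
    acc, kept = 0.0, 0
    for i in range(n_iter + burn):
        y1 = sample_tail(C - y2, rng)       # full conditional of y1 given y2
        y2 = sample_tail(C - y1, rng)       # full conditional of y2 given y1
        if i >= burn:
            acc += 1.0 / survival(C - y1)
            kept += 1
    return acc / kept                       # estimate of 1/p

if __name__ == "__main__":
    rng = random.Random(0)
    p_hat = 1.0 / mcmc_reciprocal(200_000, rng)
    print(f"MCMC estimate of P(Y1 + Y2 > {C}) = {p_hat:.5f}")
```

The weight 1/survival(C - y1) is the ratio v/f of an auxiliary density v supported exactly on the rare event to the original density f, which is what makes its stationary mean equal to 1/p.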

  • 170.
    Gudmundsson, Thorbjörn
    et al.
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Hult, Henrik
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Markov chain Monte Carlo for computing rare-event probabilities for a heavy-tailed random walk2014In: Journal of Applied Probability, ISSN 0021-9002, E-ISSN 1475-6072, Vol. 51, no 2, p. 359-376Article in journal (Refereed)
    Abstract [en]

    In this paper a method based on a Markov chain Monte Carlo (MCMC) algorithm is proposed to compute the probability of a rare event. The conditional distribution of the underlying process given that the rare event occurs has the probability of the rare event as its normalizing constant. Using the MCMC methodology, a Markov chain is simulated, with the aforementioned conditional distribution as its invariant distribution, and information about the normalizing constant is extracted from its trajectory. The algorithm is described in full generality and applied to the problem of computing the probability that a heavy-tailed random walk exceeds a high threshold. An unbiased estimator of the reciprocal probability is constructed whose normalized variance vanishes asymptotically. The algorithm is extended to random sums and its performance is illustrated numerically and compared to existing importance sampling algorithms.

  • 171.
    Gudmundsson, Thorbjörn
    et al.
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Hult, Henrik
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Markov chain Monte Carlo for rare-event simulation for light-tailed random walkManuscript (preprint) (Other academic)
  • 172.
    Gudmundsson, Thorbjörn
    et al.
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Hult, Henrik
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Markov chain Monte Carlo for rare-event simulation for Markov chainsManuscript (preprint) (Other academic)
  • 173.
    Gudmundsson, Thorbjörn
    et al.
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Hult, Henrik
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Markov chain Monte Carlo for rare-event simulation for stochastic recurrence equations with heavy-tailed innovationsManuscript (preprint) (Other academic)
  • 174.
    Guinaudeau, Alexandre
    KTH, School of Electrical Engineering and Computer Science (EECS).
    Estimating the probability of event occurrence2019Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE creditsStudent thesis
    Abstract [en]

    In complex systems, anomalous behaviors can occur intermittently and stochastically, making it hard to distinguish real errors from spurious ones. These errors are often hard to troubleshoot and require close attention, but troubleshooting each occurrence is time-consuming and not always an option.

    In this thesis, we define two different models to estimate the underlying probability of occurrence of an error, one based on binary segmentation and null hypothesis testing, and the other one based on hidden Markov models. Given a threshold level of confidence, these models are tuned to trigger alerts when a change is detected with sufficiently high probability.

    We generated events drawn from Bernoulli distributions emulating these anomalous behaviors to benchmark the two candidate models. Both models have the same sensitivity, δp ≈ 10%, and delay, δt ≈ 100 observations, in detecting change points. However, they do not generalize in the same way to broader problems and therefore provide two complementary solutions.
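    The hypothesis-testing flavor of the first model can be sketched as follows (an illustration only; the sliding-window two-proportion z-test, the window length, and the alarm threshold are our assumptions, not the thesis's exact construction):

```python
import math
import random

def detect_change(xs, window=150, z_crit=3.0):
    """Scan a binary stream and return the first index where the empirical
    rates of two adjacent windows differ by more than z_crit pooled
    standard errors (a simple two-proportion z-test), else None."""
    for t in range(window, len(xs) - window + 1):
        p1 = sum(xs[t - window:t]) / window     # rate just before t
        p2 = sum(xs[t:t + window]) / window     # rate just after t
        p = (p1 + p2) / 2.0
        se = math.sqrt(2.0 * p * (1.0 - p) / window)
        if se > 0.0 and abs(p2 - p1) / se > z_crit:
            return t
    return None

if __name__ == "__main__":
    rng = random.Random(1)
    # error probability jumps from 5% to 25% at observation 500
    xs = [int(rng.random() < (0.05 if t < 500 else 0.25)) for t in range(1000)]
    print("first alarm at index:", detect_change(xs))
```

Tuning `z_crit` trades false alarms against detection delay, which mirrors the thesis's idea of triggering alerts only at a chosen confidence level.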

  • 175.
    Gunnarsson, Simon
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Curve Building and Swap Pricing in the Presence of Collateral and Basis Spreads2013Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE creditsStudent thesis
    Abstract [en]

    The eruption of the financial crisis in 2008 caused immense widening of both domestic and cross-currency basis spreads. Also, as a majority of all fixed income contracts are now collateralized, the funding cost of a financial institution may deviate substantially from the domestic Libor. In this thesis, a framework for pricing collateralized interest rate derivatives that accounts for the existence of non-negligible basis spreads is implemented. It is found that losses corresponding to several percent of the outstanding notional may arise as a consequence of not adapting to the new market conditions.

  • 176.
    Gunnvald, Patrik
    et al.
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Joelsson, Viktor
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Analysis of Hedging Strategies for Hydro Power on the Nordic Power Market2015Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE creditsStudent thesis
    Abstract [en]

    Hydro power is the largest source for generation of electricity in the Nordic region today.

    This production is heavily dependent on the weather, since the weather dictates the availability and the amount of power that can be produced. Vattenfall as a company has an incentive to avoid volatile revenue streams, as stable revenues facilitate economic planning and have a positive effect on its credit rating, and thus on its bottom line. Vattenfall is a large producer of hydro power with the ability to move the power market, which adds further complexity to the problem. In this thesis the authors develop new hedging strategies that hedge more efficiently. By efficiency is meant the same risk, or standard deviation, at a lower cost, or alternatively a lower risk at the same cost. To enable comparison and make claims about efficiency, a reference solution is developed that reflects Vattenfall's current hedging strategy. To achieve higher efficiency we focus on finding dynamic hedging strategies. First, a prototype model is suggested to facilitate the construction of the solution methods and to determine whether further investigation is worthwhile. As the results from this initial prototype model showed substantial room for efficiency improvement, a larger main model, with parameters estimated from data, is constructed that captures the real-world scenario much better. Four different solution methods are developed and applied to this main model setup. The results are then compared to the reference strategy. We find that, even though the efficiency gain was smaller than the prototype model results first suggested, using these new hedging strategies could reduce costs by 1.5% to 5%. Although the final choice of hedging strategy is ultimately up to the end user, we suggest the strategy called BW to reduce costs and improve efficiency. The paper also discusses, among other things, the solution methods and hedging strategies, the notion of optimality, and the impact of the model parameters.

  • 177.
    Gustafsson, Alexander
    et al.
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Wogenius, Sebastian
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Modelling Apartment Prices with the Multiple Linear Regression Model2014Independent thesis Basic level (degree of Bachelor), 10 credits / 15 HE creditsStudent thesis
    Abstract [en]

    This thesis examines factors that are of most statistical significance for the sales prices of apartments in the Stockholm City Centre. Factors examined are address, area, balcony, construction year, elevator, fireplace, floor number, maisonette, monthly fee, penthouse and number of rooms. On the basis of this examination, a model for predicting the prices of apartments is constructed. In order to evaluate how the factors influence the price, this thesis analyses sales statistics, and the mathematical method used is the multiple linear regression model. In a minor case study and literature review, included in this thesis, the relationship between proximity to public transport and the prices of apartments in Stockholm is examined.

    The result of this thesis states that it is possible to construct a model, from the factors analysed, which can predict the prices of apartments in the Stockholm City Centre with a degree of explanation of 91% and a 95% confidence interval of two million SEK. Furthermore, a conclusion can be drawn that the model predicts lower-priced apartments more accurately. In the case study and literature review, the result supports the hypothesis that proximity to public transport has a positive effect on the price of an apartment. However, such a variable should be regarded with caution, since the purpose of the modelling differs between an individual application and a social economic application.
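    The core technique can be sketched in a few lines (synthetic data; the feature set, coefficient values, and noise level are illustrative assumptions, not the thesis's estimates):

```python
import numpy as np

rng = np.random.default_rng(0)               # hypothetical synthetic sales data
n = 500
area = rng.uniform(20, 120, n)               # m^2
floor = rng.integers(0, 8, n)                # floor number
fee = rng.uniform(1000, 6000, n)             # monthly fee, SEK
balcony = rng.integers(0, 2, n)              # 1 if the apartment has a balcony

# assumed "true" price model (in kSEK) plus noise, used only to generate data
price = (500 + 45 * area + 60 * floor - 0.2 * fee + 150 * balcony
         + rng.normal(0, 200, n))

# multiple linear regression: ordinary least squares on the design matrix
X = np.column_stack([np.ones(n), area, floor, fee, balcony])
beta, *_ = np.linalg.lstsq(X, price, rcond=None)

pred = X @ beta
r2 = 1 - np.sum((price - pred) ** 2) / np.sum((price - price.mean()) ** 2)
print("coefficients:", np.round(beta, 2))
print("R^2:", round(float(r2), 3))
```

The coefficient of determination printed here plays the same role as the "degree of explanation" reported in the abstract.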

  • 178. Gustavsson, J
    et al.
    Näsman, P
    KTH, School of Architecture and the Built Environment (ABE), Centres, Centre for Transport Studies, CTS. KTH, School of Architecture and the Built Environment (ABE), Transport Science, Transport and Location Analysis.
    Some information about the activities at the Department of Statistics, Stockholm University, Sweden1990Conference paper (Other academic)
  • 179.
    Guterstam, Rasmus
    et al.
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Trojenborg, Vidar
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Exploring a personal property pricing method in insurance context using multiple regression analysis2019Independent thesis Basic level (degree of Bachelor), 10 credits / 15 HE creditsStudent thesis
    Abstract [en]

    In general, insurance companies and especially their clients face long and complicated claims processes, where payments rarely, and almost reluctantly, are made the same day. Part of this slow-moving procedure is the fact that in some cases the insurer has to value the personal property themselves, which can be a tedious process. In conjunction with the insurance company Hedvig, this project addresses this issue by examining a pricing model for a specific type of personal property: smartphones, one of the most commonly occurring claim types in the insurance context.

    Using multiple linear regression with data provided by PriceRunner, 10 key characteristics out of 91 were found to have significant explanatory power in predicting the market price of a smartphone. The model successfully simulates this market price with an explained variance of 90%. Furthermore, this thesis illustrates an intuitive example regarding pricing models for personal property of other sorts, identifying data availability and product complexity as the key limiting components.

  • 180.
    Habibi, Shiva
    et al.
    Chalmers Univ Technol, S-41296 Gothenburg, Sweden..
    Frejinger, Emma
    Univ Montreal, Dept Comp Sci & Operat Res, Montreal, PQ H3C 3J7, Canada..
    Sundberg, Marcus
    KTH, School of Architecture and the Built Environment (ABE), Urban Planning and Environment, System Analysis and Economics.
    An empirical study on aggregation of alternatives and its influence on prediction in car type choice models2019In: Transportation, ISSN 0049-4488, E-ISSN 1572-9435, Vol. 46, no 3, p. 563-582Article in journal (Refereed)
    Abstract [en]

    Assessing and predicting car type choices are important for policy analysis. Car type choice models are often based on aggregate alternatives, because analysts typically do not observe choices at the detailed level at which they are made. In this paper, we use registry data of all new car purchases in Sweden for two years, where cars are observed by their brand, model and fuel type. However, the choices are made at a more detailed level. Hence, an aggregate (observed) alternative can correspond to several disaggregate (detailed) alternatives. We present an extensive empirical study analyzing estimation results, in-sample and out-of-sample fit, as well as prediction performance of five model specifications. These models use different aggregation methods from the literature. We propose a specification of a two-level nested logit model that captures correlation between aggregate and disaggregate alternatives. The nest-specific scale parameters are defined as parameterized exponential functions to keep the number of parameters reasonable. The results show that the in-sample and out-of-sample fit as well as the prediction performance differ. The best model accounts for the heterogeneity over disaggregate alternatives as well as the correlation between both disaggregate and aggregate alternatives. It outperforms the commonly used aggregation method of simply including a size measure.

  • 181. Haimi, Antti
    et al.
    Wennman, Aron
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematics (Div.).
    A central limit theorem for fluctuations in polyanalytic Ginibre ensembles2017In: International mathematics research notices, ISSN 1073-7928, E-ISSN 1687-0247, Vol. rnx147Article in journal (Refereed)
    Abstract [en]

    We study fluctuations of linear statistics in polyanalytic Ginibre ensembles, a family of point processes describing planar free fermions in a uniform magnetic field at higher Landau levels. Our main result is asymptotic normality of fluctuations, extending a result of Rider and Virág. As in the analytic case, the variance is composed of independent terms from the bulk and the boundary. Our methods rely on a structural formula for polyanalytic polynomial Bergman kernels which separates out the different pure q-analytic kernels corresponding to different Landau levels. The fluctuations with respect to these pure q-analytic Ginibre ensembles are also studied, and a central limit theorem is proved. The results suggest a stabilizing effect on the variance when the different Landau levels are combined together.

  • 182.
    Halberg, Oscar
    et al.
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Wärmlös Helmrich, Mattias
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Operational Risk Modeling: Addressing the Reporting Threshold Problem2015Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE creditsStudent thesis
    Abstract [en]

    External loss data are typically left truncated at a reporting threshold. Ignoring this truncation level leads to biased capital charge estimations. This thesis addresses the challenges of recreating the truncated part of the distribution. By predicting the continuation of a probability density function, the unobserved body of an external operational risk loss distribution is estimated. The prediction is based on internally collected losses and the tail of the external loss distribution. Using a semiparametric approach to generate sets of internal losses and applying the Best Linear Unbiased Predictor results in an enriched external dataset that shares resemblance with the internal dataset. By avoiding parametric assumptions, this study proposes a new and unique way to address the reporting threshold problem. Financial institutions will benefit from these findings, as it permits the use of the semiparametric approach developed by Bolancé et al. (2012) and thereby eliminates the well-known difficulty of determining the breaking point beyond which the tail domain is defined when using the Loss Distribution Approach. The main conclusion from this thesis is that predicting the continuation of a function using the Best Linear Unbiased Predictor can be successfully applied in an operational risk setting. This thesis has predicted the continuation of a probability density function, resulting in a full external loss distribution.

  • 183.
    Hallberg, David
    et al.
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Optimization and Systems Theory.
    Renström, Erik
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Optimization and Systems Theory.
    PC Regression, Vector Autoregression, and Recurrent Neural Networks: How do they compare when predicting stock index returns for building efficient portfolios?2019Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE creditsStudent thesis
    Abstract [en]

    This thesis examines the statistical and economic performance of modeling and predicting equity index returns by application of various statistical models on a set of macroeconomic and financial variables. By combining linear principal component regression, vector autoregressive models, and LSTM neural networks, the authors find that while a majority of the models display high statistical significance, virtually none of them successfully outperform classic portfolio theory on efficient markets in terms of risk-adjusted returns. Several implications are also discussed based on the results.

  • 184.
    HALLGREN, FREDRIK
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.).
    On Prediction and Filtering of Stock Index Returns2011Independent thesis Advanced level (degree of Master (One Year)), 20 credits / 30 HE creditsStudent thesis
  • 185.
    Hallgren, Jonas
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Forecasting Ranking in Harness Racing using Probabilities Induced by Expected PositionsManuscript (preprint) (Other academic)
    Abstract [en]

    Ranked events are pivotal in many important AI applications such as question answering and recommendation systems. This paper studies ranked events in the setting of harness racing.

    For each horse there exists a probability distribution over its possible rankings. In the paper it is shown that a set of expected positions (and more generally, higher moments) for the horses induces this probability distribution.

    The main contribution of the paper is a method which extracts this induced probability distribution from a set of expected positions. An algorithm is proposed where the extraction of the induced distribution is given by the estimated expectations. MATLAB code is provided for the methodology.

    This approach gives freedom to model the horses in many different ways without the restrictions imposed by for instance logistic regression. To illustrate this point, we employ a neural network and ordinary ridge regression.

    The method is applied to predicting the distribution of the finishing positions for horses in harness racing. It outperforms both multinomial logistic regression and the market odds.

    The ease of use combined with fine results from the suggested approach constitutes a relevant addition to the increasingly important field of ranked events.

  • 186.
    Hallgren, Jonas
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Inference in Temporal Graphical Models2016Doctoral thesis, comprehensive summary (Other academic)
    Abstract [en]

    This thesis develops mathematical tools used to model and forecast different economic phenomena. The primary starting point is the temporal graphical model. Four main topics, all with applications in finance, are studied.

    The first two papers develop inference methods for networks of continuous time Markov processes, so called Continuous Time Bayesian Networks. Methodology for learning the structure of the network and for doing inference and simulation is developed. Further, models are developed for high frequency foreign exchange data.

    The third paper models growth of gross domestic product (GDP) which is observed at a very low frequency. This application is special and has several difficulties which are dealt with in a novel way using a framework developed in the paper. The framework is motivated using a temporal graphical model. The method is evaluated on US GDP growth with good results.

    The fourth paper studies inference in dynamic Bayesian networks using Monte Carlo methods. A new method for sampling random variables is proposed. The method divides the sample space into subspaces, which allows the sampling to be done in parallel with independent and distinct sampling methods on the subspaces. The methodology is demonstrated on a volatility model for stock prices and some toy examples with promising results.
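    The subspace idea in the fourth paper is reminiscent of stratified sampling. The following self-contained sketch (our construction for illustration, not the paper's algorithm) splits the sample space of a standard normal into equal-probability subspaces and samples each one independently:

```python
import random
from statistics import NormalDist

def stratified_estimate(g, n_strata=10, n_per=1000, seed=0):
    """Estimate E[g(X)] for X ~ N(0, 1) by dividing the sample space into
    n_strata equal-probability subspaces and sampling each independently
    (here via inverse-CDF sampling restricted to the subspace)."""
    rng = random.Random(seed)
    nd = NormalDist()
    total = 0.0
    for k in range(n_strata):
        s = 0.0
        for _ in range(n_per):
            # uniform draw inside stratum k's probability interval (k/n, (k+1)/n)
            u = (k + max(rng.random(), 1e-12)) / n_strata
            s += g(nd.inv_cdf(u))
        total += s / n_per
    return total / n_strata   # strata carry equal probability weight 1/n_strata

if __name__ == "__main__":
    print("estimate of E[X^2]:", round(stratified_estimate(lambda x: x * x), 3))
```

Because each subspace can use its own sampler, the per-stratum loops are independent and could run in parallel, which is the feature the paper exploits.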

    The fifth paper develops an algorithm for learning the full distribution in a harness race, a ranked event. It is demonstrated that the proposed methodology outperforms logistic regression which is the main competitor. It also outperforms the market odds in terms of accuracy.

  • 187.
    Hallgren, Jonas
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Nowcasting with dynamic maskingManuscript (preprint) (Other academic)
    Abstract [en]

    Nowcasting consists of tracking GDP by focusing on the data flow of timely available releases. The essential features of a nowcasting data set are differing sampling frequencies and the ragged edge: the differing patterns of missing observations at the end of the sample due to the non-synchronicity of data publications. Various econometric frameworks have been proposed to deal with these characteristics. During a sequence of subsequent nowcast occasions, the models are traditionally re-estimated, or updated, on an expanding data set as more and more data becomes available. This paper proposes to take the ragged-edge structure into account when estimating a nowcast model. Instead of using all available historical data, it is here proposed to first mask the historical data so as to reflect the pattern of data availability at the specific nowcast occasion. Since each nowcast occasion exhibits its own specific ragged-edge structure, we propose to re-estimate or recalibrate the model at each juncture employing the accompanying mask, hence dynamic masking. Dynamic masking thus tailors the model to the specific nowcast occasion. It is shown how tailoring improves precision by employing ridge regressions with and without masking in a real-time nowcasting back-test.

    The masking approach is motivated by theory and demonstrated on real data. It surpasses the dynamic factor model in backtests. Dynamic masking gives ease of implementation, a solid theoretical foundation, flexibility in modeling, and encouraging results; we therefore consider it a relevant addition to the nowcasting methodology.
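    A schematic of the recalibration step (our sketch; the variable names, the zero-masking convention, and the synthetic data are assumptions, and the paper's actual model is richer):

```python
import numpy as np

def ridge_fit(X, y, lam):
    """Closed-form ridge estimate: argmin ||X w - y||^2 + lam ||w||^2."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

def masked_nowcast(X_hist, y_hist, x_now, avail, lam=1.0):
    """Recalibrate at one nowcast occasion: mask, in the historical design
    matrix, the predictors that are not yet published today (the ragged
    edge), so the fitted model matches today's data-availability pattern."""
    mask = avail.astype(float)
    w = ridge_fit(X_hist * mask, y_hist, lam)   # mask broadcasts over rows
    return float((x_now * mask) @ w)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(120, 3))               # 120 past months, 3 indicators
    y = X @ np.array([1.0, 0.5, -0.3]) + rng.normal(0, 0.1, 120)
    x_now = rng.normal(size=3)
    avail = np.array([True, True, False])       # third release not out yet
    print("nowcast:", round(masked_nowcast(X, y, x_now, avail), 3))
```

At the next occasion, `avail` changes and the model is refitted with the new mask, which is the "dynamic" part of dynamic masking.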

  • 188.
    Hallgren, Jonas
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Structure Learning and Mixed Radix representation in Continuous Time Bayesian NetworksManuscript (preprint) (Other academic)
    Abstract [en]

    Continuous time Bayesian networks (CTBNs) are graphical representations of the dependence structures between continuous time random processes with finite state spaces. We propose a method for learning the structure of a CTBN using a causality measure based on Kullback-Leibler divergence. We introduce the causality matrix, which can be seen as a generalized version of the covariance matrix. We give a mixed radix representation of the process that greatly facilitates learning and simulation. A new graphical model for tick-by-tick financial data is proposed and estimated. Our approach shows encouraging results on both the tick data and a simulated example.

  • 189.
    Hamdi, Ali
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    On the Snell envelope approach to optimal switching and pricing Bermudan options2011Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    This thesis consists of two papers related to systems of Snell envelopes. The first paper uses a system of Snell envelopes to formulate the problem of two-modes optimal switching for the full balance sheet in finite horizon. This means that the switching problem is formulated in terms of trade-off strategies between expected profit and cost yields, which act as obstacles to each other. Existence of a minimal solution of this system is obtained by using an approximation scheme. Furthermore, the optimal switching strategies are fully characterized.

    The second paper uses the Snell envelope to formulate the fair price of Bermudan options. To evaluate this formulation of the price, the optimal stopping strategy for such a contract must be estimated. This may be done recursively if some method of estimating conditional expectations is available. The paper focuses on nonparametric estimation of such expectations by using regularization of a least-squares minimization, with a Tikhonov-type smoothing put on the partial differential equation which characterizes the underlying price processes. This approach can hence be viewed as a combination of the Monte Carlo method and the PDE method for the estimation of conditional expectations. The estimation method turns out to be robust with regard to the size of the smoothing parameter.

  • 190.
    Hamdi, Ali
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    PDE-regularization for pricing multi-dimensional Bermudan options with Monte Carlo simulationManuscript (preprint) (Other academic)
    Abstract [en]

    This paper considers the problem of pricing multi-dimensional Bermudan derivatives using Monte Carlo simulation. A new method for computing conditional expectations is proposed, which combined with the dynamic programming principle provides a way of pricing the derivatives.

    The method is a non-parametric projection with regularization. The regularization penalizes deviations from the PDE that the true conditional expectation satisfies. The point is that it is less costly to compute the norm of the PDE than to solve it, thus avoiding the curse of dimensionality.

    The method is shown to produce accurate numerical results in multi-dimensional settings, given a good choice of the regularization parameter. It is illustrated with the multi-dimensional Black-Scholes model and compared to the Longstaff-Schwartz approach.

  • 191.
    Hamdi, Ali
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Some aspects of optimal switching and pricing Bermudan options2013Doctoral thesis, comprehensive summary (Other academic)
    Abstract [en]

    This thesis consists of four papers that are all related to the Snell envelope. In the first paper, the Snell envelope is used as a formulation of a two-modes optimal switching problem. The obstacles are interconnected, take both profit and cost yields into account, and switching is based on both sides of the balance sheet. The main result is a proof of existence of a continuous minimal solution to a system of Snell envelopes, which fully characterizes the optimal switching strategy. A counter-example is provided to show that uniqueness does not hold.

    The second paper considers the problem of having a large number of production lines with two modes of production, high-production and low-production. As in the first paper, we consider both expected profit and cost yields and switching based on both sides of the balance sheet. The production lines are assumed to be interconnected through a coupling term, which is the average optimal expected yields. The corresponding system of Snell envelopes is highly complex, so we consider the aggregated yields where a mean-field approximation is used for the coupling term. The main result is a proof of existence of a continuous minimal solution to a system of Snell envelopes, which fully characterizes the optimal switching strategy. Furthermore, existence and uniqueness is proven for the mean-field reflected backward stochastic differential equations (MF-RBSDEs) we consider, a comparison theorem and a uniform bound for the MF-RBSDEs is provided.

    The third paper concerns pricing of Bermudan type options. The Snell envelope is used as a representation of the price, which is determined using Monte Carlo simulation combined with the dynamic programming principle. For this approach, it is necessary to estimate the conditional expectation of the future optimally exercised payoff. We formulate a projection on a grid which is ill-posed due to overfitting, and regularize with the PDE which characterizes the underlying process. The method is illustrated with numerical examples, where accurate results are demonstrated in one dimension.

    In the fourth paper, the idea of the third paper is extended to the multi-dimensional setting. This is necessary because in one dimension it is more efficient to solve the PDE than to use Monte Carlo simulation. We relax the use of a grid in the projection, and add local weights for stability. Using the multi-dimensional Black-Scholes model, the method is illustrated in settings ranging from one to 30 dimensions. The method is shown to produce accurate results in all examples, given a good choice of the regularization parameter.

  • 192.
    Hamdi, Ali
    et al.
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Marcus, Mårten
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Pricing Bermudan options: A nonparametric estimation approachManuscript (preprint) (Other academic)
    Abstract [en]

    A nonparametric alternative to the Longstaff-Schwartz estimation of conditional expectations is suggested for pricing of Bermudan options. The method is based on regularization of a least-squares minimization, with a Tikhonov-type smoothing put on the partial differential equation which characterizes the underlying price processes. This approach can hence be viewed as a combination of the Monte Carlo method and the PDE method for the estimation of conditional expectations. The estimation method turns out to be robust with regard to the size of the smoothing parameter.

  • 193.
    Hansson, Fredrik
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    A pricing and performance study on auto-callable structured products2012Independent thesis Advanced level (degree of Master (One Year)), 20 credits / 30 HE creditsStudent thesis
    Abstract [en]

    We propose an algorithm to price and analyze the performance of auto-callable structured financial products. The algorithm uses Monte Carlo simulations to reproduce, as faithfully as possible, a future product. This model is then compared to other, previously presented models. A study of the different input parameters, together with their time dependency, is then performed to evaluate what one might expect when investing in these products. Numerical results conclude that the risks taken by the investor closely reflect the potential return for each product. When constructing these products for the near future, one must closely evaluate the demand from the investors, i.e. evaluate the level of risk that the investors are willing to take.
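    A Monte Carlo pricer for an auto-callable note can be sketched as follows. The product terms here (a single autocall barrier, a fixed coupon per observation date, plain redemption at maturity if never called, no downside barrier) are illustrative assumptions, not the thesis's actual product specification.

```python
import math
import random

def autocall_price(s0, barrier, coupon, notional, r, sigma, obs_times, n_paths, seed=0):
    """Monte Carlo sketch of an auto-callable note: at each observation
    date the note redeems early at notional plus accrued coupons if the
    underlying is at or above the autocall barrier; otherwise the
    notional is repaid at the final date. Underlying follows GBM."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_paths):
        s, t_prev, payoff = s0, 0.0, None
        for i, t in enumerate(obs_times, start=1):
            dt = t - t_prev
            z = rng.gauss(0.0, 1.0)
            s *= math.exp((r - 0.5 * sigma**2) * dt + sigma * math.sqrt(dt) * z)
            t_prev = t
            if s >= barrier:  # autocall triggered: pay and stop the path
                payoff = (notional + i * coupon) * math.exp(-r * t)
                break
        if payoff is None:  # never called: redeem notional at maturity
            payoff = notional * math.exp(-r * obs_times[-1])
        total += payoff
    return total / n_paths
```

The simulated price is bounded below by the discounted notional and above by the best-case coupon payoff, which gives a simple plausibility check.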

  • 194.
    Hansén, Jacob
    et al.
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Gustafsson, Axel
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    A Study on Comparison Websites in the Airline Industry and Using CART Methods to Determine Key Parameters in Flight Search Conversion2019Independent thesis Basic level (degree of Bachelor), 10 credits / 15 HE creditsStudent thesis
    Abstract [en]

    This bachelor thesis in applied mathematics and industrial engineering and management aimed to identify relationships between search parameters in flight comparison search engines and the exit conversion rate, while also investigating how the emergence of such comparison search engines has impacted the airline industry. To identify such relationships, several classification models were employed in conjunction with several sampling methods to produce a predictive model using the program R. To investigate the impact of the emergence of comparison websites, Porter's five forces and a SWOT analysis were employed to analyze the findings of a literature study and a qualitative interview. The classification models developed performed poorly with regard to several assessment metrics, which suggested that there was little to no significance in the relationship between the investigated search parameters and the exit conversion rate. Porter's five forces and the SWOT analysis suggested that the competitive landscape of the airline industry has become more competitive and that airlines which do not manage to adapt to this changing market environment will experience decreasing profitability.
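    The core of a CART model is the search for the split that most reduces node impurity. A minimal single-feature version of that search (the thesis used full CART models in R; this is only the split criterion, with Gini impurity for binary labels) can be written as:

```python
def best_split(xs, ys):
    """CART-style search for the best binary split of one numeric
    feature: try each midpoint threshold and keep the one minimizing
    the weighted Gini impurity of the two child nodes.
    ys are binary labels (0/1). Returns (threshold, impurity)."""
    def gini(labels):
        if not labels:
            return 0.0
        p = sum(labels) / len(labels)
        return 2.0 * p * (1.0 - p)  # Gini impurity for a binary node
    pairs = sorted(zip(xs, ys))
    best = (None, float("inf"))
    for i in range(1, len(pairs)):
        thr = (pairs[i - 1][0] + pairs[i][0]) / 2.0  # midpoint candidate
        left = [y for x, y in pairs if x <= thr]
        right = [y for x, y in pairs if x > thr]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(pairs)
        if score < best[1]:
            best = (thr, score)
    return best
```

A full tree repeats this search recursively on each child node until a stopping rule is met.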

  • 195.
    Hansén, Rasmus
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Allocation of Risk Capital to Contracts in Catastrophe Reinsurance2013Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE creditsStudent thesis
    Abstract [en]

    This thesis is the result of a project aimed at developing a tool for allocation of risk capital in catastrophe excess-of-loss reinsurance. Allocation of risk capital is an important tool for measuring portfolio performance and optimizing the capital requirement. Here, two allocation rules are described and analyzed: Euler allocation and capital layer allocation. The rules are applied to two different portfolios. The main conclusion is that the two methods can be used together to get a better picture of how the dependence structure between the contracts affects the portfolio result. It is also illustrated how the RORAC of one of the portfolios can be increased by 1% using the outcome from the analyses.
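    Euler allocation has a particularly clean form when the risk measure is the standard deviation of portfolio losses: each contract's contribution is its covariance with the portfolio divided by the portfolio's standard deviation, and the contributions sum exactly to the total risk. A sketch on simulated loss samples (illustrative data layout, not the thesis's portfolios):

```python
import statistics

def euler_allocation(contract_losses):
    """Euler allocation for a standard-deviation risk measure.
    contract_losses: one list of simulated losses per contract, all of
    equal length. Each contract's capital is cov(X_i, S) / std(S), and
    by Euler's theorem for this positively homogeneous risk measure the
    contributions sum exactly to the portfolio risk std(S)."""
    n = len(contract_losses[0])
    portfolio = [sum(xs[k] for xs in contract_losses) for k in range(n)]
    mu_s = statistics.fmean(portfolio)
    var_s = sum((s - mu_s) ** 2 for s in portfolio) / (n - 1)
    std_s = var_s ** 0.5
    allocs = []
    for xs in contract_losses:
        mu_x = statistics.fmean(xs)
        cov = sum((x - mu_x) * (s - mu_s) for x, s in zip(xs, portfolio)) / (n - 1)
        allocs.append(cov / std_s)
    return allocs, std_s
```

The full-allocation property (contributions summing to the portfolio risk) is what makes Euler allocation suitable for performance measures such as RORAC.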

  • 196.
    Hedblom, Edvin
    et al.
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Åkerblom, Rasmus
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Debt recovery prediction in securitized non-performing loans using machine learning2019Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE creditsStudent thesis
    Abstract [en]

    Credit scoring using machine learning has been gaining attention within the research field in recent decades and it is widely used in the financial sector today. Studies covering binary credit scoring of securitized non-performing loans are however very scarce. This paper is using random forest and artificial neural networks to predict debt recovery for such portfolios. As a performance benchmark, logistic regression is used. Due to the nature of high imbalance between the classes, the performance is evaluated mainly on the area under both the receiver operating characteristic curve and the precision-recall curve. This paper shows that random forest, artificial neural networks and logistic regression have similar performance. They all indicate an overall satisfactory ability to predict debt recovery and hold potential to be implemented in day-to-day business related to non-performing loans.
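    The area under the ROC curve used for evaluation above has a direct probabilistic reading that makes it robust to class imbalance: it equals the probability that a randomly chosen positive case receives a higher score than a randomly chosen negative one. A minimal sketch of that rank-based computation (the thesis presumably used standard library implementations; this is only the definition made executable):

```python
def roc_auc(labels, scores):
    """ROC AUC computed as the normalized Mann-Whitney U statistic:
    the fraction of (positive, negative) pairs where the positive is
    scored higher, with ties counting one half."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > q) + 0.5 * (p == q) for p in pos for q in neg)
    return wins / (len(pos) * len(neg))
```

An AUC of 1.0 means perfect ranking and 0.5 means the scores carry no ranking information, regardless of how rare the positive class is.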


  • 197.
    Hededal Klincov, Lazar
    et al.
    KTH, School of Technology and Health (STH), Medical Engineering, Computer and Electronic Engineering.
    Symeri, Ali
    KTH, School of Technology and Health (STH), Medical Engineering, Computer and Electronic Engineering.
    Devising a Trend-break-detection Algorithm of stored Key Performance Indicators for Telecom Equipment2017Independent thesis Basic level (university diploma), 10 credits / 15 HE creditsStudent thesis
    Abstract [en]

    A problem that is prevalent for testers at Ericsson is that performance test results are continuously generated but not analyzed. The time between the occurrence of a problem and information about the occurrence is long and variable. This is due to the manual analysis of log files, which is time consuming and tedious. The requested solution is automation with an algorithm that analyzes the performance and notifies when problems occur. A binary classifier algorithm, based on statistical methods, was developed and evaluated as a solution to the stated problem. The algorithm was evaluated with simulated data and detected trend breaks with an accuracy of 97.54 %. Furthermore, correlation analysis was carried out between performance and hardware to gain insights into how hardware configurations affect test runs.
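    One simple statistical scheme for this kind of binary trend-break classification (an illustrative sketch, not necessarily the algorithm developed in the thesis) is to estimate mean and spread from a baseline window and flag the first observation that deviates by more than a fixed number of standard deviations:

```python
import statistics

def detect_trend_break(series, baseline, threshold=3.0):
    """Binary-classifier sketch for trend breaks in a KPI series: fit
    mean and sample standard deviation on the first `baseline` points,
    then flag the index of the first later value deviating by more than
    `threshold` standard deviations. Returns None if no break is found."""
    base = series[:baseline]
    mu = statistics.fmean(base)
    sd = statistics.stdev(base)
    for i, x in enumerate(series[baseline:], start=baseline):
        if abs(x - mu) > threshold * sd:
            return i
    return None
```

The threshold trades false alarms against missed breaks, which is the balance an accuracy figure like the one above summarizes.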

  • 198.
    Hedman, Molly
    et al.
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Lind, Hanna
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Den psykiska ohälsan i Sverige2019Independent thesis Basic level (degree of Bachelor), 10 credits / 15 HE creditsStudent thesis
    Abstract [en]

    Mental health problems have increased in Sweden, which besides the personal suffering affects both society and the economy. The increase has no definite explanation, but the answer may, at least partly, be found in macroeconomic and socioeconomic factors. This report therefore investigates whether there exists a relationship between mental health problems and macroeconomic and socioeconomic factors. An analysis of how these factors may explain the increase in mental health problems is also performed. To see if a relationship exists, a multivariable regression analysis is performed, where the dependent variable is defined as severe problems with anxiety and worry. The regression variables are education level, GDP per capita, households' disposable income and unemployment. The analysis is performed on three groups: women, men and the total population, and the data is collected over the years 2002 to 2017.

    The analysis indicates a certain relationship between the macroeconomic and socioeconomic variables and mental health problems. For the total population, education level is the most significant variable. For women, education level, GDP per capita and households' disposable income are most important. For men, unemployment and disposable income are the most strongly correlated variables.

    The models approximately fulfill the assumptions of the least squares method but exhibit multicollinearity, which makes them less reliable. Further research is needed to validate these relationships and to contribute to explanations of potential causality.
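    The least squares fit underlying such a regression has a closed form in the one-regressor case. This sketch is a deliberately simplified single-variable version of the multivariable analysis described above, with made-up variable names:

```python
import statistics

def ols_simple(xs, ys):
    """Ordinary least squares with one regressor and an intercept:
    slope = cov(x, y) / var(x), intercept = mean(y) - slope * mean(x).
    The multivariable case generalizes this via the normal equations."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return my - slope * mx, slope
```

Multicollinearity, as noted above, arises in the multivariable generalization when regressors are nearly linearly dependent, making the normal equations ill-conditioned and the coefficient estimates unstable.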

  • 199.
    Heimbürger, Hjalmar
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Modelling of Stochastic Volatility using Partially Observed Markov Models2016Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE creditsStudent thesis
    Abstract [en]

    In this thesis, calibration of stochastic volatility models that allow correlation between the volatility and the returns has been considered. To achieve this, the dynamics have been modelled as an extension of hidden Markov models and a special case of partially observed Markov models. This thesis shows that such models can be calibrated using sequential Monte Carlo methods, and that a model with correlation provides a better fit to the observed data. However, the results are not conclusive and more research is needed in order to confirm this for other data sets and models.
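    The sequential Monte Carlo machinery can be illustrated with a bootstrap particle filter for a basic stochastic-volatility model without the return-volatility correlation studied in the thesis; the model and parameters here are simplified assumptions for illustration only.

```python
import math
import random

def particle_filter_sv(returns, phi, sigma_v, n_particles=500, seed=0):
    """Bootstrap particle filter for a simple stochastic-volatility model:
      x_t = phi * x_{t-1} + sigma_v * eta_t   (latent log-variance)
      y_t = exp(x_t / 2) * eps_t              (observed return)
    with eta_t, eps_t ~ N(0, 1) independent. Returns the filtered mean
    log-variance at each time step."""
    rng = random.Random(seed)
    stat_sd = sigma_v / math.sqrt(1.0 - phi * phi)  # stationary prior
    particles = [rng.gauss(0.0, stat_sd) for _ in range(n_particles)]
    filtered = []
    for y in returns:
        # propagate each particle through the state equation
        particles = [phi * x + sigma_v * rng.gauss(0.0, 1.0) for x in particles]
        # weight by the observation density y ~ N(0, exp(x)), up to a constant
        weights = [math.exp(-0.5 * (x + y * y * math.exp(-x))) for x in particles]
        total = sum(weights)
        filtered.append(sum(w * x for w, x in zip(weights, particles)) / total)
        # multinomial resampling to avoid weight degeneracy
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return filtered
```

Calibration then typically wraps such a filter in a likelihood evaluation that is maximized, or sampled, over the model parameters.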

  • 200.
    Hellander, Martin
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics.
    Credit Value Adjustment: The Aspects of Pricing Counterparty Credit Risk on Interest Rate Swaps2015Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE creditsStudent thesis
    Abstract [en]

    In this thesis, the pricing of counterparty credit risk on an OTC plain vanilla interest rate swap is investigated. Counterparty credit risk can be defined as the risk that a counterparty in a financial contract might not be able or willing to fulfil its obligations. This risk has to be taken into account in the valuation of an OTC derivative. The market price of the counterparty credit risk is known as the Credit Value Adjustment (CVA). In a bilateral contract, such as a swap, a party's own creditworthiness also has to be taken into account, leading to another adjustment known as the Debit Value Adjustment (DVA). Since 2013, the international accounting standards (IFRS) state that these adjustments have to be made in order to reflect the fair value of an OTC derivative.

    A short background and the derivation of CVA and DVA are presented, including related topics like various risk mitigation techniques, hedging of CVA, regulations etc. Four different pricing frameworks are compared: two more sophisticated frameworks and two approximative approaches. The most complex framework includes an interest rate model in the form of the LIBOR Market Model and a credit model in the form of the Cox-Ingersoll-Ross model. In this framework, the impact of dependencies between credit and market risk factors (leading to wrong-way/right-way risk) and the dependence between the default times of different parties are investigated.
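    The standard textbook approximation of unilateral CVA, which underlies the simpler frameworks mentioned above, discretizes the exposure profile over a time grid. A minimal sketch under the common independence assumption (no wrong-way risk) and a flat hazard rate, with illustrative inputs:

```python
import math

def cva(expected_exposures, times, hazard, recovery, r):
    """Unilateral CVA under the independence approximation:
      CVA = (1 - R) * sum_i DF(t_i) * EE(t_i) * [S(t_{i-1}) - S(t_i)]
    where S(t) = exp(-hazard * t) is the survival probability under a
    flat hazard rate, DF is the risk-free discount factor and EE the
    expected positive exposure on the grid."""
    total, surv_prev = 0.0, 1.0
    for ee, t in zip(expected_exposures, times):
        surv = math.exp(-hazard * t)
        pd_i = surv_prev - surv  # default probability in (t_{i-1}, t_i]
        total += (1.0 - recovery) * math.exp(-r * t) * ee * pd_i
        surv_prev = surv
    return total
```

The more sophisticated frameworks replace the constant exposure and hazard inputs with simulated exposure profiles and a stochastic credit model, which is where wrong-way risk can enter.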
