Permanent link to result list:
http://kth.diva-portal.org/smash/resultList.jsf?query=&language=en&searchType=SIMPLE&noOfRows=50&sortOrder=author_sort_asc&sortOrder2=title_sort_asc&onlyFullText=false&sf=all&aq=%5B%5B%7B%22personId%22%3A%22u13cu9mc%22%7D%5D%5D&aqe=%5B%5D&aq2=%5B%5B%5D%5D&af=%5B%5D


1. Ahmad, M. Rauf; Pavlenko, Tatjana (KTH, School of Engineering Sciences (SCI), Mathematics (Dept.)). A U-classifier for high-dimensional data under non-normality. 2018. In: Journal of Multivariate Analysis, ISSN 0047-259X, E-ISSN 1095-7243, Vol. 167, p. 269-283. Article in journal (Refereed).

Abstract [en]: A classifier for two or more samples is proposed when the data are high-dimensional and the distributions may be non-normal. The classifier is constructed as a linear combination of two easily computable and interpretable components, the U-component and the P-component. The U-component is a linear combination of U-statistics of bilinear forms of pairwise distinct vectors from independent samples. The P-component, the discriminant score, is a function of the projection of the U-component on the observation to be classified. Together, the two components constitute an inherently bias-adjusted classifier valid for high-dimensional data. The classifier is linear, but its linearity does not rest on the assumption of homoscedasticity. Properties of the classifier and its normal limit are given under mild conditions. Misclassification errors and asymptotic properties of their empirical counterparts are discussed. Simulation results are used to show the accuracy of the proposed classifier for small or moderate sample sizes and large dimensions. Applications involving real data sets are also included.

2. Appelberg, J.; Janson, C.; Lindberg, E.; Pavlenko, Tatjana (Department of Statistics, Stockholm University, Stockholm, Sweden); Hedenstierna, G. Lung aeration during sleep in patients with obstructive sleep apnoea. 2010. In: Clinical Physiology and Functional Imaging, ISSN 1475-0961, E-ISSN 1475-097X, Vol. 30, no 4, p. 301-307. Article in journal (Refereed).

Abstract [en]: Background: Previous studies have indicated that patients with obstructive sleep apnoea (OSA) have altered ventilation and lung volumes while awake, and the results suggest that this may be a determinant of the severity of desaturations during sleep. However, little is known about regional lung aeration during sleep in patients with OSA. Methods: Twelve patients with OSA were included in the study. Computed tomography was used to study regional lung aeration during wakefulness and sleep. Lung aeration was calculated in ml gas/g lung tissue in four different regions of interest (ROI 1-4), along the border of the lung from ventral to dorsal. Results: Lung aeration in the dorsal (dependent) lung region (ROI 4) was lower during sleep compared to wakefulness: 0.78 ± 0.19 versus 0.88 ± 0.19 (mean ± SD) ml gas/g lung tissue (P = 0.005). Associations were found between awake expiratory reserve volume and the change in lung aeration from wakefulness to sleep in ROI 4 (r = -0.69; P = 0.012). In addition, the change in lung aeration in the dorsal region correlated with sleep time (r = 0.69; P = 0.014) but not with time in the supine position. The difference in lung aeration between inspiration and expiration (i.e. ventilation) was larger in the ventral lung region when expressed as ml gas per g lung tissue. In two patients it was noted that, during ongoing obstructive apnoea, lung aeration tended to be increased rather than decreased. Conclusions: Aeration in the dorsal lung region is reduced during sleep in patients with OSA. The decrease is related to lung volume awake and to sleep time.

3. Appelberg, Jonas; Pavlenko, Tatjana (Mid Sweden University); Bergman, Henrik; Rothen, H.; Hedenstierna, Göran. Lung aeration during sleep. 2007. In: Chest, ISSN 0012-3692, E-ISSN 1931-3543, Vol. 131, no 1, p. 122-129. Article in journal (Refereed).

Abstract [en]: Background: During sleep, ventilation and functional residual capacity (FRC) decrease slightly. This study addresses regional lung aeration during wakefulness and sleep. Methods: Ten healthy subjects underwent spirometry awake and with polysomnography, including pulse oximetry, and also CT when awake and during sleep. Lung aeration in different lung regions was analyzed. Another three subjects were studied awake to develop a protocol for dynamic CT scanning during breathing. Results: Aeration in the dorsal, dependent lung region decreased from a mean of 1.14 ± 0.34 mL (± SD) of gas per gram of lung tissue during wakefulness to 1.04 ± 0.29 mL/g during non-rapid eye movement (NREM) sleep (-9%) [p = 0.034]. In contrast, aeration increased in the most ventral, nondependent lung region, from 3.52 ± 0.77 to 3.73 ± 0.83 mL/g (+6%) [p = 0.007]. In one subject studied during rapid eye movement (REM) sleep, aeration decreased from 0.84 to 0.65 mL/g (-23%). The fall in dorsal lung aeration during sleep correlated with awake FRC (R² = 0.60; p = 0.008). Airway closure, measured awake, occurred near and sometimes above the FRC level. Ventilation tended to be larger in dependent, dorsal lung regions, both awake and during sleep (upper region vs lower region, 3.8% vs 4.9% awake, p = 0.16, and 4.5% vs 5.5% asleep, p = 0.09, respectively). Conclusions: Aeration is reduced in dependent lung regions and increased in ventral regions during NREM and REM sleep. Ventilation was more uniformly distributed between upper and lower lung regions than has previously been reported in awake, upright subjects. Reduced respiratory muscle tone and airway closure are likely causative factors.

4. Corander, J.; Koski, Timo (KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics); Pavlenko, Tatjana (KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics); Tillander, A. Bayesian block-diagonal predictive classifier for Gaussian data. 2013. In: Synergies of Soft Computing and Statistics for Intelligent Data Analysis, Springer, 2013, p. 543-551. Conference paper (Refereed).

Abstract [en]: The paper presents a method for constructing a Bayesian predictive classifier in a high-dimensional setting. Given that classes are represented by Gaussian distributions with a block-structured covariance matrix, a closed-form expression for the posterior predictive distribution of the data is established. Due to the factorization of this distribution, the resulting Bayesian predictive and marginal classifier provides an efficient solution to the high-dimensional problem by splitting it into smaller tractable problems. In a simulation study we show that the suggested classifier outperforms several alternative algorithms, such as linear discriminant analysis based on block-wise inverse covariance estimators and the shrunken centroids regularized discriminant analysis.
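The computational payoff of the block-diagonal assumption can be illustrated with a minimal plug-in sketch (this is not the paper's Bayesian predictive rule): under a block-diagonal covariance, a Gaussian discriminant score factors into a sum of small per-block scores. The function name, the two-block partition, and the identity-covariance setup below are all illustrative assumptions.

```python
import numpy as np

def block_lda_score(x, mean, block_cov_invs, blocks):
    """Gaussian discriminant score under an assumed block-diagonal covariance.

    blocks: list of index arrays partitioning the features. The score
    factors into a sum of small per-block quadratic terms, which is what
    makes the high-dimensional problem split into tractable pieces.
    """
    score = 0.0
    for idx, cov_inv in zip(blocks, block_cov_invs):
        d = x[idx] - mean[idx]
        score += -0.5 * d @ cov_inv @ d  # per-block Gaussian log-density core
    return score

rng = np.random.default_rng(0)
blocks = [np.arange(0, 3), np.arange(3, 6)]
mean0, mean1 = np.zeros(6), np.full(6, 1.5)
cov_invs = [np.eye(3), np.eye(3)]  # identity blocks, purely for illustration

x = mean1 + 0.1 * rng.standard_normal(6)  # a point drawn near class 1
s0 = block_lda_score(x, mean0, cov_invs, blocks)
s1 = block_lda_score(x, mean1, cov_invs, blocks)
label = int(s1 > s0)  # assign to the class with the larger score
```

Each block contributes independently to the score, so the cost of the p-dimensional problem reduces to the cost of the largest block.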

5. Dahmoun, Marju; Ödmark, Inga-Stina; Risberg, Björn; Karlsson, Mats; Pavlenko, Tatjana (Mid Sweden University); Bäckström, Torbjörn. Apoptosis, proliferation, and sex steroid receptors in postmenopausal endometrium before and during HRT. 2004. In: Maturitas, ISSN 0378-5122, E-ISSN 1873-4111, Vol. 49, no 2, p. 114-123. Article in journal (Refereed).

Abstract [en]: Objectives: Endometrial homeostasis, indicated as the balance between apoptosis and proliferation, was studied with regard to endometrial safety and bleeding disturbances. Materials and methods: The quantitatively sufficient endometrial biopsies of 92 postmenopausal women enrolled in the study were investigated. The participants were divided into two groups, each receiving a continuous combined HRT regimen with either conjugated estrogen (CE) 0.625 mg + 5 mg medroxyprogesterone acetate (MPA) (CE/MPA) or 17-beta-estradiol (E2) 2 mg + 1 mg norethisterone acetate (NETA) (E2/NETA). These were evaluated according to the apoptotic index (Ai) and the proliferation marker Ki-67 index. Estrogen receptor alpha (ER) and progesterone receptor (PR) expression were also monitored, as well as endometrial thickness. Quantitative in situ techniques were used. Results: Ai and Ki-67 index were unchanged in epithelial glands of the endometrium from baseline to the second biopsy obtained after 1 year of combined continuous HRT. In stromal tissue, Ki-67 index was increased, while Ai remained at the same level. PR expression in both epithelium and stroma was unchanged. Endometrial thickness was unaffected during therapy, and the histopathological evaluation showed no development of hyperplasia or carcinoma. Conclusions: The unaffected homeostasis in endometrial epithelium contributes to endometrial safety and is in accordance with the histopathological findings of no hyperplasia. The homeostasis of the stroma was transformed to be more proliferative. Increased stromal proliferation may be of importance for stromal support of the veins and for decreasing breakthrough bleeding during HRT. The increased stromal proliferation, as well as the decreased ER expression in both epithelium and stroma, could be an effect of progesterone.

6. Fomina, Svitlana; Pavlenko, Tatjana (KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics); Englund, Erling; Bagdasarova, Ingretta. Clinical course of steroid sensitive nephrotic syndrome in children: outcome and outlook. 2011. In: The Open Pediatric Medicine Journal, ISSN 1874-3099, Vol. 5, p. 18-28. Article in journal (Refereed).

Abstract [en]: Introduction: The aim of our study was to investigate the relative efficiency and adverse effects of various treatments of steroid sensitive nephrotic syndrome (SSNS) in children, and to determine factors associated with relapse risk in these patients. Materials and method: We retrospectively studied the data from 690 SSNS children treated in a referral center over 25 years. The analyzed treatment protocols were: Prednisolone (PRED, eight weeks at a dose of 1.5-2.0 mg/kg, then tapered and given for 9-12 months), Chlorambucil (CHL, cumulative dose 28.5-30 mg/kg), Cyclophosphamide intravenously (CYC I.V., cumulative dose of 30-36 mg/kg, then a supporting dose of CHL, cumulative dose of 20-25 mg/kg) and intramuscularly (CYC I.M., cumulative dose of 120-150 mg/kg). The alkylating agents were used after remission induction by PRED and under its protection. Results: Cumulative relapse-free survival was 81.9%, 69.0% and 64.5% after 12, 36 and 60 months, respectively. In multivariate analyses, relapse risk was associated with age at treatment (<6 years), and with both PRED and CYC I.V. The only predictive factor for early relapse was PRED, unlike the group with two or more relapses, where PRED and CYC I.V. as well as age from 3 to 6 years were highly prognostic. The high probability of sustained remission in combination with relatively mild adverse effects was observed for PRED used at the first episode and CHL used at relapse. Conclusion: To summarize, our protocols, characterized by prolonged PRED and CHL, demonstrated promising results and should be considered as an efficient alternative strategy in SSNS management.

7. Fomina, Svitlana; Pavlenko, Tatjana; Englund, Erling; Bagdasarova, Ingretta. Clinical patterns and renal survival of nephrotic syndrome in childhood: a single-center study (1980-2006). 2010. In: The open urology & nephrology journal, ISSN 1874-303X, Vol. 3, p. 8-15. Article in journal (Refereed).

Abstract [en]: To investigate changes in the diagnostic patterns, disease profiles, courses and therapeutic strategies for severe forms of childhood nephrotic syndrome (NS), the clinical features of 1 349 children treated during two consecutive time periods, 1980-2000 (n=1 162) and 2001-2006 (n=187), were retrospectively reviewed. A significant increase in initial renal impairment, NS with hypertension, and NS with hypertension and hematuria was observed (27.7% vs 51.3%, 1.0% vs 5.3% and 16.4% vs 21.9%, respectively). The rate of both secondary steroid resistance (SR) and Focal Segmental Glomerulosclerosis increased significantly (1.8% vs 5.6%, p=0.032, and 14.9% vs 29.0%, p=0.034, respectively). Initial renal insufficiency and hypertension were highly predictive of the development of stage 3 chronic kidney disease (CKD3) among SR patients in a multivariate Cox regression (p=0.001) for the years 1980-2000. A higher hazard of CKD3 in male SR patients from three to six years old was observed in 2001-2006. Kaplan-Meier survival curves revealed a shift in the cumulative probability of CKD3, indicating a slower decline of renal function for SR NS in the years 2001-2006 (p=0.008): the estimated five-year CKD3 risk was 39.7% vs 27.7%. Achievements in inducing remission and retarding the development of CKD3, in combination with the increased severity of NS, indicate the effectiveness of domestic strategies of NS management.

8. Gauraha, Niharika; Pavlenko, Tatyana (KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics); Parui, Swapan K. Post Lasso Stability Selection for High Dimensional Linear Models. 2017. In: ICPRAM: Proceedings of the 6th International Conference on Pattern Recognition Applications and Methods / [ed] De Marsico, M.; Di Baja, G. S.; Fred, A. Scitepress, 2017, p. 638-646. Conference paper (Refereed).

Abstract [en]: Lasso and sub-sampling based techniques (e.g. Stability Selection) are nowadays the most commonly used methods for detecting the set of active predictors in high-dimensional linear models. The consistency of Lasso-based variable selection requires the strong irrepresentable condition on the design matrix to be fulfilled, and repeated sampling procedures with a large feature set make Stability Selection slow in terms of computation time. Alternatively, two-stage procedures (e.g. thresholding or adaptive Lasso) are used to achieve consistent variable selection under weaker conditions (sparse eigenvalue). Such two-step procedures involve choosing several tuning parameters, which seems easy in principle but is difficult in practice. To address these problems efficiently, we propose a new two-step procedure, called Post Lasso Stability Selection (PLSS). At the first step, Lasso screening is applied with a small regularization parameter to generate a candidate subset of active features. At the second step, Stability Selection using weighted Lasso is applied to recover the most stable features from the candidate subset. We show that under a mild (generalized irrepresentable) condition, this approach yields a consistent variable selection method that is computationally fast even for a very large number of variables. Promising performance properties of the proposed PLSS technique are also demonstrated numerically using both simulated and real data examples.
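The two-step structure described in the abstract (Lasso screening, then stability selection on the candidate set) can be sketched with scikit-learn's plain `Lasso`. This is a schematic illustration only: the regularization values, the half-sample subsampling scheme, and the 0.6 frequency threshold are arbitrary choices of ours, and the paper's weighted-Lasso refinement and consistency theory are not reproduced.

```python
import numpy as np
from sklearn.linear_model import Lasso

def plss_sketch(X, y, screen_alpha=0.01, alpha=0.1, n_subsamples=50,
                threshold=0.6, seed=0):
    """Illustrative two-step selection in the spirit of PLSS.

    Step 1: Lasso screening with a small penalty -> candidate feature set.
    Step 2: Lasso refits on row-subsamples restricted to the candidates;
            keep features selected in a high fraction of subsamples.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    # Step 1: screening with a deliberately small regularization parameter
    cand = np.flatnonzero(
        Lasso(alpha=screen_alpha, max_iter=10000).fit(X, y).coef_ != 0)
    # Step 2: stability selection over the candidate subset
    counts = np.zeros(cand.size)
    for _ in range(n_subsamples):
        rows = rng.choice(n, size=n // 2, replace=False)
        coef = Lasso(alpha=alpha, max_iter=10000).fit(
            X[rows][:, cand], y[rows]).coef_
        counts += coef != 0
    return cand[counts / n_subsamples >= threshold]

# toy data: 3 strongly active features out of 20
rng = np.random.default_rng(1)
X = rng.standard_normal((100, 20))
beta = np.zeros(20)
beta[[0, 5, 9]] = [3.0, -2.5, 2.0]
y = X @ beta + 0.5 * rng.standard_normal(100)
selected = plss_sketch(X, y)
```

With signals this strong, the stable features recovered in step 2 should include the truly active ones; in a genuinely high-dimensional setting (p much larger than n) the screening step is what keeps the subsampled refits cheap.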

9. Hyodo, M.; Shutoh, N.; Nishiyama, T.; Pavlenko, Tetyana (KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics). Testing block-diagonal covariance structure for high-dimensional data. 2015. In: Statistica Neerlandica, ISSN 0039-0402, E-ISSN 1467-9574, Vol. 69, no 4, p. 460-482. Article in journal (Refereed).

Abstract [en]: A test statistic is developed for making inference about a block-diagonal structure of the covariance matrix when the dimensionality p exceeds n, where n = N - 1 and N denotes the sample size. The suggested procedure extends the complete independence results. Because the classical hypothesis testing methods based on the likelihood ratio degenerate when p > n, the main idea is to turn instead to a distance function between the null and alternative hypotheses. The test statistic is then constructed using a consistent estimator of this function, where consistency is considered in an asymptotic framework that allows p to grow together with n. The suggested statistic is also shown to have asymptotic normality under the null hypothesis. Some auxiliary results on the moments of products of multivariate normal random vectors and higher-order moments of Wishart matrices, which are important for our evaluation of the test statistic, are derived. We perform an empirical power analysis for a number of alternative covariance structures.
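The distance-function idea can be made concrete with a naive plug-in version: measure how far the sample covariance is from block-diagonality by the squared Frobenius norm of its cross-block entries. This is only a sketch of the quantity being estimated; the paper's statistic replaces the plug-in with a bias-corrected, consistent estimator that remains valid when p > n, which is not attempted here.

```python
import numpy as np

def off_block_distance(X, blocks):
    """Naive plug-in distance to block-diagonality: the squared Frobenius
    norm of the sample covariance entries lying outside the assumed blocks.
    """
    S = np.cov(X, rowvar=False)
    mask = np.zeros_like(S, dtype=bool)
    for idx in blocks:
        mask[np.ix_(idx, idx)] = True   # mark within-block entries
    return float(np.sum(S[~mask] ** 2))  # sum over cross-block entries only

rng = np.random.default_rng(0)
blocks = [np.arange(0, 4), np.arange(4, 8)]

# data whose true covariance really is block-diagonal (here: identity)
X0 = rng.standard_normal((200, 8))
# data with a shared factor inducing strong cross-block correlation
Z = rng.standard_normal((200, 1))
X1 = rng.standard_normal((200, 8)) + 2.0 * Z

d0 = off_block_distance(X0, blocks)  # small: only sampling noise off-block
d1 = off_block_distance(X1, blocks)  # large: genuine cross-block covariance
```

Under the null the cross-block entries are pure sampling noise, so the distance stays near zero; a shared latent factor inflates it sharply, which is the separation the formal test exploits.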

10. Hyodo, Masashi; Shutoh, Nobumichi; Seo, Takashi; Pavlenko, Tatjana. KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics; Tokyo Univ Sci, Fac Sci, Dept Math Informat Sci, Japan. Estimation of the covariance matrix with two-step monotone missing data. 2016. In: Communications in Statistics - Theory and Methods, ISSN 0361-0926, E-ISSN 1532-415X, Vol. 45, no 7, p. 1910-1922. Article in journal (Refereed).

Abstract [en]: We suggest a shrinkage-based technique for estimating the covariance matrix in the high-dimensional normal model with missing data. Our approach is based on the monotone missing scheme assumption, meaning that missing-value patterns occur completely at random. Our asymptotic framework allows the dimensionality, p, to grow to infinity together with the sample size, N, and extends the methodology of Ledoit and Wolf (2004) to the case of two-step monotone missing data. Two new shrinkage-type estimators are derived, and their dominance over the Ledoit and Wolf (2004) estimator is shown under the expected quadratic loss. We perform a simulation study and conclude that the proposed estimators are successful for a range of missing data scenarios.
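For context, the complete-data Ledoit and Wolf (2004) baseline that this paper extends shrinks the sample covariance toward a scaled identity. The following is a minimal numpy sketch of that baseline only (identity-target shrinkage with a data-driven intensity), not the two-step missing-data estimators derived in the paper:

```python
import numpy as np

def ledoit_wolf_shrinkage(X):
    """Ledoit-Wolf (2004) style shrinkage of the sample covariance
    toward mu*I, complete-data case. X is an (N, p) data matrix."""
    N, p = X.shape
    Xc = X - X.mean(axis=0)
    S = Xc.T @ Xc / N                       # sample covariance (MLE scaling)
    mu = np.trace(S) / p                    # scale of the identity target
    d2 = np.sum((S - mu * np.eye(p)) ** 2)  # squared distance from S to target
    # average squared distance of per-observation outer products from S
    b2_bar = sum(np.sum((np.outer(x, x) - S) ** 2) for x in Xc) / N**2
    b2 = min(b2_bar, d2)                    # estimated error of S (capped)
    rho = b2 / d2 if d2 > 0 else 1.0        # shrinkage intensity in [0, 1]
    return (1 - rho) * S + rho * mu * np.eye(p)
```

Note that the common 1/p scaling of the Frobenius norms cancels in the ratio rho, so it is omitted here; the estimator stays positive definite even when p > N, which is the motivation for shrinkage in this regime.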

11. Koizumi, K.; Hyodo, M.; Pavlenko, Tatjana. KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics. Modified Jarque-Bera type tests for multivariate normality in a high-dimensional framework. 2014. In: Journal of Statistical Theory and Practice, ISSN 1559-8608, E-ISSN 1559-8616, Vol. 8, no 2, p. 382-399. Article in journal (Refereed).

Abstract [en]: In this article, we introduce two new omnibus procedures for testing multivariate normality based on the sample measures of multivariate skewness and kurtosis. These characteristics, initially introduced by, for example, Mardia (1970) and Srivastava (1984), were extended by Koizumi, Okamoto, and Seo (2009), who proposed the multivariate Jarque-Bera (MJB) type test based on the Srivastava (1984) principal components measures of skewness and kurtosis. We suggest an improved MJB test based on the Wilson-Hilferty transform, and a modified MJB test based on the F-approximation. Asymptotic properties of both tests are examined, assuming that both dimensionality and sample size go to infinity at the same rate. Our simulation study shows that the suggested tests outperform the original MJB test for a number of high-dimensional scenarios. The test is then used for testing multivariate normality of real data consisting of digitized character images.
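The classical univariate Jarque-Bera statistic that these multivariate tests generalize combines sample skewness and excess kurtosis into a single omnibus measure. A minimal sketch of that classical building block (not the high-dimensional MJB variants studied in the paper):

```python
import numpy as np

def jarque_bera(x):
    """Univariate Jarque-Bera statistic: combines squared sample skewness
    and squared excess kurtosis; asymptotically chi-squared with 2 degrees
    of freedom under normality."""
    x = np.asarray(x, dtype=float)
    n = x.size
    z = x - x.mean()
    m2 = np.mean(z ** 2)                 # second central moment
    skew = np.mean(z ** 3) / m2 ** 1.5   # sample skewness
    kurt = np.mean(z ** 4) / m2 ** 2     # sample kurtosis (3 under normality)
    return n / 6.0 * (skew ** 2 + (kurt - 3.0) ** 2 / 4.0)
```

Large values of the statistic indicate departure from normality; the multivariate versions replace skewness and kurtosis with Mardia-type or Srivastava-type principal-component analogues.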

12. Koizumi, K.; Sumikawa, T.; Pavlenko, Tetyana. KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics. Measures of multivariate skewness and kurtosis in high-dimensional framework. 2014. In: SUT Journal of Mathematics, ISSN 0916-5746, Vol. 50, no 2, p. 483-511. Article in journal (Refereed).

Abstract [en]: The skewness and kurtosis characteristics of a multivariate p-dimensional distribution introduced by Mardia (1970) have been used in various testing procedures and have demonstrated attractive asymptotic properties in large-sample settings. However, these characteristics are not designed for high-dimensional problems, where the dimensionality, p, can largely exceed the sample size, N. High-dimensional data of this type are commonly encountered in modern statistical applications, which suggests that new measures of skewness and kurtosis accommodating high-dimensional settings must be derived and carefully studied. In this paper, by exploiting the dependence structure, we introduce new expressions for skewness and kurtosis as an extension of the corresponding Mardia's measures, which use the potential advantages that the block-diagonal covariance structure has to offer in high dimensions. Asymptotic properties of the newly derived measures are investigated, and cumulant-based characterizations are presented along with applications to a mixture of multivariate normal distributions and the multivariate Laplace distribution, for which explicit expressions of skewness and kurtosis are obtained. Test statistics based on the new measures of skewness and kurtosis are proposed for testing a distribution shape, and their limit distributions are established in the asymptotic framework where N → ∞ and p is fixed but large, including p > N. For the dependence structure learning, the gLasso-based technique is explored, followed by an AIC step which we propose for optimizing the gLasso candidate model. Performance accuracy of the test procedures based on our estimators of skewness and kurtosis is evaluated using Monte Carlo simulations, and the validity of the suggested approach is shown for a number of cases when p > N.

13. Nishiyama, Takahiro; Hyodo, Masashi; Seo, Takashi; Pavlenko, Tatjana. KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics. Testing linear hypotheses of mean vectors for high-dimension data with unequal covariance matrices. 2013. In: Journal of Statistical Planning and Inference, ISSN 0378-3758, E-ISSN 1873-1171, Vol. 143, no 11, p. 1898-1911. Article in journal (Refereed).

Abstract [en]: We propose a new procedure for testing a linear hypothesis on the mean vectors of normal populations with unequal covariance matrices when the dimensionality, p, exceeds the sample size, N, i.e., p/N → c < ∞. Our procedure is based on the Dempster trace criterion and is shown to be consistent in high dimensions. The asymptotic null and non-null distributions of the proposed test statistic are established in the high-dimensional setting, and an improved estimator of the critical point of the test is derived using a Cornish-Fisher expansion. As a special case, our testing procedure is applied to the multivariate Behrens-Fisher problem. We illustrate the relevance and benefits of the proposed approach via Monte Carlo simulations, which show that our new test is comparable to, and in many cases more powerful than, the tests for equality of means presented in the recent literature.

14. Olsson, Jimmy; Pavlenko, Tatjana; Rios, Felix. KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics. Bayesian structure learning in graphical models using sequential Monte Carlo. Manuscript (preprint) (Other academic).

Abstract [en]: In this paper we present a family of algorithms, the junction tree expanders, for expanding junction trees in the sense that the number of nodes in the underlying decomposable graph is increased by one. The family of junction tree expanders is equipped with a number of theoretical results, including a characterization stating that every junction tree, and consequently every decomposable graph, can be constructed by iteratively applying a junction tree expander. Further, an important feature of a stochastic implementation of a junction tree expander is the Markovian property inherent to the tree propagation dynamics. Using this property, a sequential Monte Carlo algorithm for approximating a probability distribution defined on the space of decomposable graphs is developed, with the junction tree expander as a proposal kernel. Specifically, we apply the sequential Monte Carlo algorithm for structure learning in decomposable Gaussian graphical models, where the target distribution is a junction tree posterior distribution. In this setting, posterior parametric inference on the underlying decomposable graph is a direct by-product of the suggested methodology; working with the G-Wishart family of conjugate priors, we derive a closed-form expression for the Bayesian estimator of the precision matrix of Gaussian graphical models Markov with respect to a decomposable graph. Performance accuracy of the graph and parameter estimators is illustrated through a collection of numerical examples demonstrating the feasibility of the suggested approach in high-dimensional domains.

15. Olsson, Jimmy; Pavlenko, Tatjana; Rios, Felix Leopoldo. KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics. Bayesian inference in decomposable graphical models using sequential Monte Carlo methods. Manuscript (preprint) (Other academic).

Abstract [en]: In this study we present a sequential sampling methodology for Bayesian inference in decomposable graphical models. We recast the problem of graph estimation, which in general lacks a natural sequential interpretation, into a sequential setting. Specifically, we propose a recursive Feynman-Kac model which generates a flow of junction tree distributions over a space of increasing dimension, and we develop an efficient sequential Monte Carlo sampler. As a key ingredient of the proposal kernel in our sampler, we use the Christmas tree algorithm developed in the companion paper Olsson et al. [2017]. We focus on particle MCMC methods, in particular particle Gibbs (PG), as it allows for generating MCMC chains with global moves on an underlying space of decomposable graphs. To further improve the mixing properties of this PG algorithm, we incorporate a systematic refreshment step implemented through direct sampling from a backward kernel. The theoretical properties of the algorithm are investigated, showing in particular that the refreshment step improves the algorithm's performance in terms of the asymptotic variance of the estimated distribution. Performance accuracy of the graph estimators is illustrated through a collection of numerical examples demonstrating the feasibility of the suggested approach in both discrete and continuous graphical models.

16. Olsson, Jimmy; Pavlenko, Tatjana; Rios, Felix Leopoldo. KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics. Generating junction trees of decomposable graphs with the Christmas tree algorithm. Manuscript (preprint) (Other academic).

Abstract [en]: The junction tree representation provides an attractive structural property for organizing a decomposable graph. In this study, we present a novel stochastic algorithm, which we call the Christmas tree algorithm, for building junction trees sequentially by adding one node at a time to the underlying decomposable graph. The algorithm has two important theoretical properties. Firstly, every junction tree, and hence every decomposable graph, has positive probability of being generated. Secondly, the transition probability from one tree to another has a tractable expression. These two properties, along with the reversed version of the proposed algorithm, are key ingredients in the construction of a sequential Monte Carlo sampling scheme for approximating distributions over decomposable graphs; see Olsson et al. [2016]. As an illustrating example, we specify a distribution over the space of junction trees and estimate the number of decomposable graphs through the normalizing constant.

17. Pavlenko, Tatjana. Mid Sweden University. Feature informativeness in high-dimensional discriminant analysis. 2003. In: Communications in Statistics - Theory and Methods, ISSN 0361-0926, E-ISSN 1532-415X, Vol. 32, no 2, p. 459-474. Article in journal (Refereed).

Abstract [en]: A concept of feature informativeness was introduced as a way of measuring the discriminating power of a set of features. A question of interest is how this property of features affects the discrimination performance. The effect is assessed by means of a weighted discriminant function, which distributes weights among features according to their informativeness. The asymptotic normality of the weighted discriminant function is proven, and the limiting expressions for the errors are obtained in the growing-dimension asymptotic framework, i.e., when the number of features is proportional to the sample size. This makes it possible to establish the type of weighting that is optimal in the sense of minimum error probability.

18. Pavlenko, Tatjana; Björkström, Anders. Dept. of Statistics, Stockholm University, Sweden. Exploiting Sparse Dependence Structure in Model Based Classification. 2010. In: COMBINING SOFT COMPUTING AND STATISTICAL METHODS IN DATA ANALYSIS / [ed] Borgelt, C; GonzalezRodriguez, G; Trutschnig, W; Lubiano, MA; Gil, MA; Grzegorzewski, P; Hryniewicz, O, 2010, p. 509-517. Conference paper (Refereed).

Abstract [en]: Sparsity patterns discovered in the data dependence structure were used to reduce the dimensionality and improve the performance accuracy of the model-based classifier in a high-dimensional framework.

19. Pavlenko, Tatjana (KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics); Björkström, Anders (Stockholm Univ, Stockholm, Sweden); Tillander, Annika (Stockholm Univ, Stockholm, Sweden). Covariance structure approximation via gLasso in high-dimensional supervised classification. 2012. In: Journal of Applied Statistics, ISSN 0266-4763, E-ISSN 1360-0532, Vol. 39, no 8, p. 1643-1666. Article in journal (Refereed).

Abstract [en]: Recent work has shown that Lasso-based regularization is very useful for estimating the high-dimensional inverse covariance matrix. A particularly useful scheme is based on penalizing the l1 norm of the off-diagonal elements to encourage sparsity. We embed this type of regularization into high-dimensional classification. A two-stage estimation procedure is proposed which first recovers structural zeros of the inverse covariance matrix and then enforces block sparsity by moving non-zeros closer to the main diagonal. We show that the block-diagonal approximation of the inverse covariance matrix leads to an additive classifier, and demonstrate that accounting for the structure can yield better performance accuracy. The effect of the block size on classification is explored, and a class of asymptotically equivalent structure approximations in a high-dimensional setting is specified. We suggest variable selection at the block level and investigate properties of this procedure in growing-dimension asymptotics. We present a consistency result on the feature selection procedure, establish asymptotic lower and upper bounds for the fraction of separative blocks, and specify constraints under which reliable classification with block-wise feature selection can be performed. The relevance and benefits of the proposed approach are illustrated on both simulated and real data.
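The additive structure mentioned in the abstract — with a block-diagonal precision matrix, the Gaussian linear discriminant decomposes into a sum of independent per-block scores — can be illustrated with a small sketch. Function and variable names here are illustrative and not taken from the paper:

```python
import numpy as np

def blockwise_discriminant(x, mu0, mu1, omega_blocks, block_sizes):
    """Linear discriminant score under a block-diagonal precision matrix.
    The total score is a sum of per-block LDA contributions, which is the
    additive structure a block-diagonal approximation yields."""
    score, start = 0.0, 0
    for Om, b in zip(omega_blocks, block_sizes):
        sl = slice(start, start + b)
        d = mu0[sl] - mu1[sl]            # block mean difference
        m = 0.5 * (mu0[sl] + mu1[sl])    # block midpoint
        score += d @ Om @ (x[sl] - m)    # this block's contribution
        start += b
    return score                          # > 0 -> class 0, < 0 -> class 1
```

Because each block contributes independently, blocks can be estimated (e.g., via gLasso) and selected separately, which is what makes the block-level variable selection in the paper tractable.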

20. Pavlenko, Tatjana; Chernyak, Oleksandr. Credit risk modeling using Bayesian networks. 2010. In: International Journal of Intelligent Systems, ISSN 0884-8173, E-ISSN 1098-111X, Vol. 25, no 4, p. 326-344. Article in journal (Refereed).

Abstract [en]: The main goal of this research is to demonstrate how probabilistic graphs may be used for modeling and assessment of credit concentration risk. The destructive power of credit concentrations essentially depends on the amount of correlation among borrowers. However, correlation among borrower companies and concentration of credit risk exposures have been difficult for the banking industry to measure in an objective way, as they are riddled with uncertainty. As a result, banks fail to make a quantitative link to the correlation driving risks and fail to prevent concentrations from accumulating. In this paper, we argue that Bayesian networks provide an attractive solution to these problems, and we show how to apply them in representing, quantifying, and managing the uncertain knowledge in concentration of credit risk exposures. We suggest stepwise Bayesian network model building and show how to incorporate expert-based prior beliefs on the risk exposure of a group of related borrowers, and then update these beliefs through the whole model with new information. We then explore a specific graph structure, a tree-augmented Bayesian network, and show that this model provides better understanding of the risk accumulating due to business links between borrowers. We also present two strategies of model assessment that exploit the measure of mutual information, and show that the constructed Bayesian network is a reliable model that can be implemented to identify and control the threat from concentration of credit exposures. Finally, we demonstrate that the suggested tree-augmented Bayesian network is also suitable for stress-testing analysis; in particular, it can provide estimates of the posterior risk of losses related to unfavorable changes in the financial conditions of a group of related borrowers.

21. Pavlenko, Tatjana; Friden, Håkan. Mid Sweden University. Scoring Feature Subsets for Separation Power in Supervised Bayes Classification. 2006. In: Soft Methods for Integrated Uncertainty Modelling / [ed] Lawry, J; Miranda, E; Bugarin, A; Li, S; Gil, MA; Grzegorzewski, P; Hyrniewicz, O, 2006, Vol. 37, p. 383-391. Conference paper (Refereed).

Abstract [en]: We present a method for evaluating the discriminative power of compact feature combinations (blocks) using a distance-based scoring measure, yielding an algorithm for selecting feature blocks that significantly contribute to the outcome variation. To estimate classification performance with subset selection in a high-dimensional framework, we jointly evaluate both stages of the process: selection of significantly relevant blocks and classification. Classification power and performance properties of the classifier with the proposed subset selection technique have been studied on several simulation models, confirming the benefit of this approach.

22. Pavlenko, Tatjana (KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics); Rios, Felix Leopoldo. Graphical posterior predictive classifier: Bayesian model averaging with particle Gibbs. Manuscript (preprint) (Other academic).

Abstract [en]: In this study, we present a multi-class graphical Bayesian predictive classifier that incorporates the uncertainty in the model selection into the standard Bayesian formalism. For each class, the dependence structure underlying the observed features is represented by a set of decomposable Gaussian graphical models. Emphasis is then placed on Bayesian model averaging, which takes full account of the class-specific model uncertainty by averaging over the posterior graph model probabilities. An explicit evaluation of the model probabilities is well known to be infeasible. To address this issue, we consider the particle Gibbs strategy of Olsson et al. (2016) for posterior sampling from decomposable graphical models, which utilizes the Christmas tree algorithm of Olsson et al. (2017) as proposal kernel. We also derive a strong hyper Markov law, which we call the hyper normal Wishart law, that allows the resultant Bayesian calculations to be performed locally. The proposed predictive graphical classifier shows superior performance compared to the ordinary Bayesian predictive rule that does not account for model uncertainty, as well as to a number of out-of-the-box classifiers.
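The graph sampling machinery in the abstract above is well beyond a short sketch, but the model-averaging step itself is simple: the predictive class probabilities are a posterior-weighted average over candidate models. Assuming per-model predictive functions and posterior model weights are already available (both hypothetical inputs here, standing in for the particle-Gibbs output), the averaging reads:

```python
# Illustrative sketch of the Bayesian model averaging step only: the
# per-model predictives and posterior weights are assumed given (in the
# paper they would come from the particle Gibbs graph sampler).

def bma_predictive(x, models, weights):
    """models: list of callables x -> {class label: probability};
    weights: posterior model probabilities (non-negative, summing to 1).
    Returns the model-averaged class-predictive distribution."""
    averaged = {}
    for model, w in zip(models, weights):
        for label, p in model(x).items():
            averaged[label] = averaged.get(label, 0.0) + w * p
    return averaged
```

With two toy models weighted 0.75/0.25, each class probability is simply the corresponding convex combination, so the averaged output is again a proper distribution.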

23. Pavlenko, Tatjana (KTH, School of Engineering Sciences (SCI), Mathematics (Dept.)); Roy, Anuradha. Supervised classifiers for high-dimensional higher-order data with locally doubly exchangeable covariance structure. In: Communications in Statistics - Theory and Methods, ISSN 0361-0926, E-ISSN 1532-415X, 2017, Vol. 46, no 23, p. 11612-11634. Article in journal (Refereed).

Abstract [en]: We explore the performance accuracy of linear and quadratic classifiers for high-dimensional higher-order data, assuming that the class conditional distributions are multivariate normal with a locally doubly exchangeable covariance structure. We derive a two-stage procedure for estimating the covariance matrix: at the first stage, Lasso-based structure learning is applied to sparsify the block components within the covariance matrix; at the second stage, the maximum-likelihood estimators of all block-wise parameters are derived, assuming the doubly exchangeable within-block covariance structure and a Kronecker product structured mean vector. We also study the effect of the block size on classification performance in the high-dimensional setting and derive a class of asymptotically equivalent block structure approximations, in the sense that the choice of block size is asymptotically negligible.

24. Pavlenko, Tatjana (Mid Sweden University); von Rosen, Dietrich. On the Optimal Weighting of High-Dimensional Bayesian Networks. In: Advances and Applications in Statistics, ISSN 0972-3617, 2004, Vol. 4, p. 357-377. Article in journal (Refereed).

25. Shutoh, N.; Hyodo, M.; Pavlenko, Tetyana (KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics); Seo, T. Constrained linear discriminant rule via the Studentized classification statistic based on monotone missing data. In: SUT Journal of Mathematics, ISSN 0916-5746, 2012, Vol. 48, no 1, p. 55-69. Article in journal (Refereed).

Abstract [en]: This paper provides an asymptotic expansion for the distribution of the Studentized linear discriminant function with k-step monotone missing training data. It turns out to be a generalization of the results derived by Anderson [1] and Shutoh and Seo [12]. Furthermore, we also derive the cut-off point constrained by a conditional probability of misclassification using the idea of McLachlan [8]. Finally, we perform a Monte Carlo simulation to evaluate our results.

26. Watanabe, Hiroki; Hyodo, Masashi; Seo, Takashi; Pavlenko, Tatjana (KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Mathematical Statistics). Asymptotic properties of the misclassification rates for Euclidean Distance Discriminant rule in high-dimensional data. In: Journal of Multivariate Analysis, ISSN 0047-259X, E-ISSN 1095-7243, 2015, Vol. 140, p. 234-244. Article in journal (Refereed).

Abstract [en]: The performance accuracy of the Euclidean Distance Discriminant rule (EDDR) is studied in a high-dimensional asymptotic framework which allows the dimensionality to exceed the sample size. Under mild assumptions on the traces of the covariance matrix, our new results provide the asymptotic distribution of the conditional misclassification rate and an explicit expression for a consistent and asymptotically unbiased estimator of the expected misclassification rate. To obtain these properties, new results on the asymptotic normality of quadratic forms and of traces of higher powers of the Wishart matrix are established. Using our asymptotic results, we further develop two generic methods for determining a cut-off point for EDDR to adjust the misclassification rates. Finally, we numerically confirm the high accuracy of our asymptotic findings, along with the cut-off determination methods, in finite-sample applications, including large-sample and high-dimensional scenarios.
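The base rule studied in the abstract above is simple to state: assign an observation to the class whose training-sample mean is nearest in Euclidean distance. A minimal sketch of that rule (without the paper's cut-off adjustments or bias corrections) is:

```python
# Minimal sketch of the Euclidean Distance Discriminant rule (EDDR):
# classify x to the class with the nearest training mean. The paper's
# cut-off point adjustments are not reproduced here.
from math import dist  # Euclidean distance, Python 3.8+

def class_means(training):
    """training: dict mapping class label -> list of feature vectors.
    Returns the per-class coordinate-wise mean vectors."""
    means = {}
    for label, rows in training.items():
        n = len(rows)
        means[label] = [sum(col) / n for col in zip(*rows)]
    return means

def eddr_classify(x, means):
    """Return the label whose mean vector is closest to x."""
    return min(means, key=lambda label: dist(x, means[label]))
```

Because only the class means are estimated (no covariance matrix is inverted), the rule remains well defined when the dimension exceeds the sample size, which is the high-dimensional regime the paper analyzes.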
