
Dense Subset Sum may be the hardest
2016 (English). In: Leibniz International Proceedings in Informatics (LIPIcs), Schloss Dagstuhl – Leibniz-Zentrum für Informatik GmbH, Dagstuhl Publishing, 2016. Conference paper, Published paper (Refereed)
##### Resource type

Text
##### Abstract [en]

##### Place, publisher, year, edition, pages

Schloss Dagstuhl – Leibniz-Zentrum für Informatik GmbH, Dagstuhl Publishing, 2016.
##### Keywords [en]

Additive combinatorics, Exponential-time algorithm, Homomorphic hashing, Littlewood-offord problem, Subset Sum
##### National Category

Discrete Mathematics
##### Identifiers

URN: urn:nbn:se:kth:diva-188298
DOI: 10.4230/LIPIcs.STACS.2016.13
Scopus ID: 2-s2.0-84961625723
ISBN: 9783959770019 (print)
OAI: oai:DiVA.org:kth-188298
DiVA, id: diva2:936276
##### Conference

33rd Symposium on Theoretical Aspects of Computer Science (STACS 2016), 17–20 February 2016
##### Funder

Swedish Research Council Formas, 621-2012-4546
##### Note

The Subset Sum problem asks whether a given set of n positive integers contains a subset of elements that sum up to a given target t. It is an outstanding open question whether the O*(2^(n/2))-time algorithm for Subset Sum by Horowitz and Sahni [J. ACM 1974] can be beaten in the worst-case setting by a "truly faster", O*(2^((0.5-δ)n))-time algorithm, with some constant δ > 0. Continuing an earlier work [STACS 2015], we study Subset Sum parameterized by the maximum bin size β, defined as the largest number of subsets of the n input integers that yield the same sum. For every ε > 0 we give a truly faster algorithm for instances with β ≤ 2^((0.5-ε)n), as well as instances with β ≥ 2^(0.661n). Consequently, we also obtain a characterization in terms of the popular density parameter n/log₂ t: if all instances of density at least 1.003 admit a truly faster algorithm, then so does every instance. This goes against the current intuition that instances of density 1 are the hardest, and therefore is a step toward answering the open question in the affirmative. Our results stem from a novel combinatorial analysis of mixings of earlier algorithms for Subset Sum and a study of an extremal question in additive combinatorics connected to the problem of Uniquely Decodable Code Pairs in information theory.
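The two parameters the abstract refers to can be made concrete with a short sketch. The following is an illustrative toy implementation, not code from the paper: `subset_sum_mitm` shows the Horowitz–Sahni meet-in-the-middle idea (enumerate all 2^(n/2) subset sums of each half and match complementary half-sums), and `max_bin_size` computes the bin-size parameter β by brute force, which is only feasible for tiny n. All function names here are illustrative assumptions.

```python
from bisect import bisect_left

def subset_sums(items):
    """All subset sums of `items` (2^len(items) values, duplicates kept)."""
    sums = [0]
    for x in items:
        sums += [s + x for s in sums]
    return sums

def subset_sum_mitm(a, t):
    """Meet-in-the-middle Subset Sum: True iff some subset of `a` sums to `t`.

    Splits `a` into halves, enumerates each half's subset sums, and binary
    searches the sorted right half for the complement t - s of each left sum,
    giving O*(2^(n/2)) time instead of O*(2^n) brute force.
    """
    half = len(a) // 2
    left = subset_sums(a[:half])
    right = sorted(subset_sums(a[half:]))
    for s in left:
        i = bisect_left(right, t - s)
        if i < len(right) and right[i] == t - s:
            return True
    return False

def max_bin_size(a):
    """The parameter β from the abstract: the largest number of subsets
    of `a` sharing one sum (brute force over all 2^n subsets)."""
    counts = {}
    for s in subset_sums(a):
        counts[s] = counts.get(s, 0) + 1
    return max(counts.values())

print(subset_sum_mitm([3, 34, 4, 12, 5, 2], 9))   # True: 4 + 5 = 9
print(subset_sum_mitm([3, 34, 4, 12, 5, 2], 30))  # False
print(max_bin_size([1, 2, 3]))  # 2: both {3} and {1, 2} sum to 3
```

For the density parameter, this instance has n = 6 and t = 9, so n/log₂ t ≈ 1.89; the abstract's result concerns which densities can still be hard.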

QC 20160613

Available from: 2016-06-13. Created: 2016-06-09. Last updated: 2016-06-13. Bibliographically approved.
