abstract: string, lengths 5 to 11.1k
authors: string, lengths 9 to 1.96k
title: string, lengths 4 to 367
__index_level_0__: int64, 0 to 1,000k
Loop abort Faults on Lattice-Based Fiat-Shamir & Hash'n Sign signatures.
['Thomas Espitau', 'Pierre-Alain Fouque', 'Benoît Gérard', 'Mehdi Tibouchi']
Loop abort Faults on Lattice-Based Fiat-Shamir & Hash'n Sign signatures.
992,430
As System on a Chip (SoC) testing faces new challenges, new test architectures must be developed. This paper describes a Test Access Mechanism (TAM) named CAS-BUS that solves some of the new problems the test industry has to deal with. This TAM is scalable, flexible and dynamically reconfigurable. The CAS-BUS architecture is compatible with the IEEE P1500 standard proposal in its current state of development, and is controlled by Boundary Scan features. This basic CAS-BUS architecture has been extended with two independent variants. The first extension has been designed to manage SoCs made up of both wrapped cores and non-wrapped cores with Boundary Scan features. The second deals with a test pin expansion method to solve the I/O bandwidth problem. The proposed solution is based on a new compression/decompression mechanism which provides significant results in the case of non-correlated test pattern processing. This solution avoids TAM performance degradation. These test architectures are based on the CAS-BUS TAM and allow trade-offs to optimize both test time and area overhead. A tool-box environment is provided to automatically generate the components needed to build the chosen SoC test architecture.
['Mounir Benabdenbi', 'Walid Maroufi', 'Meryem Marzouki']
CAS-BUS: A Test Access Mechanism and a Toolbox Environment for Core-Based System Chip Testing
153,311
The vehicle routing problem is a classical combinatorial optimization problem. This work is about a variant of the vehicle routing problem with dynamically changing orders and time windows. In real-world applications, the demands often change during operation time. New orders occur and others are canceled. In this case new schedules need to be generated on-the-fly. Online optimization algorithms for dynamic vehicle routing address this problem, but so far they do not consider time windows. Moreover, to match the scenarios found in real-world problems, adaptations of benchmarks are required. In this paper, a practical problem is modeled based on the procedure of daily routing of a delivery company. New orders by customers are introduced dynamically during the working day and need to be integrated into the schedule. A multiple ant colony algorithm combined with powerful local search procedures is proposed to solve the dynamic vehicle routing problem with time windows. The performance is tested on a new benchmark based on simulations of a working day. The problems are taken from Solomon’s benchmarks, but a certain percentage of the orders are only revealed to the algorithm during operation time. Different versions of the MACS algorithm are tested and a high-performing variant is identified. Finally, the algorithm is tested in situ: in a field study, the algorithm schedules a fleet of cars for a surveillance company. We compare the performance of the algorithm to that of the procedure used by the company and we summarize insights gained from the implementation of the real-world study. The results show that the multiple ant colony algorithm obtains much better solutions on the academic benchmark problem and can also be integrated into a real-world environment.
['Zhiwei Yang', 'Jan-Paul van Osta', 'Barry D. Van Veen', 'Rick van Krevelen', 'Richard van Klaveren', 'Andries Stam', 'Joost N. Kok', 'Thomas Bäck', 'Michael Emmerich']
Dynamic vehicle routing with time windows in theory and practice
697,382
Synchronizing Automata and Independent Systems of Words.
['Arturo Carpi', "Flavio D'Alessandro"]
Synchronizing Automata and Independent Systems of Words.
746,352
There has recently been much interest in history-based methods using suffix trees to solve POMDPs. However, these suffix trees cannot efficiently represent environments that have long-term dependencies. We extend the recently introduced CTMDP algorithm to the space of looping suffix trees, which have previously only been used in solving deterministic POMDPs. The resulting algorithm replicates results from CTMDP for environments with short-term dependencies, while it outperforms LSTM-based methods on TMaze, a deep memory environment.
['Mayank Daswani', 'Peter Sunehag', 'Marcus Hutter']
Feature Reinforcement Learning using Looping Suffix Trees
324,608
Regulating Vendor-Client Workarounds: An Information Brokering Approach
['Jade Wendy Brooks', 'Mayasandra-Nagaraja Ravishankar', 'Ilan Oshri']
Regulating Vendor-Client Workarounds: An Information Brokering Approach
575,313
An algorithm for text/graphics separation is presented in this paper. The basic principle of the algorithm is to erase nontext regions from mixed text and graphics engineering drawings, rather than extract text regions directly. This algorithm can be used to extract both Chinese and Western characters, dimensions, and symbols and has few limitations on the kind of engineering drawings and noise level. It is robust to text-graphics touching, text fonts, and written orientations.
['Zhaoyang Lu']
Detection of text regions from digital engineering drawings
535,424
We show that if a three-dimensional polytopal complex has a knot in its 1-skeleton, where the bridge index of the knot is larger than the number of edges of the knot, then the complex is not constructible, and hence, not shellable. As an application we settle a conjecture of Hetyei concerning the shellability of cubical barycentric subdivisions of 3-spheres. We also obtain similar bounds concluding that a 3-sphere or 3-ball is non-shellable or not vertex decomposable. These two last bounds are sharp.
['Richard Ehrenborg', 'Masahiro Hachimori']
Non-constructible Complexes and the Bridge Index
77,236
Cloud Computing has recently become an important driver for IT service provisioning. In addition to its associated benefits for both customers and IT service providers, cloud computing also comes along with new challenges. One of the major challenges for providers is to reduce energy consumption, since energy today already accounts for more than fifty percent of operational costs in data centers. A possible way to reduce these costs is to distribute load in terms of virtual machines within the data center. Developing algorithms for this purpose has been a topic of recent research. In order to capture the state of the art of energy efficient load distribution in clouds, this paper presents a structured literature review on load distribution algorithms that aim to reduce the energy consumption in data centers for cloud computing. The algorithms are reviewed in terms of their type, their evaluation method, and their potential drawbacks.
['Matthias Splieth', 'Frederik Kramer', 'Klaus Turowski']
Classification of Techniques for Energy Efficient Load Distribution Algorithms in Clouds - A Systematic Literature Review
667,330
The present paper proposes a memetic algorithm for tuning Fuzzy Wavelet Neural Network (FWNN) parameters in an adaptive way; to achieve this goal, our proposed algorithm combines Particle Swarm Optimization (PSO) as an evolutionary algorithm and an innovative local search which is based on a Fuzzy Inference System (FIS). The PSO increases the exploration ability of the memetic algorithm while the local search enhances its exploitation ability. To evaluate the performance of the proposed method, we have assessed it on three known nonlinear modeling problems commonly used in the literature. In comparison with other methods used in the literature, our proposed method showed certain advantages, namely: fewer rules obtained for the FWNN, much better results in terms of error criteria, and faster convergence speed.
['Hojjat-Allah Bazoobandi', 'Mahdi Eftekhari']
A fuzzy based memetic algorithm for tuning fuzzy wavelet neural network parameters
625,015
The follow the leader (FTL) algorithm, perhaps the simplest of all online learning algorithms, is known to perform well when the loss functions it is used on are positively curved. In this paper we ask whether there are other "lucky" settings in which FTL achieves sublinear, "small" regret. In particular, we study the fundamental problem of linear prediction over a non-empty convex, compact domain. Amongst other results, we prove that the curvature of the boundary of the domain can act as if the losses were curved: in this case, we prove that as long as the mean of the loss vectors has positive length bounded away from zero, FTL enjoys a logarithmic growth rate of regret, while, e.g., for polyhedral domains and stochastic data it enjoys finite expected regret. Building on a previously known meta-algorithm, we also get an algorithm that simultaneously enjoys the worst-case guarantees and the bound available for FTL.
['Ruitong Huang', 'Tor Lattimore', 'András György', 'Csaba Szepesvári']
Following the Leader and Fast Rates in Linear Prediction: Curved Constraint Sets and Other Regularities
940,315
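The linear-prediction setting summarized in the abstract above is easy to illustrate numerically. The following is a minimal sketch, not code from the paper: it runs Follow-The-Leader with linear losses over the Euclidean unit ball (one example of a curved constraint set) and reports the regret; the function name and the synthetic loss distribution are assumptions made purely for illustration.

```python
import numpy as np

# Illustrative sketch only (not the paper's code): FTL for linear losses over
# the Euclidean unit ball. FTL plays the point of the domain minimizing the
# cumulative linear loss observed so far.
def ftl_unit_ball(loss_vectors):
    d = len(loss_vectors[0])
    cum = np.zeros(d)              # sum of loss vectors seen so far
    total_loss = 0.0
    for f in loss_vectors:
        f = np.asarray(f, dtype=float)
        norm = np.linalg.norm(cum)
        # argmin over the unit ball of <cum, x> is -cum/||cum|| (any x if cum = 0)
        x = -cum / norm if norm > 0 else np.zeros(d)
        total_loss += float(f @ x)
        cum += f
    best_fixed_loss = -float(np.linalg.norm(cum))   # best fixed point in hindsight
    return total_loss - best_fixed_loss             # regret

# Loss vectors whose mean has length bounded away from zero, the regime the
# abstract identifies as favourable for FTL.
rng = np.random.default_rng(0)
losses = rng.normal(loc=[1.0, 0.0], scale=0.3, size=(1000, 2))
print(ftl_unit_ball(losses))   # regret stays small over 1000 rounds
```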
Information System agility to support collaborative organisations
['Frédérick Bénaben', 'François Vernadat']
Information System agility to support collaborative organisations
992,819
This contribution introduces the forthcoming extension of the ISO SQL standard for multi-dimensional arrays, SQL/MDA. We present concepts, the language, and highlight how it can be implemented in a scalable manner. Examples used stem from Earth Observation and related domains.
['Peter Baumann', 'Dimitar Misev']
Enhancing science support in SQL
585,854
Thermal design in sub-100 nm technologies imposes significant challenges to the CAD designers. A Compact Thermal Model (CTM) is proposed to represent the package as a network of nodes. The discrete models resulting from, for example, the finite element method (FEM) are usually very large. In order to handle full models of this size, the CTM is condensed using a Static Matrix Condensation methodology where the system matrices obtained by the spatial discretization of the heat transfer partial differential equation (PDE) are reduced by condensing the heat sources, surface nodes and internal nodes. This method reduces the complexity and at the same time preserves the accuracy of the model.
['Shriram Krishnamoorthy', 'Masud H. Chowdhury']
Compact thermal network model: Realization and reduction
402,006
Like any other large and complex software systems, Service-Based Systems (SBSs) must evolve to fit new user requirements and execution contexts. The changes resulting from the evolution of SBSs may degrade their design and quality of service (QoS) and may often cause the appearance of common poor solutions in their architecture, called antipatterns, in opposition to design patterns, which are good solutions to recurring problems. Antipatterns resulting from these changes may hinder the future maintenance and evolution of SBSs. The detection of antipatterns is thus crucial to assess the design and QoS of SBSs and facilitate their maintenance and evolution. However, methods and techniques for the detection of antipatterns in SBSs are still in their infancy despite their importance. In this paper, we introduce a novel approach, supported by a framework, for specifying and detecting antipatterns in SBSs. Using our approach, we specify 10 well-known and common antipatterns, including Multi Service and Tiny Service, and automatically generate their detection algorithms. We apply and validate the detection algorithms in terms of precision and recall on two systems developed independently: (1) Home-Automation, an SBS with 13 services, and (2) FraSCAti, an open-source implementation of the Service Component Architecture (SCA) standard with more than 100 services. This validation demonstrates that our approach enables the specification and detection of Service Oriented Architecture (SOA) antipatterns with an average precision of 90% and recall of 97.5%.
['Francis Palma', 'Mathieu Nayrolles', 'Naouel Moha', 'Yann-Gael Gueheneuc', 'Benoit Baudry', 'Jean-Marc Jézéquel']
SOA Antipatterns: an Approach for their Specification and Detection
122,782
We develop an environmental estimation method that allows large groups of agents to infer the value of an environmental field using measurements. Agents maintain estimates for subregions of the domain and communicate with local neighbors, so the method's communication and memory requirements do not increase with the number of agents, the size of the environment representation, or the agents' density in the environment. Despite the distributed representation, the union of individual estimates matches an estimate generated by a central computer with access to all measurements employing the variational inverse method, a finite element-based interpolation procedure. We also introduce a distributed query system, allowing users to determine an estimate anywhere in the domain without accessing all measurements or the full environment representation.
['Matthew L. Elwin', 'Randy A. Freeman', 'Kevin M. Lynch']
Environmental estimation with distributed finite element agents
975,813
A picture is worth a thousand dollars
['Martha J. Farah']
A picture is worth a thousand dollars
72,512
Subtleties in the Definition of IND-CCA: When and How Should Challenge-Decryption be Disallowed?
['Mihir Bellare', 'Dennis Hofheinz', 'Eike Kiltz']
Subtleties in the Definition of IND-CCA: When and How Should Challenge-Decryption be Disallowed?
747,613
Knowledge Management Systems: A multidimensional Analysis.
['Mouna Ben Chouikha', 'Salem Ben Dhaou Dakhli']
Knowledge Management Systems: A multidimensional Analysis.
980,562
Finite-time consensus of networked nonlinear systems under directed graph
['Naim Zoghlami', 'Lotfi Beji', 'Rhouma Mlayeh', 'Azgal Abichou', 'Chaker Jammazi']
Finite-time consensus of networked nonlinear systems under directed graph
698,634
We present encube, a qualitative, quantitative and comparative visualisation and analysis system, with application to high-resolution, immersive three-dimensional environments and desktop displays. encube extends previous comparative visualisation systems by considering: 1) the integration of comparative visualisation and analysis into a unified system; 2) the documentation of the discovery process; and 3) an approach that enables scientists to continue the research process once back at their desktop. Our solution enables tablets, smartphones or laptops to be used as interaction units for manipulating, organising, and querying data. We highlight the modularity of encube, allowing additional functionalities to be included as required. Additionally, our approach supports a high level of collaboration within the physical environment. We show how our implementation of encube operates in a large-scale, hybrid visualisation and supercomputing environment using the CAVE2 at Monash University, and on a local desktop, making it a versatile solution. We discuss how our approach can help accelerate the discovery rate in a variety of research scenarios.
['Dany Vohl', 'David G. Barnes', 'Christopher J. Fluke', 'Govinda R. Poudel', 'Nellie Georgiou-Karistianis', 'Amr H. Hassan', 'Yuri Benovitski', 'Tsz Ho Wong', 'Owen Kaluza', 'Toàn D. Nguyên', 'C. Paul Bonnington']
Large-scale comparative visualisation of sets of multidimensional data
904,378
In this paper we propose a method to build similarity relations into extended Rough Set Theory. Similarity is estimated using ideas from Granular Computing and Case-Based Reasoning. A new measure is introduced in order to compute the quality of the similarity relation. This work presents a case study of a similarity relation based on a global similarity function between two objects; this function includes the weights for each feature and local functions that calculate how similar the values of a given feature are. This approach was tested on the function approximation problem. Promising results are obtained in several experiments.
['Yaima Filiberto', 'Yailé Caballero', 'Rafael Larrua', 'Rafael Bello']
A method to build similarity relations into extended Rough Set Theory
382,741
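To make the structure described in the abstract above concrete, here is a minimal, hypothetical sketch of a global similarity built as a weighted combination of per-feature local similarity functions; it is not the authors' exact formulation, and the function names, weights, and example features are invented for illustration.

```python
# Hypothetical sketch, in the spirit of the abstract above: a global similarity
# between two objects as a weighted sum of per-feature local similarities.
def global_similarity(x, y, weights, local_sims):
    """x, y: feature tuples; weights: per-feature weights (summing to 1 here);
    local_sims: per-feature functions returning values in [0, 1]."""
    return sum(w * sim(a, b) for w, sim, a, b in zip(weights, local_sims, x, y))

# One numeric and one categorical feature (made-up data and ranges).
numeric_sim = lambda a, b: 1.0 - abs(a - b) / 10.0    # assumes a feature range of width 10
categorical_sim = lambda a, b: 1.0 if a == b else 0.0

x, y = (3.0, "steel"), (5.0, "steel")
print(global_similarity(x, y, (0.6, 0.4), (numeric_sim, categorical_sim)))   # 0.88
```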
Faster Model-Based Optimization Through Resource-Aware Scheduling Strategies
['Jakob Richter', 'Helena Kotthaus', 'Bernd Bischl', 'Peter Marwedel', 'Jörg Rahnenführer', 'Michel Lang']
Faster Model-Based Optimization Through Resource-Aware Scheduling Strategies
951,012
Journal of the Association for Information Science and Technology. Early View (Online Version of Record published before inclusion in an issue).
['Loet Leydesdorff', 'Caroline S. Wagner', 'Lutz Bornmann']
Replicability and the public/private divide
591,458
A general two-stage Kalman filter that is equivalent to, but numerically more efficient than, the standard single-stage Kalman filter is developed for general, time-varying, linear discrete-time systems. Analytical results defining the reduction in computational burdens are presented. Simulation results that validate the predicted efficiency improvements are shown as well.
['Chien-Shu Hsieh', 'Fu-Chuang Chen']
General two-stage Kalman filters
75,408
Even though video technologies have continuously advanced, playing a video still requires the full length of the video's playtime, a high volume of data traffic, and appropriate sound play devices. To overcome these limitations, this paper proposes a method to serve a video in an image carousel format. By providing multiple key frame images in carousel form with the full script of the video, the proposed image carousel system transforms any playable video content into a readable form so that users can read the content rather than watch it. Reading a video frees consumers from sound restrictions, and increases the potential and efficiency of video linkage and usage in other services. In addition, this approach saves both playtime and data traffic. When a video encoded in the H.264 codec with 720p resolution is transformed into the proposed image carousel form, on average 90–95% of the data traffic was saved.
['Jinhong Yang', 'Hyojin Park', 'Kyuyeong Jeon', 'Jeong-ile Jeong', 'Jun Kyun Choi']
Serving a video into an image carousel: system design and implementation
916,719
A Novel Time Reversal-Least Sidelobe Scheme to Minimize ISI and MUI
['Do-Hoon Kim', 'Jungwook Wee', 'Kyu-Sung Hwang']
A Novel Time Reversal-Least Sidelobe Scheme to Minimize ISI and MUI
202,682
The computation of relations from a number of potential matches is a major task in computer vision. Often RANSAC is employed for the robust computation of relations such as the fundamental matrix. For (quasi-)degenerate data however, it often fails to compute the correct relation. The computed relation is always consistent with the data but RANSAC does not verify that it is unique. The paper proposes a framework that estimates the correct relation with the same robustness as RANSAC even for (quasi-)degenerate data. The approach is based on a hierarchical RANSAC over the number of constraints provided by the data. In contrast to all previously presented algorithms for (quasi-)degenerate data our technique does not require problem specific tests or models to deal with degenerate configurations. Accordingly it can be applied for the estimation of any relation on any data and is not limited to a special type of relation as previous approaches. The results are equivalent to the results achieved by state of the art approaches that employ knowledge about degeneracies.
['Jan-Michael Frahm', 'Marc Pollefeys']
RANSAC for (Quasi-)Degenerate data (QDEGSAC)
443,750
A virtual reality modeling language (VRML) based application has been developed as a marketing tool for a commercial park. VRML is a new Web based technology for specifying and delivering three-dimensional interactive visualizations over the Internet through a Web browser. As a part of its definition, VRML includes primitives that specify geometries, sense different conditions in the visualization, and allow custom definition of methods. Geometries and conditions may be linked so that the geometries can be modified or added interactively. The visualization features simple operation, an extensive menu structure, dynamic creation of objects, and an arbitration scheme.
['Lee A. Belfore', 'Rajesh Vennam']
VRML for urban visualization
98,599
Joint object tracking and pose estimation is an important issue in Augmented Reality (AR), interactive systems, and robotic systems. Many studies are based on object detection methods that only focus on the reliability of the features. Other methods combine object detection with frame-by-frame tracking using the temporal redundancy in the video. However, in some mixed methods, the interval between consecutive detection frames is usually too short to take the full advantage of the frame-by-frame tracking, or there is no appropriate switching mechanism between detection and tracking. In this paper, an iterative optimization tracking method is proposed to alleviate the deviations of the tracking points and prolong the interval, and thus speed up the pose estimation process. Moreover, an adaptive detection interval algorithm is developed, which can make the switch between detection and frame-by-frame tracking automatically according to the quality of frames so as to improve the accuracy in a tough tracking environment. Experimental results on the benchmark dataset manifest that the proposed algorithms, as an independent part, can be combined with some inter-frame tracking methods for optimization.
['Shuang Ye', 'Chuancai Liu', 'Zhiwu Li', 'Abdulrahman Al-Ahmari']
Iterative optimization for frame-by-frame object pose tracking
987,227
This paper proposes an efficient parallel algorithm for an important class of dynamic programming problems that includes Viterbi, Needleman--Wunsch, Smith--Waterman, and Longest Common Subsequence. In dynamic programming, the subproblems that do not depend on each other, and thus can be computed in parallel, form stages, or wavefronts. The algorithm presented in this paper provides additional parallelism allowing multiple stages to be computed in parallel despite dependences among them. The correctness and the performance of the algorithm relies on rank convergence properties of matrix multiplication in the tropical semiring, formed with plus as the multiplicative operation and max as the additive operation. This paper demonstrates the efficiency of the parallel algorithm by showing significant speedups on a variety of important dynamic programming problems. In particular, the parallel Viterbi decoder is up to 24× faster (with 64 processors) than a highly optimized commercial baseline.
['Saeed Maleki', 'Madanlal Musuvathi', 'Todd Mytkowicz']
Efficient parallelization using rank convergence in dynamic programming algorithms
894,581
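The (max, +) matrix product that the abstract above builds on is simple to state concretely. Below is a minimal generic sketch of matrix multiplication in the tropical semiring, added purely to illustrate the operation; it is not the paper's parallel algorithm, and the example matrices are invented.

```python
# Generic illustration of matrix "multiplication" in the (max, +) tropical
# semiring: max plays the role of addition and + the role of multiplication.
NEG_INF = float("-inf")   # additive identity of the (max, +) semiring

def tropical_matmul(A, B):
    """Return C with C[i][j] = max_k (A[i][k] + B[k][j])."""
    n, m, p = len(A), len(B), len(B[0])
    assert all(len(row) == m for row in A), "inner dimensions must match"
    return [[max(A[i][k] + B[k][j] for k in range(m)) for j in range(p)]
            for i in range(n)]

# Composing two DP "stages": entry (i, j) is the best score over intermediate k.
A = [[0, 2], [1, NEG_INF]]
B = [[3, 0], [0, 4]]
print(tropical_matmul(A, B))   # [[3, 6], [4, 1]]
```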
We examine three hierarchies of circuit classes and show they are closed under complementation. (1) The class of languages recognized by a family of polynomial size skew circuits with width O(w) is closed under complement. (2) The class of languages recognized by a family of polynomial size circuits with width O(w) and polynomial tree-size is closed under complement. (3) The class of languages recognized by a family of polynomial size, O(log(n)) depth, bounded AND fan-in with OR fan-in f (f ≥ log(n)) circuits is closed under complement. These improve upon the results of (i) Immerman (1988) and Szelepcsenyi (1988), who show that NLOG is closed under complementation, and (ii) Borodin et al. (1989), who show that LOGCFL is closed under complement.
['V. Vinay']
Hierarchies of circuit classes that are closed under complement
417,784
The no slot wasting immediately using the TAR bit (NSW IUT) bandwidth balancing mechanism has been previously proposed for dual bus architectures. NSW IUT is a variation of the no slot wasting bandwidth balancing (NSW BWB) mechanism and it can also introduce bandwidth fairness into a DQDB network without wasting channel slots. In addition, it enables lightly loaded stations to use the TAR=1 bits they observe more effectively and improve their delay performance considerably. In the paper, the authors use simulation to carry out a thorough investigation of the delay performance of the NSW IUT mechanism under single and multiple priority classes of traffic as well as under the presence of erasure nodes. Throughout the study, they also compare NSW IUT with NSW BWB as well as with the bandwidth balancing of the IEEE 802.6 standard.
['Dennis Karvelas', 'Michail Papamichail']
Performance study of the NSW-IUT bandwidth balancing mechanism
531,293
Intrapersonal synchronization of limb movements is a relevant feature for assessing coordination of motoric behavior. In this paper, we show that it can also distinguish between full-body movements performed with different expressive qualities, namely rigidity, fluidity, and impulsivity. For this purpose, we collected a dataset of movements performed by professional dancers, and annotated the perceived movement qualities with the help of a group of experts in expressive movement analysis. We computed intrapersonal synchronization by applying the Event Synchronization algorithm to the time-series of the speed of arms and hands. Results show that movements performed with different qualities display a significantly different amount of intrapersonal synchronization: impulsive movements are the most synchronized, the fluid ones show the lowest values of synchronization, and the rigid ones lie in between.
['Paolo Alborno', 'Stefano Piana', 'Maurizio Mancini', 'Radoslaw Niewiadomski', 'Gualtiero Volpe', 'Antonio Camurri']
Analysis of Intrapersonal Synchronization in Full-Body Movements Displaying Different Expressive Qualities
746,252
Several researchers have reported difficulties in analyzing the behavior of single queues and networks of queues. This is so even in the case of closed product-form networks, for which an exact solution and efficient solution algorithms are known. The difficulty arises because the exact solution could not, by itself, be used for such analysis as proving properties of the network, relating performance measures to one another, and characterizing some interesting behavior. This paper proposes an approach to surmounting such difficulties. The idea is to analyze an approximate solution based on Schweitzer's approximation, and interpret the results as approximate relationships among the exact performance measures. This approach is applied to three problems concerning the interaction among job classes, the mean arrival and variance of queue length, and thrashing. The reliability of the approach is tested by applying it to an optimal routing problem, for which the exact solution is known. The results are illustra...
['Y. C. Tay']
An approach to analyzing the behavior of some queueing networks
210,640
Three-dimensional integration is considered a promising solution to the challenges of performance, power consumption, quality, and reliability. The feature of 2.5D ICs is that the dies are stacked on a passive silicon interposer and the dies communicate with each other by means of TSV-based interconnects and redistribution layers (RDL) within the silicon interposer. This paper aims to investigate efficient post-bond test techniques for 2.5D ICs with a silicon interposer. In order to efficiently reuse the functional interconnects as the parallel TAM (test access mechanism) for testing dies, a novel macro-die-based interconnect reuse strategy and its corresponding design-for-test (DFT) architecture are proposed in this paper. The proposed strategy merges several dies to form a macro die, which is then connected to other dies to form a daisy chain for testing. Experimental results show that the proposed techniques have higher success rates for the required TAM width constraints. Moreover, since we can get wider TAMs, the test length can be reduced significantly.
['Shyue-Kung Lu', 'Huaimin Li', 'Masaki Hashizume', 'Jin-Hua Hong', 'Zheng-Ru Tsai']
Efficient test length reduction techniques for interposer-based 2.5D ICs
194,197
We develop techniques to investigate relativized hierarchical unambiguous computation. We apply our techniques to push forward some known constructs involving relativized unambiguity based complexity classes (UP and Promise-UP) to new constructs involving arbitrary levels of the relativized unambiguous polynomial hierarchy (UPH). Our techniques are developed on constraints imposed by hierarchical assembly of unambiguous nondeterministic polynomial-time Turing machines, and so our techniques differ substantially, in applicability and in nature, from standard techniques (such as the switching lemma [Has87]), which are known to play roles in carrying out similar generalizations. Aside from achieving these generalizations, we resolve a question posed by Cai, Hemachandra, and Vyskoc [CHV93] on an issue related to nonadaptive Turing access to UP and adaptive smart Turing access to Promise-UP.
['Holger Spakowski', 'Rahul Tripathi']
Hierarchical unambiguity
714,067
The Modified Mumford-Shah Model Based on Nonlocal Means Method for Textures Segmentation
['Jingge Lu', 'Guodong Wang', 'Zhenkuan Pan']
The Modified Mumford-Shah Model Based on Nonlocal Means Method for Textures Segmentation
593,112
Monte-Carlo Tree Search for the Maximum Satisfiability Problem
['Jack Goffinet', 'Raghuram Ramanujan']
Monte-Carlo Tree Search for the Maximum Satisfiability Problem
880,112
Selectivity estimation is an integral part of query optimization. In this paper, we propose a novel approach to approximate data density functions of relations and use them to estimate selectivities. A data density function here is approximated by a partial sum of an orthogonal series. Such approximate density functions can be derived easily, stored efficiently, and maintained dynamically. Experimental results show that our approach yields comparable or better estimation accuracy than the Wavelet and DCT methods, especially in the high dimensional spaces.
['Feng Yan', 'Wen-Chi Hou', 'Qiang Zhu']
Selectivity estimation using orthogonal series
53,707
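To illustrate the "partial sum of an orthogonal series" idea in the abstract above, here is a minimal, generic sketch of a one-dimensional orthogonal-series density estimate with a cosine basis on [0, 1] and its use for a range-selectivity estimate. It is only an illustration of the general technique under assumed names and parameters, not the multidimensional method the paper proposes.

```python
import numpy as np

# Generic 1-D orthogonal-series density estimate on [0, 1] with the cosine basis
# phi_0(x) = 1, phi_k(x) = sqrt(2) * cos(k * pi * x). The coefficient of phi_k is
# E[phi_k(X)], estimated here by a sample mean. Illustration only.
def cosine_series_density(samples, num_terms=8):
    samples = np.asarray(samples, dtype=float)
    coeffs = [1.0]   # coefficient of phi_0 for a density on [0, 1]
    for k in range(1, num_terms + 1):
        coeffs.append(np.sqrt(2.0) * np.mean(np.cos(k * np.pi * samples)))
    def density(x):
        x = np.asarray(x, dtype=float)
        f = np.full_like(x, coeffs[0])
        for k in range(1, num_terms + 1):
            f = f + coeffs[k] * np.sqrt(2.0) * np.cos(k * np.pi * x)
        return f
    return density

rng = np.random.default_rng(1)
column = rng.beta(2.0, 5.0, size=5000)     # synthetic attribute values in [0, 1]
f_hat = cosine_series_density(column)
# Estimated selectivity of the range predicate 0.2 <= x <= 0.4 (integral of f_hat).
grid = np.linspace(0.2, 0.4, 201)
print(np.mean(f_hat(grid)) * 0.2)          # close to the true probability (~0.42)
```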
This paper aims to speed up physical simulation of a complex model by making use of several levels of simulation details, which are determined according to the observer position. Multiple dynamically simulated sailing boats in a virtual lake are created as a test environment for the implementation. Both consistency and completeness are satisfied in the proposed method of simulation.
['Gurkan Koldas', 'Veysi Isler']
Multiresolution behavioral modeling in a virtual environment
232,704
Imitation is a powerful learning tool that can be used by a robotic agent to socially learn new skills and tasks. One of the fundamental problems in imitation is the correspondence problem: how to map between the actions, states and effects of the model and imitator agents when the embodiment of the agents is dissimilar. In our approach, the matching depends on different metrics and granularity. Focusing on object manipulation and arrangement demonstrated by a human, this paper presents Jabberwocky, a system that uses different metrics and granularity to produce action command sequences that, when executed by an imitating agent, can achieve corresponding effects (manipulandum absolute/relative position, displacement, rotation and orientation). Based on a single demonstration of an object manipulation task by a human and using a combination of effect metrics, the system is shown to produce correspondence solutions that are then performed by an imitating agent, generalizing with respect to different initial object positions and orientations in the imitator's workspace. Depending on the particular metrics and granularity used, the corresponding effects will differ (shown in examples), making the appropriate choice of metrics and granularity depend on the task and context.
['Aris Alissandrakis', 'Chrystopher L. Nehaniv', 'Kerstin Dautenhahn', 'Joe Saunders']
An Approach for Programming Robots by Demonstration: Generalization Across Different Initial Configurations of Manipulated Objects
499,627
A New Graph Algorithm for the Analysis of Conformational Dynamics of Molecules
['Dominique Barth', 'S. Bougueroua', 'M.-P. Gaigeot', 'Franck Quessette', 'Riccardo Spezia', 'Sandrine Vial']
A New Graph Algorithm for the Analysis of Conformational Dynamics of Molecules
683,129
It has been widely documented that student-staff ratios continue to rise in higher education (H.E.) within the UK. As a consequence, H.E. Institutions are becoming increasingly reliant on e-learning tools and this has increased the difficulties that students experience in making the transition from secondary to tertiary education. In H.E., students generally receive far fewer contact hours, classes are larger, and they are often provided with much less personal guidance on how to progress through the educational process than they were used to at secondary level. E-learning system design is typically more focused on issues related to student and content management and less on individualised feedback and process based support. One of the issues with the design of current e-learning systems is that they are poor at representing individual student participation and students’ progress through a course. This may be one of the contributing factors to diminishing student engagement in H.E., which ultimately results in poor grades and low retention rates. In this paper we address this issue by considering how player representation techniques commonly found in computer games may inspire the design of e-learning systems so as to improve student engagement within an H.E. course.
['Darryl Charles', 'Therese Charles', 'Michael McNeill']
Using Player and World Representation Techniques from Computer Games to Improve Student Engagement
27,224
We investigate maximum likelihood parameter learning in Conditional Random Fields (CRF) and present an empirical study of pseudo-likelihood (PL) based approximations of the parameter likelihood gradient. We show, as opposed to [1][2], that these parameter learning methods can be improved and evaluate the resulting performance employing different inference techniques. We show that the approximation based on penalized pseudo-likelihood (PPL) in combination with the Maximum A Posteriori (MAP) inference yields results comparable to other state of the art approaches, while providing advantages to formulating parameter learning as a convex optimization problem. Eventually, we demonstrate applicability on the task of detecting man-made structures in natural images.
['Filip Korč', 'Wolfgang Förstner']
Approximate Parameter Learning in Conditional Random Fields: An Empirical Investigation
52,840
This paper proposes a neural fuzzy connection admission control (NFCAC) scheme, which combines the benefits of a fuzzy logic controller and the learning abilities of a neural net, to solve the connection admission control (CAC) problems in ATM networks. Fuzzy logic systems have been successfully applied to deal with traffic control related problems and provide a robust mathematical framework for dealing with "real-world" imprecision; multilayer neural networks are capable of producing complex decisions with arbitrarily nonlinear boundaries and they have been used as a solution for CAC. However, the application of a neural network or a fuzzy logic system to CAC presents some difficulties in real system operation. The proposed NFCAC overcomes these difficulties by combining the benefits of the existing traffic control mechanisms: the linguistic control strategy of the fuzzy logic controller and the learning ability of the neural net. Simulation results show that the proposed NFCAC saves a large amount of training time and simplifies the design procedure of a CAC controller, while providing superior system utilization, with the QoS contract kept, compared with either the neural network or the fuzzy logic system.
['Ray-Guang Cheng', 'Chung-Ju Chang']
A neural-net based fuzzy admission controller for an ATM network
140,904
Bi-Objective Combinatorial Optimization problems are ubiquitous in real-world applications and designing approaches to solve them efficiently is an important research area of Artificial Intelligence. In Constraint Programming, the recently introduced bi-objective Pareto constraint allows one to solve bi-objective combinatorial optimization problems exactly. Using this constraint, every non-dominated solution is collected in a single tree-search while pruning subtrees that cannot lead to a non-dominated solution. This paper introduces a simpler and more efficient filtering algorithm for the bi-objective Pareto constraint. The efficiency of this algorithm is experimentally confirmed on classical bi-objective benchmarks.
['Renaud Hartert', 'Pierre Schaus']
A support-based algorithm for the bi-objective pareto constraint
553,089
Prosody transplantation in text-to-speech: applications and tools.
['Bert Van Coile', 'A. de Zitter', 'Luc Van Tichelen', 'Annemie Vorstermans']
Prosody transplantation in text-to-speech: applications and tools.
775,951
We look at the scenario of having to route a continuous rate of traffic from a source node to a sink node in a network, where the objective is to maximize throughput. This is of interest, e.g., for providers of streaming content in communication networks. The overall path latency, which was relevant in other non-cooperative network routing games such as the classic Wardrop model, is of lesser concern here. To that end, we define bottleneck games with splittable traffic where the throughput on a path is inversely proportional to the maximum latency of an edge on that very path: the bottleneck latency. Therefore, we define a Wardrop equilibrium as a traffic distribution where this bottleneck latency is at minimum on all used paths. As a measure for the overall system well-being, called social cost, we take the weighted sum of the bottleneck latencies of all paths. Our main findings are as follows: First, we prove social cost of Wardrop equilibria on series parallel graphs to be unique. Even more, for any graph whose subgraph induced by all simple start-destination paths is not series parallel, there exist games having equilibria with different social cost. For the price of stability, we give an independence result with regard to the network topology. Finally, our main result is giving a new exact price of stability for Wardrop/bottleneck games on parallel links with M/M/1 latency functions. This result is at the same time the exact price of stability for bottleneck games on general graphs.
['Vladimir V. Mazalov', 'Burkhard Monien', 'Florian Schoppmann', 'Karsten Tiemann']
Wardrop equilibria and price of stability for bottleneck games with splittable traffic
847,955
This paper presents a 99 g finger joint that can exert a very strong fingertip force of more than 100 N. We have shown that a simple five-bar linkage can be used as a load-sensitive continuously variable transmission (CVT) for a finger joint. The maximum fingertip force of our previous finger was limited by the mechanical strength of its links and bearings, not by the power of its DC motor. If the machine strength can be improved, a much greater fingertip force can be expected. To design a lightweight CVT with sufficient mechanical strength, we analyze the internal force of the CVT and select light plain bearings. We also analyze the stress by using the finite element method and select the materials of its links. Experimental results verify that the maximum fingertip force is more than 100 N near the singular configuration of the CVT and the maximum angular velocity is more than 550 deg/s. These motions are impossible without the CVT. We also developed a very light shape memory alloy brake of 0.56 g. The CVT with the brake can hold a fingertip force of more than 100 N not only near the singular configuration. The electric energy consumed by the brake is much less than that consumed by the DC motor.
['Takeshi Takaki', 'Toru Omata']
100g-100N finger joint with load-sensitive continuously variable transmission
365,971
Private Database Access With HE-over-ORAM Architecture.
['Craig Gentry', 'Shai Halevi', 'Charanjit S. Jutla', 'Mariana Raykova']
Private Database Access With HE-over-ORAM Architecture.
796,481
Mobile phones are increasingly used for security sensitive activities such as online banking or mobile payments. This usually involves some cryptographic operations, and therefore introduces the problem of securely storing the corresponding keys on the phone. In this paper we evaluate the security provided by various options for secure storage of key material on Android, using either Android's service for key storage or the key storage solution in the Bouncy Castle library. The security provided by the key storage service of the Android OS depends on the actual phone, as it may or may not make use of ARM TrustZone features. Therefore we investigate this for different models of phones. We find that the hardware-backed version of the Android OS service does offer device binding -- i.e. keys cannot be exported from the device -- though they could be used by any attacker with root access. This last limitation is not surprising, as it is a fundamental limitation of any secure storage service offered from the TrustZone's secure world to the insecure world. Still, some of Android's documentation is a bit misleading here. Somewhat to our surprise, we find that in some respects the software-only solution of Bouncy Castle is stronger than the Android OS service using TrustZone's capabilities, in that it can incorporate a user-supplied password to secure access to keys and thus guarantee user consent.
['Tim Cooijmans', 'Joeri de Ruiter', 'Erik Poll']
Analysis of Secure Key Storage Solutions on Android
124,157
In the literature, most stereo matching methods have been limited to gray level images; only a few authors have dealt with color images directly. In this paper, we propose a novel area-based color stereo matching method based on competitive-cooperative neural networks. Seven kinds of color spaces are tested in order to evaluate their suitability for stereo matching. The experimental results show that the matching precision is increased effectively when using adaptive color features instead of gray values. According to the experimental results, the Ohta, Opponent and YCbCr color spaces can offer good color features for stereo matching.
['Xijun Hua', 'Masahiro Yokomichi', 'Michio Kono']
Stereo Correspondence Using Color Based on Competitive-cooperative Neural Networks
423,219
A robust hyperspectral unmixing algorithm that finds multiple sets of end members is introduced. The algorithm, called Robust Context Dependent Spectral Unmixing (RCDSU), combines the advantages of context dependent unmixing and robust clustering. RCDSU adapts the unmixing to different regions, or contexts, of the spectral space. It combines fuzzy and possibilistic clustering and linear unmixing to learn multiple contexts and the optimal end members and abundances for each context. RCDSU uses fuzzy membership functions to partition the data, and possibilistic membership functions to identify noise and outliers. An extension of RCDSU to deal with the case of an unknown number of contexts is also proposed. The performance of the proposed work is evaluated using simulated and real hyperspectral data. The experiments show that the proposed methods can handle noisy data and identify an "optimal" number of contexts and appropriate end members within each context.
['Hamdi Jenzri', 'Hichem Frigui', 'Paul D. Gader']
Robust Context Dependent Spectral Unmixing
466,236
Convolutive blind source separation for more than two sources in the frequency domain
['Hiroshi Sawada', 'Ryo Mukai', 'Shoko Araki', 'Shoji Makino']
Convolutive blind source separation for more than two sources in the frequency domain
36,629
Reachability of Standard and Fractional Continuous-Time Systems with Piecewise Constant Inputs
['Krzysztof Rogowski']
Reachability of Standard and Fractional Continuous-Time Systems with Piecewise Constant Inputs
845,334
In the development of medical skill training systems, the efficiency of the system and the provision of quantitative feedback information to the trainee are very important. Furthermore, usage of the simulated operation platform should be as realistic as possible. In order to satisfy these requirements, we developed a robot to be used for airway management training: Waseda KyotoKagaku Airway No.1 Refined RII (WKA-1RII) (Fig. 1). In addition to realistically shaped hardware and various sensory equipment inside the robot, the new training system also uses a binocular vision system, an inertial measurement unit and a 3D simulation software component to track the movements of the trainee. For fully estimating the skills of the trainee and providing richer feedback, it is not enough to only use information about the tool's movements from the inside of the robot. Therefore, we propose a system to fuse the sensory information from inside the robot with data from 3D vision and from the inertial measurement unit. This accurate information about the movements of the trainee is used to model the progress of the training with a 3D computer graphics simulator. The trainee can use this visualization during or after the training procedure to verify his training status. Using this system he can easily compare his performance with a guideline performance provided by an experienced surgeon. In this paper we show a first conceptual application of this approach. The experimental results suggest that the approach is worth following in further research.
['Chunbao Wang', 'Yohan Noh', 'Hiroyuki Ishii', 'Go Kikuta', 'Kazuki Ebihara', 'Mitsuhiro Tokumoto', 'Isamu Okuyama', 'Matsuoka Yusuke', 'Chihara Terunaga', 'Atsuo Takanishi', 'Kazuyuki Hatake']
Development of a 3D simulation which can provide better understanding of trainee's performance of the task using airway management training system WKA-1RII
223,129
From a set of images in a particular domain, labeled with part locations and class, we present a method to automatically learn a large and diverse set of highly discriminative intermediate features that we call Part-based One-vs.-One Features (POOFs). Each of these features specializes in discrimination between two particular classes based on the appearance at a particular part. We demonstrate the particular usefulness of these features for fine-grained visual categorization with new state-of-the-art results on bird species identification using the Caltech UCSD Birds (CUB) dataset and parity with the best existing results in face verification on the Labeled Faces in the Wild (LFW) dataset. Finally, we demonstrate the particular advantage of POOFs when training data is scarce.
['Thomas Berg', 'Peter N. Belhumeur']
POOF: Part-Based One-vs.-One Features for Fine-Grained Categorization, Face Verification, and Attribute Estimation
7,233
Background: microRNAs (miRNAs) are a class of small non-coding RNAs which have been recognized as ubiquitous post-transcriptional regulators. The analysis of interactions between different miRNAs and their target genes is necessary for the understanding of miRNAs' role in the control of cell life and death. In this paper we propose a novel data mining algorithm, called HOCCLUS2, specifically designed to bicluster miRNAs and target messenger RNAs (mRNAs) on the basis of their experimentally-verified and/or predicted interactions. Indeed, existing biclustering approaches, typically used to analyze gene expression data, fail when applied to miRNA:mRNA interactions since they usually do not extract possibly overlapping biclusters (miRNAs and their target genes may have multiple roles), extract a huge amount of biclusters (difficult to browse and rank on the basis of their importance) and work on similarities of feature values (do not limit the analysis to reliable interactions).
['Gianvito Pio', 'Michelangelo Ceci', "Domenica D'Elia", 'Corrado Loglisci', 'Donato Malerba']
A Novel Biclustering Algorithm for the Discovery of Meaningful Biological Correlations between microRNAs and their Target Genes
29,378
Craniofacial Image Analysis
['Ezgi Mercan', 'Indriyati Atmosukarto', 'Jia Wu', 'Shu Liang', 'Linda G. Shapiro']
Craniofacial Image Analysis
811,668
This paper describes the design of a leading-one prediction (LOP) logic for floating-point addition with an exact determination of the shift amount for normalization of the adder result. Leading-one prediction is a technique to calculate the number of leading zeros of the result in parallel with the addition. However, the prediction might be in error by one bit and previous schemes to correct this error result in a delay increase. The design presented here incorporates a concurrent position correction logic, operating in parallel with the LOP, to detect the presence of that error and produce the correct shift amount. We describe the error detection as part of the overall LOP, perform estimates of its delay and complexity, and compare with previous schemes.
['Javier D. Bruguera', 'Tomás Lang']
Leading-one prediction with concurrent position correction
534,688
The paper is focused on systems with delay terms at the left (and the right) side of differential equations. Analysis and synthesis of delay systems can be conveniently studied through a special ring of RQ-meromorphic functions. The control methodology is based on a solution of Diophantine equations in this ring. Final controllers result in a Smith-predictor-like structure. Controller parameters are tuned through a pole-placement problem as a desired multiple root of the characteristic closed-loop equation. The methodology is illustrated by a stable second order transfer function with a dead-time term. The paper then presents an autotuning method as a combination of biased-relay feedback estimation and the proposed algebraic control design. The developed approach is illustrated by examples in the Matlab and Simulink environment.
['Roman Prokop', 'Jiri Korbel', 'Libor Pekar']
Delay systems with meromorphic functions design
857,611
Konzept und Realisierung eines optischen Mikrosystems zur Wiederherstellung der Akkommodation
['Ulrich Gengenbach', 'Christoph Beck', 'Helmut Guth', 'Liane Koker', 'Markus Krug', 'Thomas Martin', 'Jörg Nagel', 'Ingo Sieber', 'Peter F. Stiller', 'Oliver Stachs', 'Rudolf Guthoff']
Konzept und Realisierung eines optischen Mikrosystems zur Wiederherstellung der Akkommodation
902,188
Label-increasing trees are fully labeled rooted trees with the restriction that the labels are in increasing order on every path from the root; the best known example is the binary case—no tree with more than two branches at the root, or internal vertices of degree greater than three—extensively examined by Foata and Schutzenberger in A Survey of Combinatorial Theory. The forests without branching restrictions are enumerated by number of trees by $F_n(x) = x(x+1)\cdots(x+n-1)$, $n \ge 1$ ($F_0(x) = 1$), whose equivalent form, $F_n(x) = Y_n(xT_1, \ldots, xT_n)$, $F_n(1) = T_{n+1} = n!$, is readily adapted to branching restriction.
['John Riordan']
Forests of label-increasing trees
39,580
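As a quick worked check of the counting formula quoted in the abstract above (an illustration added here, not part of the original record), take $n = 3$ in the product formula:

$$F_3(x) = x(x+1)(x+2), \qquad F_3(1) = 1 \cdot 2 \cdot 3 = 3! = 6,$$

consistent with $F_n(1) = T_{n+1} = n!$: there are six unrestricted label-increasing forests on three labels, matching the six label-increasing trees on four vertices obtained by attaching a new smallest-labeled root.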
The IETF is currently working on service differentiation in the Internet. However, in wireless environments such as ad hoc networks, where channel conditions are variable and bandwidth is scarce, the Internet differentiated services are suboptimal without lower layers' support. The IEEE 802.11 standard for Wireless LANs is the most widely used WLAN standard today. It has a mode of operation that can be used to provide service differentiation, but it has been shown to perform insufficiently. In this paper, we present a service differentiation scheme to support QoS in wireless IEEE 802.11 networks, which is based on a multiple queuing system to prioritize users' flows. We simulate and analyze the performance of our algorithm and compare its performance with the original IEEE 802.11b protocol. Simulation results show that our approach outperforms the standard 802.11b in terms of throughput and packet loss.
['Mohamed Brahma', 'Kwang-soo Kim', 'Abdelhafid Abouaissa', 'Pascal Lorenz']
A New Approach for Traffic Engineering in Mobile Ad-hoc Networks
325,681
ECG and pulse waveforms are important physiological parameters that can be used to noninvasively analyze physical condition and monitor cardiovascular diseases. In recent years, they were studied for evaluating the effects of exercise on the human body. However, a systematic analysis of the changes of ECG and pulse waveforms before and after exercise has not been found. This study examines the change of ECG and pulse waveforms after a 12-minute running exercise over nine weeks with 10 healthy subjects. In the experiment, RQ, SQ and TQ represent the amplitude differences between the R-, S- and T-waves and the Q-wave of the ECG, respectively. Similarly, H1 and H2 represent the amplitudes of the highest and the smaller peak relative to the baseline of the pulse waveform, respectively. It was found that 90% of subjects' resting heart rate (HR) decreased significantly (p<0.001) during the experiment period. Moreover, the other parameters (RQ/SQ/HR, TQ/RQ/HR, baseline-signal ratio and H1/H2) all show significant differences (p<0.001) over the nine weeks. 80% of RQ/SQ/HR, 60% of TQ/RQ/HR, 100% of baseline-signal energy ratio and 80% of H1/H2 at rest decreased during the nine-week experiment, while 70% of RQ/SQ/HR declined and 100% of TQ/RQ/HR and 90% of H1/H2 increased immediately after exercise compared with the rest condition. The results showed that these parameters can properly denote the change of ECG and pulse waveforms in the exercise experiment. Furthermore, according to the variation trend of these parameters, the 12-minute running significantly improved the subjects' cardiopulmonary function.
['Lisheng Xu', 'Yue Zhong', 'Sainan Yin', 'Yuemin Zhang', 'Yanhua Shen', 'Deguo Hao', 'Yiming Hu', 'Ruifeng Zhang']
ECG and pulse variability analysis for exercise evaluation
204,720
Industrial Control Systems (ICS) are used for operating and monitoring industrial processes. Recent reports state that current ICS infrastructures are not sufficiently protected against cyber threats. Unfortunately, due to the specific nature of these systems, the application of common security counter-measures is often not effective. This paper summarizes experiences over a series of research efforts for building tools and mechanisms to improve the security and awareness in ICS. In particular, we discuss challenges and opportunities identified during an extensive analysis of ICS data resources. We believe that such insights are valuable for further research in the ICS context.
['Dina Hadziosmanovic', 'Damiano Bolzoni', 'Sandro Etalle', 'Pieter H. Hartel']
Challenges and opportunities in securing industrial control systems
434,904
To adapt database technology to new environments like cloud platforms or multi-core hardware, or to try anew to provide an extensible database platform, it is useful to separate transaction services from data management elements that need close physical proximity to data. With "generic" transactional services of concurrency control and recovery in a separate transactional component (TC), indexing, cache and disk management, now in a data component (DC), can be simplified and tailored more easily to the platform or to a data type extension with a special purpose index. This decomposition requires that details of the DC's management of data be hidden from the TC. Thus, locking and logging need to be "logical", which poses a number of problems. One problem is the handling of locking for ranges of keys. Locks need to be taken at the TC prior to the records and their keys being known to the TC. We describe two generic approaches for dealing with this. (1) Make a "speculative" visit to the DC to learn key values. (2) Lock a "covering resource" first, then learn and lock key values and ultimately release the covering resource lock. The "table" is the only logical (and hence known to the TC) covering resource in the traditional locking hierarchy, but using it limits concurrency. Concurrency is improved with the introduction of new partition resources. We show how partitions as covering resources combine high concurrency with low locking overhead. Using partitions is sufficiently effective to consider adapting it for a traditional database kernel.
['David Lomet', 'Mohamed F. Mokbel']
Locking key ranges with unbundled transaction services
311,614
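To make the second approach in the abstract above concrete, here is a minimal Python sketch of locking a covering partition, learning the keys from the DC, locking them, and then dropping the covering lock; LockManager, dc_lookup_range and partition_of are hypothetical stand-ins, not part of the paper's system.

```python
# Sketch of approach (2): take a covering partition lock at the TC, learn the
# actual key values from the DC, lock them individually, then release the
# covering lock.  All names here are hypothetical stand-ins.
import threading
from collections import defaultdict

class LockManager:
    def __init__(self):
        self._locks = defaultdict(threading.Lock)

    def acquire(self, resource):
        self._locks[resource].acquire()

    def release(self, resource):
        self._locks[resource].release()

def range_lock_via_partition(lm, table, low, high, dc_lookup_range, partition_of):
    part = partition_of(table, low, high)          # covering resource for the key range
    lm.acquire(part)                               # 1. lock the covering partition
    try:
        keys = dc_lookup_range(table, low, high)   # 2. learn the keys from the DC
        for k in keys:
            lm.acquire((table, k))                 # 3. lock the individual keys
        return keys                                # caller holds key locks until commit
    finally:
        lm.release(part)                           # 4. release the covering lock early
```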
The linear quadtree is a spatial access method that is built by decomposing the spatial objects in a database into quadtree blocks and storing these quadtree blocks in a B-tree. The linear quadtree is very useful for geographic information systems because it provides good query performance while using existing B-tree implementations. An algorithm and a cost model are presented for processing window queries in linear quadtrees. The algorithm can handle query windows of any shape in the general case of spatial databases with overlapping objects. The algorithm recursively decomposes the space into quadtree blocks, and uses the quadtree blocks overlapping the query window to search the B-tree. The cost model estimates the I/O cost of processing window queries using the algorithm. The cost model is also based on a recursive decomposition of the space, and it uses very simple parameters that can easily be maintained in the database catalog. Experiments with real and synthetic data sets verify the accuracy of the cost model.
['Ashraf Aboulnaga', 'Walid G. Aref']
Window Query Processing in Linear Quadtrees
73,283
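As an illustration of the recursive decomposition step described in the quadtree abstract above, here is a minimal Python sketch that breaks a rectangular query window into maximal quadtree blocks; translating the blocks to linear-quadtree codes and probing the B-tree is omitted, and the function names are ours.

```python
# Sketch: decompose a rectangular query window of a 2^n x 2^n space into
# maximal quadtree blocks.  The blocks would then be mapped to linear-quadtree
# codes and used to search the B-tree (not shown here).

def intersects(bx, by, size, x1, y1, x2, y2):
    return not (bx + size <= x1 or x2 <= bx or by + size <= y1 or y2 <= by)

def contained(bx, by, size, x1, y1, x2, y2):
    return x1 <= bx and bx + size <= x2 and y1 <= by and by + size <= y2

def decompose(bx, by, size, window, out):
    """Collect maximal quadtree blocks (x, y, size) overlapping window=(x1,y1,x2,y2)."""
    x1, y1, x2, y2 = window
    if not intersects(bx, by, size, x1, y1, x2, y2):
        return
    if size == 1 or contained(bx, by, size, x1, y1, x2, y2):
        out.append((bx, by, size))
        return
    half = size // 2
    for dx in (0, half):
        for dy in (0, half):
            decompose(bx + dx, by + dy, half, window, out)

blocks = []
decompose(0, 0, 16, (3, 2, 11, 7), blocks)   # window with exclusive upper bounds
print(blocks)
```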
We address temporal action localization in untrimmed long videos. This is important because videos in real applications are usually unconstrained and contain multiple action instances plus video content of background scenes or other activities. To address this challenging issue, we exploit the effectiveness of deep networks in temporal action localization via three segment-based 3D ConvNets: (1) a proposal network identifies candidate segments in a long video that may contain actions, (2) a classification network learns a one-vs-all action classification model to serve as initialization for the localization network, and (3) a localization network fine-tunes the learned classification network to localize each action instance. We propose a novel loss function for the localization network to explicitly consider temporal overlap and achieve high temporal localization accuracy. In the end, only the proposal network and the localization network are used during prediction. On two large-scale benchmarks, our approach achieves significantly superior performance compared with other state-of-the-art systems: mAP increases from 1.7% to 7.4% on MEXaction2 and increases from 15.0% to 19.0% on THUMOS 2014.
['Zheng Shou', 'Dongang Wang', 'Shih-Fu Chang']
Temporal Action Localization in Untrimmed Videos via Multi-stage CNNs
732,626
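The localization loss in the abstract above is driven by temporal overlap between a predicted segment and the ground truth; the exact loss is defined in the paper, but the temporal IoU measure it builds on can be sketched as follows.

```python
# Temporal IoU between a predicted segment and a ground-truth segment; only
# the overlap measure is shown, not the paper's full loss function.

def temporal_iou(pred, gt):
    """pred, gt: (start, end) in seconds.  Returns intersection-over-union in time."""
    inter = max(0.0, min(pred[1], gt[1]) - max(pred[0], gt[0]))
    union = (pred[1] - pred[0]) + (gt[1] - gt[0]) - inter
    return inter / union if union > 0 else 0.0

print(temporal_iou((12.0, 20.0), (15.0, 25.0)))   # 0.384...
```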
Une approche langage pour la gestion de données dans les systèmes de méta-modélisation.
['Stéphane Jean', 'Yamine Aït Ameur', 'Guy Pierra']
Une approche langage pour la gestion de données dans les systèmes de méta-modélisation.
667,058
In this paper we study a new technique we call post-bagging, which consists in resampling parts of a classification model rather than the data. We do this with a particular kind of model: large sets of classification association rules, and in combination with ordinary best rule and weighted voting approaches. We empirically evaluate the effects of the technique in terms of classification accuracy. We also discuss the predictive power of different metrics used for association rule mining, such as confidence, lift, conviction and χ2. We conclude that, for the described experimental conditions, post-bagging improves classification results and that the best metric is conviction.
['Alípio Mário Jorge', 'Paulo J. Azevedo']
An experiment with association rules and classification: post-bagging and conviction
529,057
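The rule metrics compared in the abstract above can all be computed from simple transaction counts; a minimal Python sketch follows, using the standard definition conviction(A→B) = (1 − supp(B)) / (1 − conf(A→B)).

```python
# Confidence, lift and conviction from counts: n = total transactions,
# n_a = transactions containing antecedent A, n_b = containing consequent B,
# n_ab = containing both.

def rule_metrics(n, n_a, n_b, n_ab):
    supp_b = n_b / n
    conf = n_ab / n_a
    lift = conf / supp_b
    conviction = float("inf") if conf == 1.0 else (1 - supp_b) / (1 - conf)
    return {"confidence": conf, "lift": lift, "conviction": conviction}

print(rule_metrics(n=1000, n_a=120, n_b=400, n_ab=90))
# {'confidence': 0.75, 'lift': 1.875, 'conviction': 2.4}
```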
Two medieval manuscripts are recorded, investigated and analyzed by philologists in collaboration with computer scientists. Due to mold, air humidity and water, the parchment is partially damaged and consequently hard to read. In order to enhance the readability of the text, the manuscript pages are imaged in different spectral bands ranging from 360 to 1000 nm. A registration process is necessary for further image processing methods which combine the information gained by the different spectral bands. Therefore, the images are coarsely aligned using rotationally invariant features and an affine transformation. Afterwards, the similarity of the different images is computed by means of the normalized cross correlation. Finally, the images are accurately mapped to each other by the local weighted mean transformation. The registration algorithms and the results of enhancing the texts using Multivariate Spatial Correlation are presented in this paper.
['Martin Lettner', 'Markus Diem', 'Robert Sablatnig', 'Heinz Miklas']
Registration and enhancing of multispectral manuscript images
20,872
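The similarity measure used in the registration pipeline above is the normalized cross correlation; a minimal numpy sketch of that measure is shown below, while the affine and local weighted mean transforms themselves are omitted.

```python
# Normalized cross correlation between two equally sized image patches or
# bands; only the similarity measure is shown, not the geometric transforms.
import numpy as np

def ncc(a, b):
    a = a.astype(float) - a.mean()
    b = b.astype(float) - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0
```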
Detection of critical errors of locomotion mode recognition for volitional control of powered transfemoral prostheses
['Fan Zhang', 'Ming Liu', 'He Huang']
Detection of critical errors of locomotion mode recognition for volitional control of powered transfemoral prostheses
687,301
Biological data is becoming so complex that it is difficult for scientists and other professionals to interpret and understand it. New tools are needed to better support the manipulation and understanding of data in order to improve analyses and the formation of new hypotheses. Tangible mtDNA is an active tangible and tabletop system that allows multiple users with diverse expertise to collaborate in exploring and understanding mitochondrial DNA sequencing data in breast cancer patients. In an evaluation of the system, 5 expert biologists found it to be effective for data exploration and useful in supporting understanding, collaboration and discussion of DNA datasets.
['Roozbeh Manshaei', 'Nauman Baig', 'Sean DeLong', 'Shahin Khayyer', 'Brien East', 'Ali Mazalek']
Exploring Genetic Mutations on Mitochondrial DNA Cancer Data with Interactive Tabletop and Active Tangibles
936,815
This paper presents a representation for melodic segment classes and applies it to music data mining. Melody is modeled as a sequence of segments, each segment being a sequence of notes. These segments are assigned to classes through a knowledge representation scheme which allows the flexible construction of abstract views of the music surface. The representation is applied to sequential pattern discovery and to the statistical modeling of musical style.
['Darrell Conklin']
Melodic analysis with segment classes
507,676
The adaptive Volterra filter has been successfully applied in nonlinear acoustic echo cancellation (AEC) systems and nonlinear line echo cancellation systems, but its applications are limited by its high computational complexity and slow convergence rate, especially for systems with long memory length. In this paper, we first apply a more general nonlinear filter, the function expansion nonlinear filter, in acoustic echo cancellation; the Volterra filter can be regarded as a special case of the function expansion nonlinear filter. Then, by leveraging a multi-channel configuration of the function expansion nonlinear filter and the sampling theory for nonlinear systems, we extend linear sub-band delay-less adaptive filter techniques to develop an efficient sub-band implementation of the adaptive function expansion nonlinear filter. The developed sub-band configuration of the adaptive nonlinear filter can greatly improve the convergence rate and reduce the computational complexity of nonlinear echo cancellers, which is shown by analyses and simulations.
['Dayong Zhou', 'Yunhua Wang', 'Victor E. DeBrunner', 'Linda S. DeBrunner']
Sub-band Implementation of Adaptive Nonlinear Filter for Adaptive Nonlinear Echo Cancellation
343,152
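As background for the nonlinear echo cancellers discussed above, here is a minimal Python sketch of a fullband second-order Volterra filter adapted with NLMS; the paper's sub-band, delay-less structure and its general function-expansion filter are not reproduced, and the parameter values are illustrative.

```python
# A fullband second-order Volterra filter adapted with NLMS, as a simple
# stand-in for a nonlinear echo canceller (the sub-band structure is omitted).
import numpy as np

def volterra2_nlms(x, d, memory=8, mu=0.5, eps=1e-6):
    """x: far-end signal, d: microphone signal; returns the echo-cancelled error signal."""
    x = np.asarray(x, dtype=float)
    d = np.asarray(d, dtype=float)
    n = len(x)
    e = np.zeros(n)
    n_quad = memory * (memory + 1) // 2            # upper-triangular quadratic kernel size
    w = np.zeros(memory + n_quad)
    iu = np.triu_indices(memory)
    for k in range(memory, n):
        tap = x[k - memory:k][::-1]                # most recent sample first
        quad = np.outer(tap, tap)[iu]              # second-order input products
        phi = np.concatenate([tap, quad])          # function-expansion input vector
        y = w @ phi
        e[k] = d[k] - y
        w += mu * e[k] * phi / (phi @ phi + eps)   # NLMS update
    return e
```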
From May 05 to May 08, 2009, the Dagstuhl Seminar 09192 "From Quality of Service to Quality of Experience" was held in Schloss Dagstuhl - Leibniz Center for Informatics. The notion of Quality of Service has served as a central research topic in communication networks for more than a decade, however, usually starting from a rather technical view on service quality. Therefore, recently the notion of Quality of Experience has emerged, redirecting the focus towards the end user and trying to quantify her subjective experience gained from using a service. The goal of this Dagstuhl seminar is to discuss this important paradigm shift in an interdisciplinary international community of key researchers, to investigate innovative research methodologies and to deepen the scientific understanding of this topic, which is highly relevant for the economic success of future mobile and fixed communication services.
['Markus Fiedler', 'Kalevi Kilkki', 'Peter Reichl']
09192 Executive Summary -- From Quality of Service to Quality of Experience
743,750
Reducing Artifacts in Surface Meshes Extracted from Binary Volumes.
['Ragnar Bade', 'Olaf Konrad', 'Bernhard Preim']
Reducing Artifacts in Surface Meshes Extracted from Binary Volumes.
798,600
Recent years have witnessed the success of LineCast, a line-based distributed coding and transmission system for broadcasting satellite images. LineCast achieves high compression efficiency and provides smooth multicast capability with soft broadcast and coset coding, in which the precision of the side information is crucial. However, the side information it generates is coarse, since it is obtained by averaging several candidates found through template matching. In this paper, we propose a low-rank approximation based LineCast scheme for video broadcasting (LRALineCast) that generates more precise side information by exploiting cross-frame correlation. The side information generation problem is thus reformulated as a low-rank matrix completion problem, where an approximately low-rank matrix is assembled from the similar lines found by template matching. LRALineCast achieves good broadcast performance and avoids the annoying cliff effect by utilizing soft broadcast transmission. The experimental results show that LRALineCast outperforms the state-of-the-art 2D broadcasting scheme and LineCast with up to 2 dB coding gain.
['Wenbin Yin', 'Xiaopeng Fan', 'Yunhui Shi', 'Debin Zhao']
Low-rank approximation based LineCast for video broadcasting
989,758
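The low-rank modeling step in the abstract above can be illustrated with a basic truncated-SVD approximation of a matrix whose rows are similar lines; this is only a sketch of the underlying idea, not the paper's matrix-completion formulation.

```python
# Rank-r approximation via truncated SVD of a matrix whose rows are similar
# lines gathered by template matching (synthetic data; illustrative only).
import numpy as np

def low_rank_approx(M, r):
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r, :]

lines = np.random.rand(20, 256)        # 20 similar lines of 256 pixels each
side_info = low_rank_approx(lines, r=3)
```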
In current Critical Information Infrastructures, cyber-attackers can remotely leverage complex interdependencies with the underlying communication systems to cause large damages. Assessing the vulnerabilities of such systems and identifying potential countermeasures are crucial functionalities for their protection. This paper presents the INSPIRE Ontology Handler, a standard-based tool that enables automatic audits of the security and criticality level of information systems. It is composed of an infrastructure discovery component aiming at automatically discovering assets, an ontology repository to store and manipulate instantiated ontologies, and a visualization component to graphically view and modify information stored within ontologies.
['Mathieu Bouet', 'Maurice Israel']
INSPIRE Ontology Handler: Automatically building and managing a knowledge base for Critical Information Infrastructure protection
526,786
Solving polynomial systems subsystem-by-subsystem means to solve a system of polynomial equations by first solving subsets of the system and then intersecting the results. The approach leads to numerical representations of all the solution components of a system. The focus of this paper is the development of a parallel implementation to solve large systems involving a recursive divide-and-conquer scheme. Because we concentrate our discussion on the distribution of the path tracking jobs, we have selected applications for which we have optimal homotopies, for which all paths converge to regular solutions.
['Yun Guan', 'Jan Verschelde']
Parallel Implementation of a Subsystem-by-Subsystem Solver
304,530
The problem of complex spectral estimation is of great interest in many applications. This paper studies the general class of the forward-backward matched-filterbank (MAFI) spectral estimators, including the widely used Capon method as well as the more recently introduced amplitude and phase estimation of a sinusoid (APES) method. In particular, we show by means of a higher order expansion technique that the one-dimensional (1-D) Capon estimator underestimates the true spectrum, whereas the 1-D APES method is unbiased; we also show that the bias of the forward-backward Capon is half that of the forward-only Capon (to within a second-order approximation). Furthermore, we show that these results can be extended to the two-dimensional (2-D) Capon and APES estimators. Numerical examples are also presented to demonstrate quantitatively the properties of and the relation between these MAFI estimators.
['Hongbin Li', 'Jian Li', 'Petre Stoica']
Performance analysis of forward-backward matched-filterbank spectral estimators
234,918
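For reference, the standard 1-D Capon spectral estimate and the forward-backward sample covariance it is typically built on are written out below; this is only background notation, the paper's higher-order bias analysis is not reproduced.

```latex
% Standard 1-D Capon estimate and the forward-backward sample covariance it
% is typically built on (background only; not the paper's bias analysis).
\[
  \hat{\phi}_{\mathrm{Capon}}(\omega)
    = \frac{1}{\mathbf{a}^{H}(\omega)\,\hat{\mathbf{R}}^{-1}\,\mathbf{a}(\omega)},
  \qquad
  \hat{\mathbf{R}}_{\mathrm{fb}}
    = \tfrac{1}{2}\bigl(\hat{\mathbf{R}} + \mathbf{J}\hat{\mathbf{R}}^{*}\mathbf{J}\bigr),
\]
where $\mathbf{a}(\omega)$ is the steering (Fourier) vector, $\hat{\mathbf{R}}$ the forward sample covariance and $\mathbf{J}$ the exchange matrix.
```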
Dependency Graphs in Natural Language Processing
['Cristina Barbero', 'Vincenzo Lombardo']
Dependency Graphs in Natural Language Processing
413,019
The use of cyber-physical systems (CPS) for different tasks and in multiple environments where different laws and standards need to be adhered to often leads to development of multiple product variants. In the domain of embedded systems, product line engineering (PLE) is an established approach to manage variability that can be used to efficiently develop and maintain those variants. At the same time CPS are known for their iterative evolution process. While software configuration management (SCM) offers solutions to manage software evolution, it has to be adapted for the use in combination with PLE. One important part of SCM that needs to be adapted to PLE is version control. This paper presents a branching model that covers use cases that version control has to support in a PLE context.
['Robert Hellebrand', 'Michael Schulze', 'Martin Becker']
A branching model for variability-affected cyber-physical systems
835,653
Heuristic algorithms enable researchers to solve more complex combinatorial problems in a reasonable time. Markowitz's mean-variance portfolio selection model is one such problem: it is a nonlinear (quadratic) programming problem that has been solved by a variety of heuristic and non-heuristic techniques. In this paper a portfolio selection model based on Markowitz's problem is considered together with three of its most important practical limitations, which makes the model more realistic. Minimum transaction lots and cardinality constraints (both of which have been studied before) are included in the extended model, along with market (sector) capitalization, which is proposed here for the first time as a constraint for the Markowitz model. No previous study has proposed and solved this extended model. To solve the resulting NP-hard mixed-integer nonlinear program, a corresponding genetic algorithm (GA) is developed. The computational study has two main parts: first, verifying and validating the proposed GA, and second, studying the applicability of the presented model on large-scale problems.
['Hamed Soleimani', 'Hamid Reza Golmakani', 'Mohammad Hossein Salimi']
Markowitz-based portfolio selection with minimum transaction lots, cardinality constraints and regarding sector capitalization using genetic algorithm
219,001
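A compact mixed-integer formulation of the kind of extended mean-variance model described above is sketched below; the exact constraint forms used in the paper, in particular the sector-capitalization constraint, may differ, so the bounds L_s and U_s are our assumption.

```latex
% A compact sketch of an extended mean-variance model with minimum lots,
% cardinality and sector bounds; the paper's exact constraint forms may differ.
\begin{align*}
  \min_{x,\,z,\,k}\quad & x^{\top}\Sigma x
    && \text{(portfolio variance)}\\
  \text{s.t.}\quad
    & \mu^{\top}x \ge R,\qquad \textstyle\sum_i x_i = 1,\\
    & x_i = k_i\,c_i,\quad k_i \in \mathbb{Z}_{\ge 0}
    && \text{(minimum transaction lots of size } c_i\text{)}\\
    & \varepsilon_i z_i \le x_i \le \delta_i z_i,\quad
      \textstyle\sum_i z_i \le K,\quad z_i \in \{0,1\}
    && \text{(at most } K \text{ assets held)}\\
    & L_s \le \textstyle\sum_{i \in S_s} x_i \le U_s \quad \forall s
    && \text{(sector-capitalization bounds per sector } S_s\text{)}
\end{align*}
```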
Both engagement and motivation are important factors in any learning process. Fortunately, there are different approaches worth considering during the design process of learning experiences that could help educators create engagement. One that has shown great potential is the inclusion of game design principles, or game-like experiences, in a curriculum. This paper presents the design and analysis of an e-learning activity within a software engineering course that relies on the application of such an approach as its motivational foundation from a purely educational standpoint (i.e. not as entertainment). The goal was to encourage adult learners to solve non-graded formative activities and to increase their sense of kinship to the class group. After one semester, the results revealed a positive assessment of the experience designed and student engagement.
['Alberto Mora', 'Elena Planas', 'Joan Arnedo-Moreno']
Designing game-like activities to engage adult learners in higher education
953,604
Let F be a family of graphs. In the F-Completion problem, we are given an n-vertex graph G and an integer k as input, and asked whether at most k edges can be added to G so that the resulting graph does not contain a graph from F as an induced subgraph. It was shown recently that two special cases of F-Completion, namely, (i) the problem of completing into a chordal graph known as Minimum Fill-in (SIAM J. Comput. 2013), which corresponds to the case of F = {C4, C5, C6, ...}, and (ii) the problem of completing into a split graph (Algorithmica 2015), that is, the case of F = {C4, 2K2, C5}, are solvable in parameterized subexponential time 2^{O(√k log k)} n^{O(1)}. The exploration of this phenomenon is the main motivation for our research on F-Completion. In this article, we prove that completions into several well-studied classes of graphs without long induced cycles and paths also admit parameterized subexponential time algorithms by showing that: (1) the problem Trivially Perfect Completion, which is F-Completion for F = {C4, P4}, a cycle and a path on four vertices, is solvable in parameterized subexponential time 2^{O(√k log k)} n^{O(1)}; (2) the problems known in the literature as Pseudosplit Completion, the case in which F = {2K2, C4}, and Threshold Completion, in which F = {2K2, P4, C4}, are also solvable in time 2^{O(√k log k)} n^{O(1)}. We complement our algorithms for F-Completion with the following lower bounds: for F = {2K2}, F = {C4}, F = {P4}, and F = {2K2, P4}, F-Completion cannot be solved in time 2^{o(k)} n^{O(1)} unless the Exponential Time Hypothesis (ETH) fails. Our upper and lower bounds provide a complete picture of the subexponential parameterized complexity of F-Completion problems for any F ⊆ {2K2, C4, P4}.
['Pål Grønås Drange', 'Fedor V. Fomin', 'Michał Pilipczuk', 'Yngve Villanger']
Exploring the Subexponential Complexity of Completion Problems
605,073
A simple toolkit for DNA fragment assembly.
['João Meidanis']
A simple toolkit for DNA fragment assembly.
988,510
Data is the new capital of the global economy and data analysis can increase efficiency and reduce costs for innovation. Finding and accessing both public and private data is essential to exploiting this data capital. In this paper we describe a platform, VOICE Observatory (VO), that provides an infrastructure and analytical resources to facilitate data-driven innovation. The VO curates a catalogue of analytical resources and provides secure access to them. Through the VO, users can efficiently share, find and use curated analytical resources to create new analytics. Furthermore, the VO provides a well-defined API for authorised applications to access datasets on the behalf of certain users.
['Eugene Siow', 'Xin Wang', 'Thanassis Tiropanis']
Facilitating data-driven innovation using VOICE observatory infrastructure
761,823
Hospital evacuation is a difficult process that requires a robust strategy and careful execution. In the past, threats leading to possible evacuation were primarily natural disasters. In recent years the broadened nature of threats, including hazardous material spills and terrorist incidents, has complicated this already complex problem. Its importance continues to grow, but there is still no consistent approach to tackle this problem. Plan development and evaluation are crucial to the plan's refinement which leads to successful response when an evacuation threat occurs. This research describes the issues inherent in planning and evaluation along with the complexities of constructing appropriate models for emergency preparedness and evacuation.
['Kevin Taaffe', 'Rachel Kohl', 'D. L. Kimbler']
Hospital evacuation: issues and complexities
509,441
IRLBA: Fast Partial Singular Value Decomposition Method.
['James Baglama']
IRLBA: Fast Partial Singular Value Decomposition Method.
981,920
Diagnosis techniques using urine are non-invasive, inexpensive, and easy to perform in clinical settings. The metabolites in urine, as the end products of cellular processes, are closely linked to phenotypes. Although research using the urine metabolome has many advantages, there can also be problems, such as multiple characteristic signals mixing or averaging into indistinguishable signals. As a result, it seems that univariate methods cannot identify precise boundaries between two groups, such as cancerous and normal samples. Moreover, due to individual differences in genetic makeup and heterogeneity in cancer progression, the analysis of combinatorial information from many variables seems to be more suitable than univariate analysis. In this study, we therefore propose classification models using multivariate classification techniques and develop an analysis procedure for classification studies using metabolome data. Through this strategy, we identified five potential urinary biomarkers for breast cancer with high accuracy and also proposed potential diagnosis rules to help in clinical decision making. After further validation with independent cohorts and experimental confirmation, these marker candidates will likely lead to clinically applicable assays for earlier diagnoses of breast cancer. This multivariate classification research is the second trial in metabolome analysis after Denkert et al. and the first for urine metabolome studies.
['Young-Hoon Kim', 'Imhoi Koo', 'Byung Hwa Jung', 'Bong Chul Chung', 'Doheon Lee']
Multivariate classification of urine metabolome profiles for breast cancer diagnosis
554,533
The proliferation of multimedia-capable mobile devices and ubiquitous high-speed network technologies to deliver multimedia objects has fueled the demand for mobile streaming multimedia. A necessary criterion for the mass acceptance of mobile devices is acceptable battery life of these devices. This paper explores linear prediction-based client-side strategies to reduce the wireless network interface card (WNIC) energy consumption by transitioning the WNIC to a lower-power sleep state. The basic idea of this strategy is to selectively choose proper periods of time to suspend communication by switching the WNIC to the sleep state. A linear prediction-based time series forecasting technique is used to predict future no-data intervals. Simulation results show that the linear prediction-based strategy gives better results than those based on simple averaging [Chandra et al. 2002].
['Yong Wei', 'Surendar Chandra', 'Suchendra M. Bhandarkar']
A statistical prediction-based scheme for energy-aware multimedia data streaming
98,140
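The forecasting idea in the abstract above can be sketched as follows in Python: fit linear-prediction coefficients to the history of observed no-data intervals and predict the next one, sleeping the WNIC only if the prediction exceeds the state-transition overhead; the prediction order and threshold below are illustrative, not the paper's tuned values.

```python
# Least-squares linear prediction of the next no-data interval; sleep the
# WNIC only if the prediction exceeds the (hypothetical) transition overhead.
import numpy as np

def predict_next_idle(history, order=4):
    """history: past idle-interval lengths in seconds."""
    h = np.asarray(history, dtype=float)
    X = np.array([h[i:i + order] for i in range(len(h) - order)])
    y = h[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return float(h[-order:] @ coeffs)

idle_history = [0.8, 1.1, 0.9, 1.3, 1.0, 1.2, 0.9, 1.1, 1.0]
predicted = predict_next_idle(idle_history)
if predicted > 0.3:            # hypothetical sleep/wake transition overhead
    print(f"predicted idle {predicted:.2f}s -> switch WNIC to sleep")
```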
The effects of age on player behavior in educational games.
["Eleanor O'Rourke", 'Eric Butler', 'Yun-En Liu', 'Christy Ballweber', 'Zoran Popović']
The effects of age on player behavior in educational games.
759,071
Opening up data has become a global trend, especially when democratic governments want to show openness and transparency. In this paper, we describe how we implement 4-star election open data in RDF in Taiwan. For the past fifty years, most of the data was recorded on paper and stored in the Central Election Commission of Taiwan. First, we convert the paper documents to PDF to digitize the data from the handwritten paper. We convert the PDF files into CSV files by OCR and manual input. In addition, we classified and integrated all columns by their properties and revised the column names in English. Since we intended to establish an ontology to describe resources, properties, statements, and the relationships between entities, we constructed E-R (entity-relationship) diagrams and then transformed them into an ontology using the Protege OWL tool. Finally, we published the Taiwan election data in RDF with the ontology and CSV files using Python automation.
['Ching-Yuan Chien', 'Chia-Hsin Hung', 'Min-Yuh Day', 'Yuh-Tay Lin', 'Chu-Yin Yang']
Developing Four Stars Election Open Data in RDF: Evidence from Taiwan Election Open Data Project
842,142
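The CSV-to-RDF step described above can be sketched with Python and rdflib as shown below; the namespace, column names and ontology terms are placeholders, not the project's actual schema.

```python
# Sketch of converting an election CSV file into RDF triples with rdflib.
# The namespace and column names are placeholders, not the project's schema.
import csv
from rdflib import Graph, Literal, Namespace, RDF, URIRef
from rdflib.namespace import XSD

EX = Namespace("http://example.org/election/")   # placeholder namespace
g = Graph()
g.bind("ex", EX)

with open("candidates.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):                # assumed columns: id, name, party, votes
        cand = URIRef(EX[f"candidate/{row['id']}"])
        g.add((cand, RDF.type, EX.Candidate))
        g.add((cand, EX.name, Literal(row["name"])))
        g.add((cand, EX.party, Literal(row["party"])))
        g.add((cand, EX.votes, Literal(int(row["votes"]), datatype=XSD.integer)))

g.serialize(destination="election.ttl", format="turtle")
```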
InsMT+ is an improved version of the InsMT system that participated in OAEI 2014. InsMT+ is an automatic instance matching system which consists in identifying the instances that describe the same real-world objects. InsMT+ applies different string-based matchers with a local filter. This is the second participation of our system, and we have somewhat improved the results obtained by the previous version.
['Abderrahmane Khiat', 'Moussa Benaissa']
InsMT+ Results for OAEI 2015 Instance Matching
793,452
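In the spirit of the string-based matching with a local filter described above, here is a small Python sketch using difflib; it is not the actual InsMT+ matcher, and the labels and threshold are illustrative.

```python
# A string-based instance matcher with a simple local filter (illustrative,
# not the actual InsMT+ matchers).
from difflib import SequenceMatcher

def similarity(a, b):
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def match_instances(source_labels, target_labels, threshold=0.85):
    """Return (source, target, score) pairs whose best score passes the local filter."""
    matches = []
    for s in source_labels:
        best = max(target_labels, key=lambda t: similarity(s, t))
        score = similarity(s, best)
        if score >= threshold:        # local filter: keep only confident correspondences
            matches.append((s, best, score))
    return matches

print(match_instances(["J. S. Bach", "W. A. Mozart"],
                      ["Johann Sebastian Bach", "Wolfgang Amadeus Mozart", "J.S. Bach"]))
```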
Semantics, Causality and Mobility in Membrane Computing.
['Oana Agrigoroaiei', 'Bogdan Aman', 'Gabriel Ciobanu']
Semantics, Causality and Mobility in Membrane Computing.
738,354
Laws set requirements that force organizations to assess the security and privacy of their IT systems and impose the implementation of minimal precautionary security measures. Several frameworks have been proposed to deal with this issue. For instance, purpose-based access control is normally considered a good solution for meeting the requirements of privacy legislation. Yet, understanding why, how, and when such solutions to security and privacy problems have to be deployed often remains unanswered. In this paper, we look at the problem from a broader perspective, accounting for legal and organizational issues. Security engineers and legal experts should be able to start from the organizational model, derive from there the points where security and privacy problems may arise, and determine which solutions best fit the (legal) problems that they face. In particular, we investigate the methodology needed to capture security and privacy requirements for a Health Care Centre using a smart items infrastructure.
['Luca Compagna', 'Paul El Khoury', 'Fabio Massacci', 'Reshma Thomas', 'Nicola Zannone']
How to capture, model, and verify the knowledge of legal, security, and privacy experts: a pattern-based approach
4,880
Information Technology Outsourcing: Asset Transfer and the Role of Contract
['Young Bong Chang', 'Vijay Gurbaxani', 'Paul Merage', 'Kiron Ravindran']
Information Technology Outsourcing: Asset Transfer and the Role of Contract
660,941