diff --git "a/inspec/test.json" "b/inspec/test.json" new file mode 100644 --- /dev/null +++ "b/inspec/test.json" @@ -0,0 +1,500 @@ +{"name": "test_193", "title": "Twenty years of the literature on acquiring out-of-print materials", "abstract": "This article reviews the last two-and-a-half decades of literature on acquiring out-of-print materials to assess recurring issues and identify changing practices. The out-of-print literature is uniform in its assertion that libraries need to acquire o.p. materials to replace worn or damaged copies, to replace missing copies, to duplicate copies of heavily used materials, to fill gaps in collections, to strengthen weak collections, to continue to develop strong collections, and to provide materials for new courses, new programs, and even entire new libraries", "fulltext": "", "keywords": "out-of-print materials;acquisition;out-of-print books;recurring issues;changing practices;library materials"} +{"name": "test_1930", "title": "A new method of systemological analysis coordinated with the procedure of object-oriented design. II", "abstract": "For pt.I. see Vestn. KhGPU, no.81, p.15-18 (2000). The paper presents the results of development of an object-oriented systemological method used to design complex systems. A formal system representation, as well as an axiomatics of the calculus of systems as functional flow-type objects based on a Node-Function-Object class hierarchy, are proposed. A formalized NFO/UFO analysis algorithm and CASE tools used to support it are considered", "fulltext": "", "keywords": "formal system representation;functional flow-type objects;formalized nfo/ufo analysis algorithm;systemological analysis;case tools;object-oriented design;axiomatics;complex systems design"} +{"name": "test_1931", "title": "Mathematical fundamentals of constructing fuzzy Bayesian inference techniques", "abstract": "Problems and an associated technique for developing a Bayesian approach to decision-making in the case of fuzzy data are presented. The concept of fuzzy and pseudofuzzy quantities is introduced and main operations with pseudofuzzy quantities are considered. The basic relationships and the principal concepts of the Bayesian decision procedure based on the modus-ponens rule are proposed. Some problems concerned with the practical realization of the fuzzy Bayesian method are considered", "fulltext": "", "keywords": "modus-ponens rule;mathematical fundamentals;decision making;fuzzy bayesian inference techniques;pseudofuzzy quantities"} +{"name": "test_1932", "title": "Solution of the safe problem on (0,1)-matrices", "abstract": "A safe problem with mn locks is studied. It is reduced to a system of linear equations in the modulo 2 residue class. There are three possible variants, defined by the evenness of the numbers m and n, with only one of them having a solution. In the two other cases, a correction of the initial state of the safe ensuring a solution is proposed", "fulltext": "", "keywords": "safe problem;computer games;mn locks;linear equations;(0,1)-matrices;linear diophantine equations;modulo 2 residue class"} +{"name": "test_1933", "title": "Accelerated simulation of the steady-state availability of non-Markovian systems", "abstract": "A general accelerated simulation method for evaluation of the steady-state availability of non-Markovian systems is proposed. It is applied to the investigation of a class of systems with repair.
Numerical examples are given", "fulltext": "", "keywords": "numerical examples;general accelerated simulation method;steady-state availability;accelerated simulation;non-markovian systems"} +{"name": "test_1934", "title": "Computational finite-element schemes for optimal control of an elliptic system with conjugation conditions", "abstract": "New optimal control problems are considered for distributed systems described by elliptic equations with conjugate conditions and a quadratic minimized function. Highly accurate computational discretization schemes are constructed for the case where a feasible control set u_delta coincides with the full Hilbert space u of controls", "fulltext": "", "keywords": "computational discretization schemes;optimal control problems;distributed systems;conjugate conditions;elliptic equations;quadratic minimized function"} +{"name": "test_1935", "title": "Identification of states of complex systems with estimation of admissible measurement errors on the basis of fuzzy information", "abstract": "The problem of identification of states of complex systems on the basis of fuzzy values of informative attributes is considered. Some estimates of a maximally admissible degree of measurement error are obtained that make it possible, using the apparatus of fuzzy set theory, to correctly identify the current state of a system", "fulltext": "", "keywords": "measurement error;fuzzy set theory;admissible measurement errors;informative attributes;complex systems states identification;fuzzy information"} +{"name": "test_1936", "title": "A new approach to the decomposition of Boolean functions by the method of q-partitions. II. Repeated decomposition", "abstract": "For pt.I. see Upr. Sist. Mash., no. 6, p. 29-42 (1999). A new approach to the decomposition of Boolean functions that depend on n variables and are represented in various forms is considered. The approach is based on the method of q-partitioning of minterms and on the introduced concept of a decomposition clone. The theorem on simple disjunctive decomposition of full and partial functions is formulated. The approach proposed is illustrated by examples", "fulltext": "", "keywords": "boolean functions decomposition;partial functions;decomposition clone;logic synthesis;q-partitions;disjunctive decomposition;minterms"} +{"name": "test_1937", "title": "Nonlinear extrapolation algorithm for realization of a scalar random process", "abstract": "A method of construction of a nonlinear extrapolation algorithm is proposed. This method makes it possible to take into account any nonlinear random dependences that exist in an investigated process and are described by mixed central moment functions. The method is based on the V. S. Pugachev canonical decomposition apparatus. As an example, the problem of nonlinear extrapolation is solved for a moment function of third order", "fulltext": "", "keywords": "nonlinear extrapolation algorithm;canonical decomposition apparatus;mixed central moment functions;nonlinear random dependences;scalar random process;moment function"} +{"name": "test_1938", "title": "A method for solution of systems of linear algebraic equations with m-dimensional lambda-matrices", "abstract": "A system of linear algebraic equations with m-dimensional lambda-matrices is considered.
The proposed method of searching for the solution of this system lies in reducing it to a numerical system of a special kind", "fulltext": "", "keywords": "numerical system;linear algebraic equations;m-dimensional lambda-matrices"} +{"name": "test_1939", "title": "Compatibility of systems of linear constraints over the set of natural numbers", "abstract": "Criteria of compatibility of a system of linear Diophantine equations, strict inequations, and nonstrict inequations are considered. Upper bounds for components of a minimal set of solutions and algorithms of construction of minimal generating sets of solutions for all types of systems are given. These criteria and the corresponding algorithms for constructing a minimal supporting set of solutions can be used in solving all the considered types of systems and systems of mixed types", "fulltext": "", "keywords": "minimal generating sets;strict inequations;linear constraints;upper bounds;linear diophantine equations;nonstrict inequations;set of natural numbers"} +{"name": "test_194", "title": "Books on demand: just-in-time acquisitions", "abstract": "The Purdue University Libraries Interlibrary Loan unit proposed a pilot project to purchase patrons' loan requests from Amazon.com, lend them to the patrons, and then add the titles to the collection. Staff analyzed previous monograph loans, developed ordering criteria, implemented the proposal as a pilot project for six months, and evaluated the resulting patron comments, statistics, and staff perceptions. As a result of enthusiastic patron comments and a review of the project statistics, the program was extended", "fulltext": "", "keywords": "publication on demand;monograph loans;purdue university libraries interlibrary loan unit;staff perceptions;patron comments;ordering criteria"} +{"name": "test_1940", "title": "New lower bounds of the size of error-correcting codes for the Z-channel", "abstract": "Optimization problems on graphs are formulated to obtain new lower bounds of the size of error-correcting codes for the Z-channel", "fulltext": "", "keywords": "optimization problems;error-correcting codes;lower bounds;z-channel;graphs"} +{"name": "test_1941", "title": "Descriptological foundations of programming", "abstract": "Descriptological foundations of programming are constructed. An explication of the concept of a descriptive process is given. The operations of introduction and elimination of abstraction at the level of processes are refined. An intensional concept of a bipolar function is introduced. An explication of the concept of introduction and extraction of abstraction at the bipole level is given. On this basis, a complete set of descriptological operations is constructed", "fulltext": "", "keywords": "bipole level;descriptive process;programming;intensional concept;descriptological foundations;bipolar function"} +{"name": "test_1942", "title": "Precoded OFDM with adaptive vector channel allocation for scalable video transmission over frequency-selective fading channels", "abstract": "Orthogonal frequency division multiplexing (OFDM) has been applied in broadband wireline and wireless systems for high data rate transmission where severe intersymbol interference (ISI) always occurs. The conventional OFDM system provides advantages through conversion of an ISI channel into ISI-free subchannels at multiple frequency bands. However, it may suffer from channel spectral nulls and heavy data rate overhead due to cyclic prefix insertion.
Previously, a new OFDM framework, the precoded OFDM, has been proposed to mitigate the above two problems through precoding and conversion of an ISI channel into ISI-free vector channels. In this paper, we consider the application of the precoded OFDM system to efficient scalable video transmission. We propose to enhance the precoded OFDM system with adaptive vector channel allocation to provide stronger protection against errors to more important layers in the layered bit stream structure of scalable video. The more critical layers, or equivalently, the lower layers, are allocated vector channels of higher transmission quality. The channel quality is characterized by Frobenius norm metrics, based on channel estimation at the receiver. The channel allocation information is fed back periodically to the transmitter through a control channel. Simulation results have demonstrated the robustness of the proposed scheme to noise and fading inherent in wireless channels", "fulltext": "", "keywords": "channel estimation;robustness;layered bit stream structure;frequency-selective fading channels;frobenius norm metrics;heavy data rate overhead;channel quality;orthogonal frequency division multiplexing;channel spectral nulls;adaptive vector channel allocation;scalable video transmission;critical layers;isi-free vector channels;precoded ofdm;isi channel;channel allocation information;lower layers;control channel"} +{"name": "test_1943", "title": "I-WAP: an intelligent WAP site management system", "abstract": "The popularity of wireless communications is such that more and more WAP sites have been developed with wireless markup language (WML). Meanwhile, to translate hypertext markup language (HTML) pages into proper WML ones becomes imperative since it is difficult for WAP users to read most contents designed for PC users via their mobile phone screens. However, for those sites that have been maintained with HTML, considerable time and manpower costs will be incurred to rebuild them with WML. In this paper, we propose an intelligent WAP site management system to cope with these problems. With the help of the intelligent management system, the original contents of HTML Web sites can be automatically translated to proper WAP content in an efficient way. As a consequence, the costs associated with maintaining WAP sites could be significantly reduced. The management system also allows the system manager to define the relevance of numerals and keywords for removing unimportant or meaningless contents. The original contents will be reduced and reorganized to fit the size of mobile phone screens, thus reducing the communication cost and enhancing readability. Numerical results gained through various experiments have evinced the effective performance of the WAP management system", "fulltext": "", "keywords": "wireless markup language;hypertext markup language;html pages;wireless communication;intelligent wap site management system;communication cost;readability;i-wap;wireless mobile internet;mobile phone"} +{"name": "test_1944", "title": "A framework of electronic tendering for government procurement: a lesson learned in Taiwan", "abstract": "To render government procurement efficient, transparent, nondiscriminating, and accountable, an electronic government procurement system is required. Accordingly, Taiwan government procurement law (TGPL) states that suppliers may employ electronic devices to forward a tender.
This investigation demonstrates how the electronic government procurement system functions and reengineers internal procurement processes, which in turn benefits both government bodies and vendors. The system features explored herein include posting/receiving bids via the Internet, vendor registration, certificate authorization, contract development tools, bid/request for proposal (RFP) development, online bidding, and online payment, all of which can be integrated easily within most existing information infrastructures", "fulltext": "", "keywords": "vendor registration;public key infrastructure;internet bids;electronic government procurement system;taiwan government procurement law;online bidding;electronic tendering;internal procurement processes;certificate authorization;online payment;contract development tools;payment gateway;request for proposal development;rfp development;certification authority;reengineering"} +{"name": "test_1945", "title": "The development of a mobile manipulator imaging system for bridge crack inspection", "abstract": "A mobile manipulator imaging system is developed for the automation of bridge crack inspection. During bridge safety inspections, an eyesight inspection is made for preliminary evaluation and screening before a more precise inspection. The inspection for cracks is an important part of the preliminary evaluation. Currently, the inspectors must stand on the platform of a bridge inspection vehicle or a temporarily erected scaffolding to examine the underside of a bridge. However, such a procedure is risky. To help automate the bridge crack inspection process, we installed two CCD cameras and a four-axis manipulator system on a mobile vehicle. The parallel cameras are used to detect cracks. The manipulator system is equipped with binocular charge coupled devices (CCD) for examining structures that may not be accessible to the eye. The system also reduces the danger of accidents to the human inspectors. The manipulator system consists of four arms. Balance weights are placed at the ends of arms 2 and 4, respectively, to maintain the center of gravity during operation. Mechanically, arms 2 and 4 can revolve smoothly. Experiments indicated that the system could be useful for bridge crack inspections", "fulltext": "", "keywords": "ccd cameras;imaging system;binocular ccd;four-axis manipulator;bridge crack inspection;automation;charge coupled devices;mobile manipulator;parallel cameras;eyesight inspection"} +{"name": "test_1946", "title": "Integrating building management system and facilities management on the Internet", "abstract": "Recently, there has been great interest in adopting the Internet/intranet to develop building management systems (BMS) and facilities management systems (FMS). This paper addresses two technical issues: the Web-based access (including database integration) and the integration of BMS and FMS. These should be addressed for accessing BMS remotely via the Internet, integrating control networks using the Internet protocols and infrastructures, and using Internet/intranet for building facilities management. An experimental Internet-enabled system that integrates building and facilities management systems has been developed and tested. This system integrates open control networks with the Internet and was developed utilizing the embedded Web server, the PC Web server and the Distributed Component Object Model (DCOM) software development technology on the platform of an open control network.
Three strategies for interconnecting BMS local networks via Internet/intranet are presented and analyzed", "fulltext": "", "keywords": "software development technology;open control network;internet protocols;distributed component object model;facilities management systems;local network interconnection;database integration;bms;building management systems;fms;web-based access;dcom;intranet;pc web server;embedded web server"} +{"name": "test_1947", "title": "Modelling user acceptance of building management systems", "abstract": "This study examines user acceptance of building management systems (BMS) using a questionnaire survey. These systems are crucial for optimising building performance and yet it has been widely reported that users are not making full use of their systems' facilities. Established models of technology acceptance have been employed in this research, and the positive influence of user perceptions of ease of use and compatibility has been demonstrated. Previous research has indicated differing levels of importance of perceived ease of use relative to other factors. Here, perceived ease of use is shown generally to be more important, though the balance between this and compatibility is moderated by the user perceptions of voluntariness", "fulltext": "", "keywords": "compatibility;technology acceptance model;user perceptions;ease of use;user acceptance modelling;innovation characteristics;building management systems;voluntariness;information systems;questionnaire survey"} +{"name": "test_1948", "title": "Estimating populations for collective dose calculations", "abstract": "The collective dose provides an estimate of the effects of facility operations on the public based on an estimate of the population in the area. Geographic information system software, electronic population data resources, and a personal computer were used to develop estimates of population within 80 km radii of two sites", "fulltext": "", "keywords": "public;collective dose calculations;electronic population data resources;geographic information system software;facility operations;personal computer"} +{"name": "test_1949", "title": "A new graphical user interface for fast construction of computation phantoms and MCNP calculations: application to calibration of in vivo measurement systems", "abstract": "Reports on a new utility for development of computational phantoms for Monte Carlo calculations and data analysis for in vivo measurements of radionuclides deposited in tissues. The individual properties of each worker can be acquired for a rather precise geometric representation of his (her) anatomy, which is particularly important for low energy gamma ray emitting sources such as thorium, uranium, plutonium and other actinides. The software enables automatic creation of an MCNP input data file based on scanning data. The utility includes segmentation of images obtained with either computed tomography or magnetic resonance imaging by distinguishing tissues according to their signal (brightness) and specification of the source and detector. In addition, a coupling of individual voxels within the tissue is used to reduce the memory demand and to increase the calculational speed.
The utility was tested for low energy emitters in plastic and biological tissues as well as for computed tomography and magnetic resonance imaging scanning information", "fulltext": "", "keywords": "in vivo measurement systems;anatomy;monte carlo calculations;in vivo measurements;computation phantoms;memory demand;u;magnetic resonance imaging scanning information;low energy gamma ray emitting sources;computational phantoms;graphical user interface;calibration;software;computed tomography;radionuclides;automatic creation;brightness;precise geometric representation;pu;signal;calculational speed;scanning data;actinides;detector;tissues;individual voxels;biological tissues;plastic;th;worker;mcnp input data file"} +{"name": "test_195", "title": "The acquisition of out-of-print music", "abstract": "Non-specialist librarians are alerted to factors important in the successful acquisition of out-of-print music, both scholarly editions and performance editions. The appropriate technical music vocabulary, the music publishing industry, specialized publishers and vendors, and methods of acquisition of out-of-print printed music are introduced, and the need for familiarity with them is emphasized", "fulltext": "", "keywords": "scholarly editions;technical music vocabulary;out-of-print printed music;specialized vendors;performance editions;out-of-print music;specialized publishers;music publishing industry"} +{"name": "test_1950", "title": "General solution of a density functionally gradient piezoelectric cantilever and its applications", "abstract": "We have used the plane strain theory of transversely isotropic bodies to study a piezoelectric cantilever. In order to find the general solution of a density functionally gradient piezoelectric cantilever, we have used the inverse method (i.e. the Airy stress function method). We have obtained the stress and induction functions in the form of polynomials as well as the general solution of the beam. Based on this general solution, we have deduced the solutions of the cantilever under different loading conditions. Furthermore, as applications of this general solution in engineering, we have studied the tip deflection and blocking force of a piezoelectric cantilever actuator. Finally, we have addressed a method to determine the density distribution profile for a given piezoelectric material", "fulltext": "", "keywords": "loading conditions;airy stress function;inverse method;transversely isotropic bodies;polynomials;plane strain theory;piezoelectric material;piezoelectric cantilever actuator;density distribution profile"} +{"name": "test_1951", "title": "Recording quantum properties of light in a long-lived atomic spin state: towards quantum memory", "abstract": "We report an experiment on mapping a quantum state of light onto the ground state spin of an ensemble of Cs atoms with the lifetime of 2 ms. Recording of one of the two quadrature phase operators of light is demonstrated with vacuum and squeezed states of light. The sensitivity of the mapping procedure at the level of approximately 1 photon/sec per Hz is shown. The results pave the road towards complete (storing both quadrature phase observables) quantum memory for Gaussian states of light.
The experiment also sheds new light on fundamental limits of sensitivity of the magneto-optical resonance method", "fulltext": "", "keywords": "2 ms;two quadrature phase operators;cs;magnetooptical resonance method;vacuum states;mapping procedure;ensemble;quantum memory;squeezed states;light quantum properties recording;ground state spin;long-lived atomic spin state"} +{"name": "test_1952", "title": "Comprehensive encoding and decoupling solution to problems of decoherence and design in solid-state quantum computing", "abstract": "Proposals for scalable quantum computing devices suffer not only from decoherence due to the interaction with their environment, but also from severe engineering constraints. Here we introduce a practical solution to these major concerns, addressing solid-state proposals in particular. Decoherence is first reduced by encoding a logical qubit into two qubits, then completely eliminated by an efficient set of decoupling pulse sequences. The same encoding removes the need for single-qubit operations, which pose a difficult design constraint. We further show how the dominant decoherence processes can be identified empirically, in order to optimize the decoupling pulses", "fulltext": "", "keywords": "pulse sequence decoupling;exchange hamiltonian;decoupling pulse optimization;decoherence;logical qubit encoding;engineering constraints;solid-state quantum computing;scalable quantum computing devices"} +{"name": "test_1953", "title": "Social percolation and the influence of mass media", "abstract": "In the marketing model of Solomon and Weisbuch, people buy a product only if their neighbours tell them of its quality, and if this quality is higher than their own quality expectations. Now we introduce additional information from the mass media, which is analogous to the ghost field in percolation theory. The mass media shift the percolative phase transition observed in the model, and decrease the time after which the stationary state is reached", "fulltext": "", "keywords": "cinema;solomon-weisbuch marketing model;stationary state;external field;customers;social percolation;ghost field;mass media influence;quality expectations;percolative phase transition"} +{"name": "test_1954", "title": "Estimating long-range dependence: finite sample properties and confidence intervals", "abstract": "A major issue in financial economics is the behavior of asset returns over long horizons. Various estimators of long-range dependence have been proposed. Even though some have known asymptotic properties, it is important to test their accuracy by using simulated series of different lengths. We test R/S analysis, detrended fluctuation analysis and periodogram regression methods on samples drawn from Gaussian white noise. The DFA statistic turns out to be the unanimous winner. Unfortunately, no asymptotic distribution theory has been derived for this statistic so far. We were able, however, to construct empirical (i.e. approximate) confidence intervals for all three methods.
The obtained values differ considerably from the heuristic values proposed by some authors for the R/S statistic and are very close to the asymptotic values for the periodogram regression method", "fulltext": "", "keywords": "long-range dependence;detrended fluctuation analysis;asymptotic properties;confidence intervals;periodogram regression methods;financial economics;heuristic values;asset returns;gaussian white noise;finite sample properties;long horizons"} +{"name": "test_1955", "title": "Simulation of evacuation processes using a bionics-inspired cellular automaton model for pedestrian dynamics", "abstract": "We present simulations of evacuation processes using a recently introduced cellular automaton model for pedestrian dynamics. This model applies a bionics approach to describe the interaction between the pedestrians using ideas from chemotaxis. Here we study a rather simple situation, namely the evacuation from a large room with one or two doors. It is shown that the variation of the model parameters allows us to describe different types of behaviour, from regular to panic. We find a non-monotonic dependence of the evacuation times on the coupling constants. These times depend on the strength of the herding behaviour, with minimal evacuation times for some intermediate values of the couplings, i.e., a proper combination of herding and use of knowledge about the shortest way to the exit", "fulltext": "", "keywords": "bionics-inspired cellular automaton model;nonmonotonic dependence;herding behaviour;pedestrian dynamics;coupling constants;chemotaxis;evacuation processes simulation"} +{"name": "test_1956", "title": "Dynamical transition to periodic motions of a recurrent bus induced by nonstops", "abstract": "We study the dynamical behavior of a recurrent bus on a circular route with many bus stops when the recurrent bus passes some bus stops without stopping. The recurrent time (one period) is described in terms of a nonlinear map. It is shown that the recurrent bus exhibits complex periodic behaviors. The dynamical transitions to periodic motions occur by increasing nonstops. The periodic motions depend on the property of an attractor of the nonlinear map. The period n of the attractor varies sensitively with the number of nonstops", "fulltext": "", "keywords": "periodic motions;nonstops;circular route;attractor;recurrent time;nonlinear map;dynamical transition;complex periodic behaviors;recurrent bus"} +{"name": "test_1957", "title": "The two populations' cellular automata model with predation based on the Penna model", "abstract": "In Penna's (1995) single-species asexual bit-string model of biological ageing, the Verhulst factor has too strong a restraining effect on the development of the population. Danuta Makowiec gave an improved model based on the lattice, where the restraining factor of the four neighbours takes the place of the Verhulst factor. Here, we discuss the two populations' Penna model with predation on the planar lattice of two dimensions. A cellular automata model containing movable wolves and sheep has been built.
The results show that the quantities of both the wolves and the sheep fluctuate, one increasing while the other decreases", "fulltext": "", "keywords": "wolves;verhulst factor;predation;lotka-volterra model;penna model;cellular automata model;sheep;restraining effect;biological ageing;lattice;single-species asexual bit-string model;population;fluctuation"} +{"name": "test_1958", "title": "Option pricing from path integral for non-Gaussian fluctuations. Natural martingale and application to truncated Levy distributions", "abstract": "Within a path integral formalism for non-Gaussian price fluctuations, we set up a simple stochastic calculus and derive a natural martingale for option pricing from the wealth balance of options, stocks, and bonds. The resulting formula is evaluated for truncated Levy distributions", "fulltext": "", "keywords": "natural martingale;nongaussian fluctuations;bonds;option pricing;truncated levy distributions;stochastic calculus;path integrals;stocks"} +{"name": "test_1959", "title": "Quantum market games", "abstract": "We propose a quantum-like description of markets and economics. The approach has roots in the recently developed quantum game theory", "fulltext": "", "keywords": "quantum strategies;economics;quantum market games;financial markets;quantum game theory"} +{"name": "test_196", "title": "On the emergence of rules in neural networks", "abstract": "A simple associationist neural network learns to factor abstract rules (i.e., grammars) from sequences of arbitrary input symbols by inventing abstract representations that accommodate unseen symbol sets as well as unseen but similar grammars. The neural network is shown to have the ability to transfer grammatical knowledge to both new symbol vocabularies and new grammars. Analysis of the state-space shows that the network learns generalized abstract structures of the input and is not simply memorizing the input strings. These representations are context sensitive, hierarchical, and based on the state variable of the finite-state machines that the neural network has learned. Generalization to new symbol sets or grammars arises from the spatial nature of the internal representations used by the network, allowing new symbol sets to be encoded close to symbol sets that have already been learned in the hidden unit space of the network. The results are counter to the arguments that learning algorithms based on weight adaptation after each exemplar presentation (such as the long term potentiation found in the mammalian nervous system) cannot in principle extract symbolic knowledge from positive examples as prescribed by prevailing human linguistic theory and evolutionary psychology", "fulltext": "", "keywords": "symbolic knowledge;cognitive neurosciences;neural network;abstract rules;learns;associationist neural network;state-space;associationist learning"} +{"name": "test_1960", "title": "Streaming, disruptive interference and power-law behavior in the exit dynamics of confined pedestrians", "abstract": "We analyze the exit dynamics of pedestrians who are initially confined in a room. Pedestrians are modeled as cellular automata and compete to escape via a known exit at the soonest possible time. A pedestrian could move forward, backward, left or right within each iteration time depending on adjacent cell vacancy and in accordance with simple rules that determine the compulsion to move and physical capability relative to his neighbors.
The arching signatures of jamming were observed and the pedestrians exited in bursts of various sizes. Power-law behavior is found in the burst-size frequency distribution for exit widths w greater than one cell dimension (w > 1). The slope of the power-law curve varies with w from -1.3092 (w = 2) to -1.0720 (w = 20). Streaming, which is a diffusive behavior, arises in large burst sizes and is more likely in a single-exit room with w = 1; it leads to a counterintuitive result wherein an average exit throughput Q is obtained that is higher than with w = 2, 3, or 4. For a two-exit room (w = 1), Q is not greater than twice the yield of a single-exit room. If the doors are not separated far enough (< 4w), Q becomes significantly smaller due to a collective slow-down that emerges among pedestrians crossing in each other's path (disruptive interference effect). For the same w and door number, Q is also higher with relaxed pedestrians than with anxious ones", "fulltext": "", "keywords": "power-law behavior;streaming;arching signatures;burst-size frequency distribution;collective slow-down;self-organised criticality;iteration time;jamming;cellular automata;confined pedestrians;disruptive interference;exit dynamics;adjacent cell vacancy"} +{"name": "test_1961", "title": "The influence of tollbooths on highway traffic", "abstract": "We study the effects of tollbooths on the traffic flow. The highway traffic is simulated by the Nagel-Schreckenberg model. Various types of toll collection are examined, which can be characterized either by a waiting time or a reduced speed. A first-order phase transition is observed. The phase separation results in a saturated flow, which is observed as a plateau region in the fundamental diagram. The effects of lane expansion near the tollbooth are examined. The full capacity of a highway can be restored. The emergence of vehicle queuing is studied. Besides the numerical results, we also obtain analytical expressions for various quantities. The numerical simulations can be well described by the analytical formulas. We also discuss the influence on the travel time and its variance. The tollbooth increases the travel time but decreases its variance. The differences between long- and short-distance travelers are also discussed", "fulltext": "", "keywords": "first-order phase transition;reduced speed;vehicle queuing;saturated flow;lane expansion;numerical simulations;tollbooths;highway traffic;waiting time;toll collection;nagel-schreckenberg model"} +{"name": "test_1962", "title": "The Bagsik Oscillator without complex numbers", "abstract": "We argue that the analysis of the so-called Bagsik Oscillator, recently published by Piotrowski and Sladkowski (2001), is erroneous due to: (1) the incorrect banking data used and (2) the application of statistical mechanism apparatus to processes that are totally deterministic", "fulltext": "", "keywords": "game theory;statistical mechanism apparatus;incorrect banking data;deterministic processes;noncomplex numbers;bagsik oscillator"} +{"name": "test_1963", "title": "The variance of firm growth rates: the 'scaling' puzzle", "abstract": "Recent evidence suggests that a power-law relationship exists between a firm's size and the variance of its growth rate. The flatness of the relation is regarded as puzzling, in that it suggests that large firms are not much more stable than small firms.
It has been suggested that the power-law nature of the relationship reflects the presence of some form of correlation of growth rates across the firm's constituent businesses. Here, it is shown that a model of independent businesses which allows for the fact that these businesses vary in size, as modelled by a simple 'partitions of integers' model, provides a good representation of what is observed empirically", "fulltext": "", "keywords": "partitions of integers model;corporate growth;power-law;flatness;size distribution;firm growth rates;constituent businesses;scaling puzzle;correlation"} +{"name": "test_1964", "title": "Antipersistent Markov behavior in foreign exchange markets", "abstract": "A quantitative check of efficiency in US dollar/Deutsche mark exchange rates is developed using high-frequency (tick by tick) data. The antipersistent Markov behavior of log-price fluctuations of given size implies, in principle, the possibility of a statistical forecast. We introduce and measure the available information of the quote sequence, and we show how it can be made profitable by following a particular trading rule", "fulltext": "", "keywords": "exchange rates;us dollar;foreign exchange markets;antipersistent markov behavior;deutsche mark;forecasting;efficiency;statistical forecast;quote sequence;trading rule;log-price fluctuations;shannon entropy;high-frequency data"} +{"name": "test_1965", "title": "Stock market dynamics", "abstract": "We elucidate several empirical statistical observations of stock market returns. Moreover, we find that these properties are recurrent and are also present in invariant measures of low-dimensional dynamical systems. Thus, we propose that the returns are modeled by the first Poincare return time of a low-dimensional chaotic trajectory. This modeling, which captures the recurrent properties of the return fluctuations, is able to predict well the evolution of the observed statistical quantities. In addition, it explains why stocks simultaneously present dynamical properties and high uncertainties. In our analysis, we use data from the S&P 500 index and the Brazilian stock Telebras", "fulltext": "", "keywords": "econophysics;low-dimensional chaotic trajectory;stock market returns;statistical quantities;brazilian stock;first poincare return time;low-dimensional dynamical systems;invariant measures;empirical statistical observations"} +{"name": "test_1966", "title": "Application of nonlinear time series analysis techniques to high-frequency currency exchange data", "abstract": "In this work we have applied nonlinear time series analysis to high-frequency currency exchange data. The time series studied are the exchange rates between the US Dollar and 18 other foreign currencies from within and without the Euro zone. Our goal was to determine if their dynamical behaviours were in some way correlated. The nonexistence of stationarity called for the application of recurrence quantification analysis, which is based on the definition of several parameters that allow for the quantification of recurrence plots. The method was checked using the European Monetary System currency exchanges. The results show, as expected, the high correlation between the currencies that are part of the Euro, but also a strong correlation between the Japanese Yen, the Canadian Dollar and the British Pound.
Singularities of the series are also demonstrated, taking into account historical events in the Euro zone in 1996", "fulltext": "", "keywords": "exchange rates;us dollar;econophysics;stationarity;historical events;foreign currencies;canadian dollar;euro zone;european monetary system;british pound;recurrence quantification analysis;nonlinear time series;nonlinear dynamics;high-frequency currency exchange data;japanese yen;recurrence plots"} +{"name": "test_1967", "title": "Modeling daily realized futures volatility with singular spectrum analysis", "abstract": "Using singular spectrum analysis (SSA), we model the realized volatility and logarithmic standard deviations of two important futures return series. The realized volatility and logarithmic standard deviations are constructed following the methodology of Andersen et al. [J. Am. Stat. Ass. 96 (2001) 42-55] using intra-day transaction data. We find that SSA decomposes the volatility series quite well and effectively captures both the market trend (accounting for about 34-38% of the total variance in the series) and, more importantly, a number of underlying market periodicities. Reliable identification of any periodicities is extremely important for options pricing and risk management and we believe that SSA can be a useful addition to the financial practitioners' toolbox", "fulltext": "", "keywords": "econophysics;singular spectrum analysis;intraday transaction data;ssa;daily realized futures volatility;market periodicities;return series;financial practitioners;logarithmic standard deviations;options pricing;market trend;risk management;asset return"} +{"name": "test_1968", "title": "Phase control of higher-order squeezing of a quantum field", "abstract": "In a recent experiment [Phys. Rev. Lett. 88 (2002) 023601], phase-dependent photon statistics in a c.w. system has been observed in the mixing of a coherent field with a two-photon source. Their system has an advantage over other atomic transition-based fluorescent systems. In this paper, we examine further the squeezing properties of higher-order quantum fluctuations in one of the quadrature components of the combined field in this system. We demonstrate that efficient and lasting higher-order squeezing effects could be observed with proper choice of the relative phase between the pump and coherent fields. This nonclassical feature is attributed to a constructive two-photon interference. Relationship between the second- and higher-order squeezing of the field is discussed", "fulltext": "", "keywords": "phase-dependent photon statistics;quantum fluctuations;two-photon interference;atomic transition-based fluorescent systems;quantum field;phase control;higher-order squeezing;coherent field mixing"} +{"name": "test_1969", "title": "Modeling self-consistent multi-class dynamic traffic flow", "abstract": "In this study, we present a systematic self-consistent multiclass multilane traffic model derived from the vehicular Boltzmann equation and the traffic dispersion model. The multilane domain is considered as a two-dimensional space and the interaction among vehicles in the domain is described by a dispersion model. The reason we consider a multilane domain as a two-dimensional space is that the driving behavior of road users may not be restricted by lanes, especially motorcyclists. The dispersion model, which is a nonlinear Poisson equation, is derived from the car-following theory and the equilibrium assumption.
Under the concept that all kinds of users share the finite section, the density is distributed on a road by the dispersion model. In addition, the dynamic evolution of the traffic flow is determined by the systematic gas-kinetic model derived from the Boltzmann equation. Multiplying the Boltzmann equation by the zeroth-, first- and second-order moment functions, integrating both sides of the equation and using chain rules, we can derive the continuity, motion and variance equations, respectively. However, the second-order moment function, which is the square of the individual velocity and was employed by previous researchers, does not have physical meaning in traffic flow", "fulltext": "", "keywords": "car-following theory;vehicular boltzmann equation;motion equation;traffic dispersion model;poisson equation;multilane traffic model;self-consistent multiclass dynamic traffic flow modeling;nonlinear poisson equation;road users;dynamic evolution;variance equation"} +{"name": "test_197", "title": "Mixture of experts classification using a hierarchical mixture model", "abstract": "A three-level hierarchical mixture model for classification is presented that models the following data generation process: (1) the data are generated by a finite number of sources (clusters), and (2) the generation mechanism of each source assumes the existence of individual internal class-labeled sources (subclusters of the external cluster). The model estimates the posterior probability of class membership similar to a mixture of experts classifier. In order to learn the parameters of the model, we have developed a general training approach based on maximum likelihood that results in two efficient training algorithms. Compared to other classification mixture models, the proposed hierarchical model exhibits several advantages and provides improved classification performance as indicated by the experimental results", "fulltext": "", "keywords": "classification;experts classifier;posterior probability of class membership;bayes classifier;hierarchical mixture model;data generation process"} +{"name": "test_1970", "title": "eMarketing: restaurant Web sites that click", "abstract": "A number of global companies have adopted electronic commerce as a means of reducing transaction-related expenditures, connecting with current and potential customers, and enhancing revenues and profitability. If a restaurant is to have an Internet presence, what aspects of the business should be highlighted? Food service companies that have successfully ventured onto the web have employed assorted web-based technologies to create a powerful marketing tool of unparalleled strength. Historically, it has been difficult to create a set of criteria against which to evaluate website effectiveness. As practitioners consider additional resources for website development, the effectiveness of e-marketing investment becomes increasingly important. Care must be exercised to ensure that the quality of the site adheres to high standards and incorporates evolving technology, as appropriate.
Developing a coherent website strategy, including an effective website design, is proving critical to an effective web presence", "fulltext": "", "keywords": "food service companies;electronic commerce;internet presence;e-marketing;profitability;revenues;restaurant web sites"} +{"name": "test_1971", "title": "Exploring developments in Web based relationship marketing within the hotel industry", "abstract": "This paper provides a content analysis study of the application of World Wide Web marketing by the hotel industry. There is a lack of historical perspective on industry related Web marketing applications and this paper attempts to resolve this with a two-year follow-up case study of the changing use of the Web to develop different types of relationships. Specifically, the aims are: (1) to identify key changes in the way hotels are using the Web; (2) to look for evidence of the adoption of a relationship marketing (RM) model as a strategy for the development of hotel Web sites and the use of new technologies; and (3) to investigate the use of multimedia in hotel Web sites. The development and strategic exploitation of the Internet has transformed the basis of marketing. Using the evidence from a Web content survey, this study reveals the way relationships are being created and managed within the hotel industry by its use of the Web as a marketing tool. The authors have collected evidence by means of a descriptive study on the way hotels build and create relationships with their Web presence delivering multimedia information as well as channel and interactive means of communication. In addition a strategic framework is offered as the means to describe the mechanism and orientation of Web based marketing by hotels. The study utilizes a model by Gilbert (1996) as a means of developing a measurement instrument to allow a content analysis of the current approach by hotels to the development of Web sites. The results indicate hotels are aware of the new uses of Web technology and are promoting hotel products in the global electronic market in new and sophisticated ways", "fulltext": "", "keywords": "hotel web sites;multimedia;hotel industry;web content survey;world wide web marketing;global electronic market;web based relationship marketing"} +{"name": "test_1972", "title": "Online auctions: dynamic pricing and the lodging industry", "abstract": "The traditional channels of distribution for overnight accommodation are rapidly being displaced by Web site scripting, online intermediaries, and specialty brokers. Businesses that pioneered Internet usage relied on it as a sales and marketing alternative to predecessor product distribution channels. As such, Web sites bring the traditional trading model to the Internet. Web-enabled companies are popular because the medium renders the process faster, less costly, highly reliable, and secure. Auction-based models impact business models by converting the price setting mechanism from supplier-centric to market-centric and transforming the trading model from "one to many" to "many to many." Historically, pricing was based on the cost of production plus a margin of profit. Traditionally, as products and services move through the supply chain, from the producer to the consumer, various intermediaries added their share of profit to the price. As Internet based mediums of distribution become more prevalent, traditional pricing models are being supplanted with dynamic pricing.
A dynamic pricing model represents a flexible system that changes prices not only from product to product, but also from customer to customer and transaction to transaction. Many industry leaders are skeptical of the long-run impact of online auctions on lodging industry profit margins, despite the fact that pricing theory suggests that an increase in the flow of information results in efficient market pricing. The future of such endeavors remains promising, but controversial", "fulltext": "", "keywords": "sales;lodging industry;internet usage;price setting mechanism;specialty brokers;supply chain;overnight accommodations;dynamic pricing;online auctions;marketing;trading model;online intermediaries;web site scripting;business models"} +{"name": "test_1973", "title": "Affine invariants of convex polygons", "abstract": "In this correspondence, we prove that the affine invariants, for image registration and object recognition, proposed recently by Yang and Cohen (see ibid., vol.8, no.7, p.934-46, July 1999) are algebraically dependent. We show how to select an independent and complete set of the invariants. The use of this new set leads to a significant reduction of the computing complexity without decreasing the discrimination power", "fulltext": "", "keywords": "feature vector;object recognition;convex polygons;convex quadruplet;affine invariants;algebraically dependent invariants;complexity reduction;image registration"} +{"name": "test_1974", "title": "Real-time implementation of a new low-memory SPIHT image coding algorithm using DSP chip", "abstract": "Among all algorithms based on wavelet transform and zerotree quantization, Said and Pearlman's (1996) set partitioning in hierarchical trees (SPIHT) algorithm is well-known for its simplicity and efficiency. This paper deals with the real-time implementation of the SPIHT algorithm using a DSP chip. In order to facilitate the implementation and improve the codec's performance, some related issues are thoroughly discussed, such as the optimization of program structure to speed up the wavelet decomposition. SPIHT's high memory requirement is a major drawback for hardware implementation. In this paper, we modify the original SPIHT algorithm by presenting two new concepts: number of error bits and absolute zerotree. Consequently, the memory cost is significantly reduced. We also introduce a new method to control the coding process by the number of error bits. Our experimental results show that the implementation meets the common requirements of real-time video coding and is proven to be a practical and efficient DSP solution", "fulltext": "", "keywords": "zerotree quantization;absolute zerotree;codec;set partitioning in hierarchical trees;wavelet transform;number of error bits;wavelet decomposition;dsp chip;spiht algorithm;real-time implementation;video coding;memory cost reduction"} +{"name": "test_1975", "title": "Efficient computation of local geometric moments", "abstract": "Local moments have attracted attention as local features in applications such as edge detection and texture segmentation. The main reason for this is that they are inherently integral-based features, so that their use reduces the effect of uncorrelated noise. The computation of local moments, when viewed as a neighborhood operation, can be interpreted as a convolution of the image with a set of masks. Nevertheless, moments computed inside overlapping windows are not independent and convolution does not take this fact into account.
By introducing a matrix formulation and the concept of accumulation moments, this paper presents an algorithm which is computationally much more efficient than convolution and yet just as simple", "fulltext": "", "keywords": "edge detection;computationally efficient algorithm;accumulation moments;image convolution;neighborhood operation;matrix formulation;overlapping windows;integral-based features;texture segmentation;local features;local geometric moments computation;image analysis"} +{"name": "test_1976", "title": "Adaptive image denoising using scale and space consistency", "abstract": "This paper proposes a new method for image denoising with edge preservation, based on image multiresolution decomposition by a redundant wavelet transform. In our approach, edges are implicitly located and preserved in the wavelet domain, whilst image noise is filtered out. At each resolution level, the image edges are estimated by gradient magnitudes (obtained from the wavelet coefficients), which are modeled probabilistically, and a shrinkage function is assembled based on the model obtained. Joint use of space and scale consistency is applied for better preservation of edges. The shrinkage functions are combined to preserve edges that appear simultaneously at several resolutions, and geometric constraints are applied to preserve edges that are not isolated. The proposed technique produces a filtered version of the original image, where homogeneous regions appear separated by well-defined edges. Possible applications include image presegmentation and image denoising", "fulltext": "", "keywords": "edge preservation;gradient magnitudes;adaptive image denoising;redundant wavelet transform;image multiresolution decomposition;scale consistency;shrinkage function;image edges;geometric constraints;edge enhancement;space consistency"} +{"name": "test_1977", "title": "Tracking nonparameterized object contours in video", "abstract": "We propose a new method for contour tracking in video. The inverted distance transform of the edge map is used as an edge indicator function for contour detection. Using the concept of topographical distance, the watershed segmentation can be formulated as a minimization. This new viewpoint gives a way to combine the results of the watershed algorithm on different surfaces. In particular, our algorithm determines the contour as a combination of the current edge map and the contour predicted from the tracking result in the previous frame. We also show that the problem of background clutter can be relaxed by taking the object motion into account. The compensation with object motion allows us to detect and remove spurious edges in the background. The experimental results confirm the expected advantages of the proposed method over the existing approaches", "fulltext": "", "keywords": "watershed segmentation;contour tracking;edge detection;video;motion estimation;motion analysis;background clutter;nonparameterized object contours;minimization;edge indicator function;inverted distance transform;edge map;object motion;topographical distance"} +{"name": "test_1978", "title": "Multilayered image representation: application to image compression", "abstract": "The main contribution of this work is a new paradigm for image representation and image compression. We describe a new multilayered representation technique for images. An image is parsed into a superposition of coherent layers: piecewise smooth regions layer, textures layer, etc.
The multilayered decomposition algorithm consists of a cascade of compressions applied successively to the image itself and to the residuals that resulted from the previous compressions. During each iteration of the algorithm, we code the residual part in a lossy way: we only retain the most significant structures of the residual part, which results in a sparse representation. Each layer is encoded independently with a different transform, or basis, at a different bitrate, and the combination of the compressed layers can always be reconstructed in a meaningful way. The strength of the multilayer approach comes from the fact that different sets of basis functions complement each other: some of the basis functions will give a reasonable account of the large trend of the data, while others will catch the local transients, or the oscillatory patterns. This multilayered representation has many appealing applications in image understanding and in image and video coding. We have implemented the algorithm and we have studied its capabilities", "fulltext": "", "keywords": "image compression;transform coding;textures layer;wavelet transforms;sparse representation;residual part;cosine transforms;image representation;piecewise smooth regions layer;multilayered representation;multilayered decomposition algorithm;basis functions"} +{"name": "test_1979", "title": "Combining spatial and scale-space techniques for edge detection to provide a", "abstract": "spatially adaptive wavelet-based noise filtering algorithm New methods for detecting edges in an image using spatial and scale-space domains are proposed. A priori knowledge about geometrical characteristics of edges is used to assign a probability factor to the chance of any pixel being on an edge. An improved double thresholding technique is introduced for spatial domain filtering. Probabilities that pixels belong to a given edge are assigned based on pixel similarity across gradient amplitudes, gradient phases and edge connectivity. The scale-space approach uses dynamic range compression to allow wavelet correlation over a wider range of scales. A probabilistic formulation is used to combine the results obtained from filtering in each domain to provide a final edge probability image which has the advantages of both spatial and scale-space domain methods. Decomposing this edge probability image with the same wavelet as the original image permits the generation of adaptive filters that can recognize the characteristics of the edges in all wavelet detail and approximation images regardless of scale. These matched filters permit significant reduction in image noise without contributing to edge distortion. 
The spatially adaptive wavelet noise-filtering algorithm is qualitatively and quantitatively compared to a frequency-domain algorithm and two wavelet-based noise suppression algorithms using both natural and computer-generated noisy images", "fulltext": "", "keywords": "adaptive filters;pixel similarity;final edge probability image;noise suppression;gradient amplitudes;dynamic range compression;probabilistic formulation;probability factor;wavelet correlation;scale-space techniques;geometrical characteristics;spatially adaptive wavelet-based noise filtering algorithm;matched filters;approximation images;double thresholding technique;edge connectivity;spatially adaptive wavelet noise-filtering algorithm;spatial domain filtering;edge detection;gradient phases;a priori knowledge;image noise;spatial techniques"} +{"name": "test_198", "title": "Computational capacity of an odorant discriminator: the linear separability of", "abstract": "curves We introduce and study an artificial neural network inspired by the probabilistic receptor affinity distribution model of olfaction. Our system consists of N sensory neurons whose outputs converge on a single processing linear threshold element. The system's aim is to model discrimination of a single target odorant from a large number p of background odorants within a range of odorant concentrations. We show that this is possible provided p does not exceed a critical value p/sub c/ and calculate the critical capacity alpha /sub c/=p/sub c//N. The critical capacity depends on the range of concentrations in which the discrimination is to be accomplished. If the olfactory bulb may be thought of as a collection of such processing elements, each responsible for the discrimination of a single odorant, our study provides a quantitative analysis of the potential computational properties of the olfactory bulb. The mathematical formulation of the problem we consider is one of determining the capacity for linear separability of continuous curves, embedded in a large-dimensional space. This is accomplished here by a numerical study, using a method that signals whether the discrimination task is realizable, together with a finite-size scaling analysis", "fulltext": "", "keywords": "receptor affinity distribution;olfaction;odorant discriminator;linear separability;sensory neurons;artificial neural network;linear threshold element"} +{"name": "test_1980", "title": "Lossy to lossless object-based coding of 3-D MRI data", "abstract": "We propose a fully three-dimensional (3-D) object-based coding system exploiting the diagnostic relevance of the different regions of the volumetric data for rate allocation. The data are first decorrelated via a 3-D discrete wavelet transform. The implementation via the lifting steps scheme allows the mapping of integer-to-integer values, enabling lossless coding, and facilitates the definition of the object-based inverse transform. The coding process assigns disjoint segments of the bitstream to the different objects, which can be independently accessed and reconstructed at any up-to-lossless quality. Two fully 3-D coding strategies are considered: embedded zerotree coding (EZW-3D) and multidimensional layered zero coding (MLZC), both generalized for region of interest (ROI)-based processing. In order to avoid artifacts along region boundaries, some extra coefficients must be encoded for each object. This gives rise to an overhead in the bitstream with respect to the case where the volume is encoded as a whole. 
The amount of such extra information depends on both the filter length and the decomposition depth. The system is characterized on a set of head magnetic resonance images. Results show that MLZC and EZW-3D have competitive performances. In particular, the best MLZC mode outperforms the other state-of-the-art techniques on one of the datasets for which results are available in the literature", "fulltext": "", "keywords": "roi-based processing;object-based inverse transform;filter length;lossy object-based coding;head magnetic resonance images;integer-to-integer values;multidimensional layered zero coding;ezw-3d;lossless object-based coding;embedded zerotree coding;region boundaries;rate allocation;nmzq;3-d mri data;decomposition depth;3-d discrete wavelet transform;bitstream;lifting steps scheme;region of interest-based processing;disjoint segments;volumetric data;diagnostic relevance"} +{"name": "test_1981", "title": "Optimal linear control in stabilizer design", "abstract": "The most common method of improving the stability of the power system is the synthesis of the turbine and generator control systems, because of the high effectiveness and relatively low cost of these elements. The synthesis and construction of an effective synchronous generator and turbine controller is a very difficult task. This paper describes the seven-step mu -synthesis approach to PSS design enabling the synchronous generator to remain stable over a wide range of system operating conditions", "fulltext": "", "keywords": "mu -synthesis approach;pss design;optimal linear control;turbine control system synthesis;synchronous generator control system synthesis"} +{"name": "test_1982", "title": "Verifying resonant grounding in distribution systems", "abstract": "The authors describe RESFAL, a software tool that can check on the behavior of distribution network resonant grounding systems with regard to compensation coil tuning and to fault detection", "fulltext": "", "keywords": "compensation coil tuning;resfal software tool;fault detection;computer simulation;resonant grounding systems;power distribution systems"} +{"name": "test_1983", "title": "Power electronics spark new simulation challenges", "abstract": "This article discusses some of the changes that have taken place in power systems and explores some of the inherent requirements for simulation technologies in order to keep up with this rapidly changing environment. The authors describe how energy utilities are realizing that, with the appropriate tools, they can train and sustain engineers who can maintain a great insight into system dynamics", "fulltext": "", "keywords": "simulation challenges;simulation technologies;electric utilities;power system computer simulation;power electronics"} +{"name": "test_1984", "title": "Deriving model parameters from field test measurements [generator control", "abstract": "simulation] A major component of any power system simulation is the generating plant. 
The purpose of DeriveAssist is to speed up the parameter derivation process and to allow engineers less versed in parameter matching and identification to get involved in the process of power plant electric generator modelling", "fulltext": "", "keywords": "power system stability analysis;control simulation;computer simulation;parameter matching;parameter identification;turbine/governor;deriveassist;parameter derivation process;steady-state parameters derivation;generator parameter derivation process;power system simulation"} +{"name": "test_1985", "title": "Prospective on computer applications in power", "abstract": "The so-called \"deregulation\" and restructuring of the electric power industry have made it very difficult to keep up with industry changes and have made it much more difficult to envision the future. In this article, current key issues and major developments of the past few years are reviewed to provide perspective, and prospects for future computer applications in power are suggested. Technology changes are occurring at an exponential rate. The interconnected bulk electric systems are becoming integrated with vast networked information systems. This article discusses the skills that will be needed by future power engineers to keep pace with these developments and trends", "fulltext": "", "keywords": "technology changes;interconnected bulk electric systems;electricity industry restructuring;computer applications;networked information systems;electric power industry deregulation"} +{"name": "test_1986", "title": "Control centers are here to stay", "abstract": "Despite changes with different structures, market rules, and uncertainties, a control center must always be in place to maintain the security, reliability, and quality of electric service. This article focuses on the energy management system (EMS) control center, identifying the major functions that have become standard components of every application software package. The two most important control center functions, security control and load-following control, guarantee the continuity of electric service, which, after all, is the end-product of the utility business. New technology trends in the design of control center infrastructures are emerging in the liberalized environment of the energy market. An example of a control center infrastructure is described. The article ends with a concern for the security of the control center itself", "fulltext": "", "keywords": "ems control centers;load-following control;application software package;standard components;energy market;energy management system;electric service continuity;security control;control center infrastructures;liberalized environment"} +{"name": "test_1987", "title": "Virus hunting", "abstract": "We all appreciate the need for, and hopefully we have all deployed, anti-virus software. The good news is that AV software has come a long way fast. Four or so years ago it was true to write that AV software could not detect Trojan Horses and similar intrusion attempts. Now it can and does. McAfee's VirusScan, for example, goes one further; it detects viruses, worms and Trojan Horses and deploys itself as a firewall to filter data packets, control access to Internet resources, and activate rule sets for specific applications - in general, to protect against hackers. But like so much software, we use it with little thought as to how it came to do its job. 
Behind the scenes there is an army of top-notch programmers trying to stay ahead of the baddies who, at the last count, had produced some 60,000 viruses", "fulltext": "", "keywords": "anti-virus software;programmers;trojan horses;worms"} +{"name": "test_1988", "title": "Integration is key - an introduction to enterprise application integration", "abstract": "(EAI) technology Over the past few years, numerous organisations have invested in the latest software applications to drive their business forward. But many are now finding that these systems are becoming redundant on their own. The key to staying ahead of the competition in today's climate is now to integrate all of these systems, says Justin Opie, Portfolio Director at Imark Communications", "fulltext": "", "keywords": "imark communications;enterprise application integration"} +{"name": "test_1989", "title": "Managing system risk", "abstract": "Companies are increasingly required to provide assurance that their systems are secure and conform to commercial security standards. Senior business managers are ultimately responsible for the security of their corporate systems and for the implications in the event of a failure. Businesses will be exposed to unquantified security risks unless they have a formal risk management framework in place to enable risks to be identified, evaluated and managed. Failure to assess and manage risks can lead to a business suffering serious financial impacts, commercial embarrassment and fines or sanctions from regulators. This is both a key responsibility and opportunity for Management Services Practitioners", "fulltext": "", "keywords": "commercial security standards;risk management framework;it projects"} +{"name": "test_199", "title": "On optimality in auditory information processing", "abstract": "We study limits for the detection and estimation of weak sinusoidal signals in the primary part of the mammalian auditory system using a stochastic FitzHugh-Nagumo model and an action-recovery model for synaptic depression. Our overall model covers the chain from a hair cell to a point just after the synaptic connection with a cell in the cochlear nucleus. The information processing performance of the system is evaluated using so-called phi -divergences from statistics that quantify \"dissimilarity\" between probability measures and are intimately related to a number of fundamental limits in statistics and information theory (IT). We show that there exists a set of parameters that can optimize several important phi -divergences simultaneously and that this set corresponds to a constant quiescent firing rate (QFR) of the spiral ganglion neuron. The optimal value of the QFR is frequency dependent but is essentially independent of the amplitude of the signal (for small amplitudes). Consequently, optimal processing according to several standard IT criteria can be accomplished for this model if and only if the parameters are \"tuned\" to values that correspond to one and the same QFR. 
This offers a new explanation for the QFR and can provide new insight into the role played by several other parameters of the peripheral auditory system", "fulltext": "", "keywords": "stochastic fitzhugh-nagumo model;peripheral auditory system;mammalian auditory system;spiral ganglion neuron;action-recovery model;weak sinusoidal signals;brain;quiescent firing rate"} +{"name": "test_1990", "title": "Electrical facility construction work for information network structuring by", "abstract": "the use of sewage conduits With the advent of the advanced information society, there has been a pressing demand to adjust the communications infrastructure and structure the information network by utilizing sewage conduits. The City of Tokyo is promoting a project by the name of the sewer optical fiber teleway (SOFT) network plan. According to this plan, the total distance of the optical fiber network laid in the sewer conduits is scheduled to reach about 470 km by the end of March 2000. At the final stage, this distance will reach 800 km in total. We completed the construction work for the information control facilities scattered in 11 places, inclusive of the Treatment Site S, with the intention of adjusting and extending the information transmission network laid through the above-mentioned optical fiber network, to be used exclusively by the Bureau of Sewerage. This construction work is described in the paper", "fulltext": "", "keywords": "information control facilities;sewer optical fiber teleway network plan;communications infrastructure;atm switches;bureau of sewerage;asynchronous transmission mode switches;treatment site s;information network structuring;tokyo;information transmission network;sewage conduits;electrical facility construction work"} +{"name": "test_1991", "title": "A framework for evaluating the data-hiding capacity of image sources", "abstract": "An information-theoretic model for image watermarking and data hiding is presented in this paper. Previous theoretical results are used to characterize the fundamental capacity limits of image watermarking and data-hiding systems. Capacity is determined by the statistical model used for the host image, by the distortion constraints on the data hider and the attacker, and by the information available to the data hider, to the attacker, and to the decoder. We consider autoregressive, block-DCT, and wavelet statistical models for images and compute data-hiding capacity for compressed and uncompressed host-image sources. Closed-form expressions are obtained under sparse-model approximations. Models for geometric attacks and distortion measures that are invariant to such attacks are considered", "fulltext": "", "keywords": "uncompressed host-image sources;sparse-model approximations;information-theoretic model;data-hiding capacity;watermarking;distortion constraints;closed-form expressions;geometric attacks;block-dct statistical models;capacity limits;wavelet statistical models;autoregressive statistical models;statistical model;distortion measures;compressed host-image sources;image sources"} +{"name": "test_1992", "title": "Geometrically invariant watermarking using feature points", "abstract": "This paper presents a new approach for watermarking of digital images providing robustness to geometrical distortions. The weaknesses of classical watermarking methods with respect to geometrical distortions are outlined first. 
Geometrical distortions can be decomposed into two classes: global transformations, such as rotations and translations, and local transformations, such as the StirMark attack. An overview of existing self-synchronizing schemes is then presented. These schemes can use periodical properties of the mark, invariant properties of transforms, template insertion, or information provided by the original image to counter geometrical distortions. Thereafter, a new class of watermarking schemes using the image content is presented. We propose an embedding and detection scheme where the mark is bound with a content descriptor defined by salient points. Three different types of feature points are studied and their robustness to geometrical transformations is evaluated to develop an enhanced detector. The embedding of the signature is done by extracting feature points of the image and performing a Delaunay tessellation on the set of points. The mark is embedded using a classical additive scheme inside each triangle of the tessellation. The detection is done using correlation properties on the different triangles. The performance of the presented scheme is evaluated after JPEG compression, geometrical attack and transformations. Results show that the scheme is robust to these different manipulations. Finally, in our concluding remarks, we analyze the different perspectives of such a content-based watermarking scheme", "fulltext": "", "keywords": "transforms;feature extraction;content descriptor;jpeg compression;global transformations;geometrical attack;stirmark attack;additive scheme;geometrically invariant watermarking;rotations;digital images;periodical properties;invariant properties;delaunay tessellation;embedding;correlation properties;template insertion;detection scheme;geometrical distortions;translations;local transformations;image content;feature points;self-synchronizing schemes"} +{"name": "test_1993", "title": "Color plane interpolation using alternating projections", "abstract": "Most commercial digital cameras use color filter arrays to sample red, green, and blue colors according to a specific pattern. At the location of each pixel only one color sample is taken, and the values of the other colors must be interpolated using neighboring samples. This color plane interpolation is known as demosaicing; it is one of the important tasks in a digital camera pipeline. If demosaicing is not performed appropriately, images suffer from highly visible color artifacts. In this paper we present a new demosaicing technique that uses inter-channel correlation effectively in an alternating-projections scheme. We have compared this technique with six state-of-the-art demosaicing techniques, and it outperforms all of them, both visually and in terms of mean square error", "fulltext": "", "keywords": "color artifacts;digital cameras;color plane interpolation;alternating projections;demosaicing;color filter arrays;inter-channel correlation"} +{"name": "test_1994", "title": "A comparison of computational color constancy algorithms. II. Experiments with", "abstract": "image data For pt.I see ibid., vol. 11, no.9, p.972-84 (2002). We test a number of the leading computational color constancy algorithms using a comprehensive set of images. These were of 33 different scenes under 11 different sources representative of common illumination conditions. 
The algorithms studied include two gray world methods, a version of the Retinex method, several variants of Forsyth's (1990) gamut-mapping method, Cardei et al.'s (2000) neural net method, and Finlayson et al.'s color by correlation method (Finlayson et al. 1997, 2001; Hubel and Finlayson 2000). We discuss a number of issues in applying color constancy ideas to image data, and study in depth the effect of different preprocessing strategies. We compare the performance of the algorithms on image data with their performance on synthesized data. All data used for this study are available online at http://www.cs.sfu.ca/~color/data, and implementations for most of the algorithms are also available (http://www.cs.sfu.ca/~color/code). Experiments with synthesized data (part one of this paper) suggested that the methods which emphasize the use of the input data statistics, specifically color by correlation and the neural net algorithm, are potentially the most effective at estimating the chromaticity of the scene illuminant. Unfortunately, we were unable to realize comparable performance on real images. Here exploiting pixel intensity proved to be more beneficial than exploiting the details of image chromaticity statistics, and the three-dimensional (3-D) gamut-mapping algorithms gave the best performance", "fulltext": "", "keywords": "computational color constancy algorithms;input data statistics;retinex method;images;gray world methods;synthesized data;scene illuminant;illumination conditions;pixel intensity;image data;chromaticity;preprocessing strategies;gamut-mapping method;neural net method;color by correlation method"} +{"name": "test_1995", "title": "A comparison of computational color constancy algorithms. I: Methodology and", "abstract": "experiments with synthesized data We introduce a context for testing computational color constancy, specify our approach to the implementation of a number of the leading algorithms, and report the results of three experiments using synthesized data. Experiments using synthesized data are important because the ground truth is known, possible confounds due to camera characterization and pre-processing are absent, and various factors affecting color constancy can be efficiently investigated because they can be manipulated individually and precisely. The algorithms chosen for close study include two gray world methods, a limiting case of a version of the Retinex method, a number of variants of Forsyth's (1990) gamut-mapping method, Cardei et al.'s (2000) neural net method, and Finlayson et al.'s color by correlation method (Finlayson et al. 1997, 2001; Hubel and Finlayson 2000) . We investigate the ability of these algorithms to make estimates of three different color constancy quantities: the chromaticity of the scene illuminant, the overall magnitude of that illuminant, and a corrected, illumination invariant, image. We consider algorithm performance as a function of the number of surfaces in scenes generated from reflectance spectra, the relative effect on the algorithms of added specularities, and the effect of subsequent clipping of the data. 
All data is available on-line at http://www.cs.sfu.ca/~color/data, and implementations for most of the algorithms are also available (http://www.cs.sfu.ca/~color/code)", "fulltext": "", "keywords": "computational color constancy algorithms;retinex method;gray world methods;synthesized data;reflectance spectra;scene illuminant;specularities;algorithm performance;chromaticity;clipping;illumination invariant image;gamut-mapping method;neural net method;color by correlation method"} +{"name": "test_1996", "title": "Quality image metrics for synthetic images based on perceptual color", "abstract": "differences Due to the improvement of image rendering processes and the increasing importance of quantitative comparisons among synthetic color images, it is essential to define perceptually based metrics which make it possible to objectively assess the visual quality of digital simulations. In response to this need, this paper proposes a new methodology for the determination of an objective image quality metric, and gives an answer to this problem through three metrics. This methodology is based on the LLAB color space for perception of color in complex images, a modification of the CIELab1976 color space. The first metric proposed is a pixel by pixel metric which introduces a local distance map between two images. The second metric associates a global value with a pair of images. Finally, the third metric uses a recursive subdivision of the images to obtain an adaptive distance map, rougher but less expensive to compute than the first method", "fulltext": "", "keywords": "cielab1976 color space;perceptual color differences;local distance map;recursive subdivision;pixel by pixel metric;llab color space;digital simulations;synthetic images;image rendering;visual quality;global value;color images;quality image metrics;perceptually based metrics;adaptive distance map"} +{"name": "test_1997", "title": "Exact controllability of shells in minimal time", "abstract": "We prove an exact controllability result for thin cups using the Fourier method and recent improvements of Ingham (1936) type theorems", "fulltext": "", "keywords": "controllability;young modulus;hilbert space;minimal time;ingham type theorems;partial differential equations;shells;fourier method;thin cups"} +{"name": "test_1998", "title": "A friction compensator for pneumatic control valves", "abstract": "A procedure that compensates for static friction (stiction) in pneumatic control valves is presented. The compensation is obtained by adding pulses to the control signal. The characteristics of the pulses are determined from the control action. The compensator is implemented in industrial controllers and control systems, and the industrial experiences show that the procedure reduces the control error during stick-slip motion significantly compared to standard control without stiction compensation", "fulltext": "", "keywords": "control error reduction;stick-slip motion;stiction compensation;static friction compensation;friction compensator;pneumatic control valves;industrial controllers;standard control"} +{"name": "test_1999", "title": "Performance comparison between PID and dead-time compensating controllers", "abstract": "This paper is intended to answer the question: \"When can a simple dead-time compensator be expected to perform better than a PID?\". The performance criterion used is the integrated absolute error (IAE). 
It is compared for PI and PID controllers and a simple dead-time compensator (DTC) when a step load disturbance is applied at the plant input. Both stable and integrating processes are considered. For a fair comparison, the controllers should provide equal robustness in some sense. Here, as a measure of robustness, the H/sub infinity / norm of the sum of the absolute values of the sensitivity function and the complementary sensitivity function is used. Performance of the DTCs is also given as a function of the dead-time margin (D/sub M/)", "fulltext": "", "keywords": "dead-time compensator;stable processes;dead-time margin;pi controllers;dead-time compensating controllers;performance comparison;dtc;pid controllers;equal robustness;absolute value sum h/sub infinity / norm;iae;performance criterion;integrating processes;complementary sensitivity function;step load disturbance;integrated absolute error"} +{"name": "test_2", "title": "Waiting for the wave to crest [wavelength services]", "abstract": "Wavelength services have been hyped ad nauseam for years. But despite their quick turn-up time and impressive margins, such services have yet to live up to the industry's expectations. The reasons for this lukewarm reception are many, not the least of which is the confusion that still surrounds the technology, but most industry observers are still convinced that wavelength services will ultimately flourish", "fulltext": "", "keywords": "fiber optic networks;wavelength services;looking glass networks;pointeast research"} +{"name": "test_20", "title": "Adaptive state feedback control for a class of linear systems with unknown bounds of uncertainties", "abstract": "The problem of adaptive robust stabilization for a class of linear time-varying systems with disturbance and nonlinear uncertainties is considered. The bounds of the disturbance and uncertainties are assumed to be unknown and may even be arbitrary. For such uncertain dynamical systems, the adaptive robust state feedback controller is obtained. The resulting closed-loop systems are asymptotically stable in theory. Moreover, an adaptive robust state feedback control scheme is given. The scheme ensures that the closed-loop systems are exponentially practically stable and can be used in practical engineering. Finally, simulations show that the control scheme is effective", "fulltext": "", "keywords": "closed-loop systems;nonlinear uncertainties;state feedback;linear time-varying systems;adaptive stabilization;robust control;adaptive controller;uncertain systems;uncertain dynamical systems;robust stabilization"} +{"name": "test_200", "title": "Preintegration lateral inhibition enhances unsupervised learning", "abstract": "A large and influential class of neural network architectures uses postintegration lateral inhibition as a mechanism for competition. We argue that these algorithms are computationally deficient in that they fail to generate, or learn, appropriate perceptual representations under certain circumstances. An alternative neural network architecture is presented here in which nodes compete for the right to receive inputs rather than for the right to generate outputs. This form of competition, implemented through preintegration lateral inhibition, does provide appropriate coding properties and can be used to learn such representations efficiently. Furthermore, this architecture is consistent with both neuroanatomical and neuropsychological data. 
We thus argue that preintegration lateral inhibition has computational advantages over conventional neural network architectures while remaining equally biologically plausible", "fulltext": "", "keywords": "neural network;unsupervised learning;competition;preintegration lateral inhibition;neural network architectures;postintegration lateral inhibition"} +{"name": "test_2000", "title": "Generalized predictive control for non-uniformly sampled systems", "abstract": "In this paper, we study digital control systems with non-uniform updating and sampling patterns, which include multirate sampled-data systems as special cases. We derive lifted models in the state-space domain. The main obstacle for generalized predictive control (GPC) design using the lifted models is the so-called causality constraint. Taking into account this design constraint, we propose a new GPC algorithm, which results in optimal causal control laws for the non-uniformly sampled systems. The solution applies immediately to multirate sampled-data systems where rates are integer multiples of some base period", "fulltext": "", "keywords": "optimal causal control laws;nonuniformly sampled systems;gpc;generalized predictive control design;causality constraint;nonuniform sampling patterns;state-space models;digital control systems;nonuniform updating patterns;multirate sampled-data systems;integer multiples"} +{"name": "test_2001", "title": "A simple graphic approach for observer decomposition", "abstract": "Based upon the proposition that the roles of inputs and outputs in a physical system and those in the corresponding output-injection observer do not really have to be consistent, a systematic procedure is developed in this work to properly divide a set of sparse system models and measurement models into a number of independent subsets with the help of a visual aid. Several smaller sub-observers can then be constructed accordingly to replace the original one. The size of each sub-observer may be further reduced by strategically selecting one or more appended states. These techniques are shown to be quite effective in relieving on-line computation load of the output-injection observers and also in identifying detectable sub-systems", "fulltext": "", "keywords": "observer decomposition;independent subsets;measurement models;detectable subsystems;sub-observers;online computation load;sparse system models;output-injection observer;graphic approach"} +{"name": "test_2002", "title": "A new subspace identification approach based on principal component analysis", "abstract": "Principal component analysis (PCA) has been widely used for monitoring complex industrial processes with multiple variables and diagnosing process and sensor faults. The objective of this paper is to develop a new subspace identification algorithm that gives consistent model estimates under the errors-in-variables (EIV) situation. In this paper, we propose a new subspace identification approach using principal component analysis. PCA naturally falls into the category of EIV formulation, which resembles total least squares and allows for errors in both process input and output. We propose to use PCA to determine the system observability subspace, the matrices and the system order for an EIV formulation. Standard PCA is modified with instrumental variables in order to achieve consistent estimates of the system matrices. 
The proposed subspace identification method is demonstrated using a simulated process and a real industrial process for model identification and order determination. For comparison the MOESP algorithm and N4SID algorithm are used as benchmarks to demonstrate the advantages of the proposed PCA based subspace model identification (SMI) algorithm", "fulltext": "", "keywords": "system observability subspace;n4sid algorithm;principal component analysis;subspace model identification;smi;complex industrial process monitoring;eiv situation;sensor fault diagnosis;moesp algorithm;process fault diagnosis;subspace identification approach;errors-in-variables situation;total least-squares approximation;consistent system matrix estimates;pca"} +{"name": "test_2003", "title": "Nonlinear modeling and adaptive fuzzy control of MCFC stack", "abstract": "To improve availability and performance of fuel cells, the operating temperature of the molten carbonate fuel cells (MCFC) stack should be controlled within a specified range. However, most existing models of MCFC are not ready to be applied in synthesis. In the paper, a radial basis function neural networks identification model of a MCFC stack is developed based on the input-output sampled data. An adaptive fuzzy control procedure for the temperature of the MCFC stack is also developed. The parameters of the fuzzy control system are regulated by back-propagation algorithm, and the rule database of the fuzzy system is also adaptively adjusted by the nearest-neighbor-clustering algorithm. Finally using the neural networks model of MCFC stack, the simulation results of the control algorithm are presented. The results show the effectiveness of the proposed modeling and design procedures for the MCFC stack based on neural networks identification and the novel adaptive fuzzy control", "fulltext": "", "keywords": "adaptive fuzzy control;nonlinear modeling;fuel cells;molten carbonate fuel cells stack;input-output sampled data;radial basis function neural networks identification model;rule database;mcfc stack;nearest-neighbor-clustering algorithm;backpropagation algorithm"} +{"name": "test_2004", "title": "New paradigms for interactive 3D volume segmentation", "abstract": "We present a new virtual reality-based interaction metaphor for semi-automatic segmentation of medical 3D volume data. The mouse-based, manual initialization of deformable surfaces in 3D represents a major bottleneck in interactive segmentation. In our multi-modal system we enhance this process with additional sensory feedback. A 3D haptic device is used to extract the centreline of a tubular structure. Based on the obtained path a cylinder with varying diameter is generated, which in turn is used as the initial guess for a deformable surface", "fulltext": "", "keywords": "tubular structure;virtual reality;interaction metaphor;medical image segmentation;sensory feedback;3d haptic device;multi-modal system;deformable surface;haptic interaction;interactive 3d volume segmentation;mouse;deformable surfaces;interactive segmentation;varying diameter cylinder"} +{"name": "test_2005", "title": "State-of-the-art in orthopaedic surgical navigation with a focus on medical", "abstract": "image modalities This paper presents a review of surgical navigation systems in orthopaedics and categorizes these systems according to the image modalities that are used for the visualization of surgical action. 
Medical images have for many years been an essential part of surgical education and documentation, as well as of diagnosis and operation planning. With the recent introduction of navigation techniques in orthopaedic surgery, a new field of application has been opened. Today, surgical navigation systems - also known as image-guided surgery systems - are available for various applications in orthopaedic surgery. They visualize the position and orientation of surgical instruments as graphical overlays onto a medical image of the operated anatomy on a computer monitor. Preoperative image data such as computed tomography scans or intraoperatively generated images (for example, ultrasonic, endoscopic or fluoroscopic images) are suitable for this purpose. A new category of medical images termed 'surgeon-defined anatomy' has been developed that relies exclusively upon the use of navigation technology. Points on the anatomy are digitized interactively by the surgeon and are used to build up an abstract geometrical model of the bony structures to be operated on. This technique may be used when no other image data is available or appropriate for a given application", "fulltext": "", "keywords": "medical image modalities;computed tomography scans;surgical action visualization;surgeon-defined anatomy;bony structures;surgical instruments;intraoperatively generated images;medical image processing;image-guided surgery systems;abstract geometrical model;surgical education;computer monitor;image registration;graphical overlays;orthopaedic surgical navigation"} +{"name": "test_2006", "title": "Lung metastasis detection and visualization on CT images: a knowledge-based", "abstract": "method A solution to the problem of lung metastasis detection on computed tomography (CT) scans of the thorax is presented. A knowledge-based top-down approach for image interpretation is used. The method is inspired by the manner in which a radiologist and radiotherapist interpret CT images before radiotherapy is planned. A two-dimensional followed by a three-dimensional analysis is performed. The algorithm first detects the thorax contour, the lungs and the ribs, which further help the detection of metastases. Thus, two types of tumors are detected: nodules and metastases located at the lung extremities. A method to visualize the anatomical structures segmented is also presented. The system was tested on 20 patients (988 total images) from the Oncology Department of La Chaux-de-Fonds Hospital and the results show that the method is reliable as a computer-aided diagnostic tool for clinical purpose in an oncology department", "fulltext": "", "keywords": "image interpretation;data visualization;thorax;computer-aided diagnostic tool;computed tomography;knowledge-based top-down approach;three-dimensional analysis;knowledge representation;medical imaging;ct images;oncology;two-dimensional analysis;lung metastasis detection"} +{"name": "test_2007", "title": "The creation of a high-fidelity finite element model of the kidney for use in", "abstract": "trauma research A detailed finite element model of the human kidney for trauma research has been created directly from the National Library of Medicine Visible Human Female (VHF) Project data set. An image segmentation and organ reconstruction software package has been developed and employed to transform the 2D VHF images into a 3D polygonal representation. 
Nonuniform rational B-spline (NURBS) surfaces were then mapped to the polygonal surfaces, and were finally utilized to create a robust 3D hexahedral finite element mesh within a commercially available meshing software. The model employs a combined viscoelastic and hyperelastic material model to successfully simulate the behaviour of biological soft tissues. The finite element model was then validated for use in biomechanical research", "fulltext": "", "keywords": "high-fidelity finite element model;image segmentation;2d vhf images;hyperelastic material model;3d hexahedral finite element mesh;trauma research;medical data set;polygonal surfaces;3d polygonal representation;visible human female project;software package;nurbs;kidney;nonuniform rational b-spline surfaces;national library of medicine;organ reconstruction;viscoelastic model;biomechanical research;physically based animation;biological soft tissues"} +{"name": "test_2008", "title": "Building 3D anatomical scenes on the Web", "abstract": "We propose a new service for building user-defined 3D anatomical structures on the Web. The Web server is connected to a database storing more than 1000 3D anatomical models reconstructed from the Visible Human. Users may combine existing models as well as planar oblique slices in order to create their own structured anatomical scenes. Furthermore, they may record sequences of scene construction and visualization actions. These actions enable the server to construct high-quality video animations, downloadable by the user. Professionals and students in anatomy, medicine and related disciplines are invited to use the server and create their own anatomical scenes", "fulltext": "", "keywords": "3d anatomical scenes;high-quality video animation;structured anatomical scenes;web server;user-defined 3d anatomical structures;planar oblique slices;visualization;volume visualization;database;scene construction;visible human;applet-based rendering engine;java;world wide web;surface reconstruction;3d anatomical models"} +{"name": "test_2009", "title": "A survey of interactive mesh-cutting techniques and a new method for", "abstract": "implementing generalized interactive mesh cutting using virtual tools In our experience, mesh-cutting methods can be distinguished by how their solutions address the following major issues: definition of the cut path, primitive removal and re-meshing, number of new primitives created, when re-meshing is performed, and representation of the cutting tool. Many researchers have developed schemes for interactive mesh cutting with the goals of reducing the number of new primitives created, creating new primitives with good aspect ratios, avoiding a disconnected mesh structure between primitives in the cut path, and representing the path traversed by the tool as accurately as possible. The goal of this paper is to explain how, by using a very simple framework, one can build a generalized cutting scheme. This method allows for any arbitrary cut to be made within a virtual object, and can simulate cutting surface, layered surface or tetrahedral objects using a virtual scalpel, scissors, or loop cautery tool. 
This method has been implemented in a real-time, haptic-rate surgical simulation system allowing arbitrary cuts to be made on high-resolution patient-specific models", "fulltext": "", "keywords": "cutting tool;virtual object;disconnected mesh structure;cut path definition;rendering;high-resolution patient-specific models;haptic-rate surgical simulation system;re-meshing;layered surface;haptic interfaces;tetrahedral objects;virtual tools;generalized interactive mesh cutting;real-time system"} +{"name": "test_201", "title": "Correction to construction of panoramic image mosaics with global and local", "abstract": "alignment For original paper see ibid., vol. 36, no. 2, p. 101-30 (2000). The authors had given a method for the construction of panoramic image mosaics with global and local alignment. Unfortunately, a mistake led to an incorrect equation; whilst this makes little difference in many cases, the correct formulae given here should be used for faster (and assured) convergence", "fulltext": "", "keywords": "global alignment;panoramic image mosaics;local alignment;resampled image"} +{"name": "test_2010", "title": "Scale-invariant segmentation of dynamic contrast-enhanced perfusion MR images", "abstract": "with inherent scale selection Selection of the best set of scales is problematic when developing signal-driven approaches for pixel-based image segmentation. Often, different possibly conflicting criteria need to be fulfilled in order to obtain the best trade-off between uncertainty (variance) and location accuracy. The optimal set of scales depends on several factors: the noise level present in the image material, the prior distribution of the different types of segments, the class-conditional distributions associated with each type of segment as well as the actual size of the (connected) segments. We analyse, theoretically and through experiments, the possibility of using the overall and class-conditional error rates as criteria for selecting the optimal sampling of the linear and morphological scale spaces. It is shown that the overall error rate is optimized by taking the prior class distribution in the image material into account. However, a uniform (ignorant) prior distribution ensures constant class-conditional error rates. Consequently, we advocate for a uniform prior class distribution when an uncommitted, scale-invariant segmentation approach is desired. Experiments with a neural net classifier developed for segmentation of dynamic magnetic resonance (MR) images, acquired with a paramagnetic tracer, support the theoretical results. Furthermore, the experiments show that the addition of spatial features to the classifier, extracted from the linear or morphological scale spaces, improves the segmentation result compared to a signal-driven approach based solely on the dynamic MR signal. 
The segmentation results obtained from the two types of features are compared using two novel quality measures that characterize spatial properties of labelled images", "fulltext": "", "keywords": "scale-invariant segmentation;inherent scale selection;class-conditional error rates;neural net classifier;dynamic magnetic resonance images;optimal sampling;class-conditional distributions;pixel-based image segmentation;labelled images;quality measures;dynamic contrast-enhanced perfusion mr images;experiments;paramagnetic tracer;noise level"} +{"name": "test_2011", "title": "Innovative phase unwrapping algorithm: hybrid approach", "abstract": "We present a novel algorithm based on a hybrid of the global and local treatment of a wrapped map. The proposed algorithm is especially effective for the unwrapping of speckle-coded interferogram contour maps. In contrast to earlier unwrapping algorithms by region, we propose a local discontinuity-restoring criterion to serve as the preprocessor or postprocessor of our hybrid algorithm, which makes the unwrapping by region much easier and more efficient. With this hybrid algorithm, a robust, stable, and especially time effective phase unwrapping can be achieved. Additionally, the criterion and limitation of this hybrid algorithm are fully described. The robustness, stability, and speed of this hybrid algorithm are also studied. The proposed algorithm can be easily upgraded with minor modifications to solve the unwrapping problem of maps with phase inconsistency. Both numerical simulation and experimental applications demonstrate the effectiveness of the proposed algorithm", "fulltext": "", "keywords": "light interferometry;postprocessor;phase inconsistency;local treatment;hybrid algorithm;global treatment;wrapped map;speckle-coded interferogram contour maps;interferogram analysis;local discontinuity-restoring criterion;robust stable time effective phase unwrapping;unwrapping algorithms;phase unwrapping algorithm;unwrapping problem;numerical simulation"} +{"name": "test_2012", "title": "Strain contouring using Gabor filters: principle and algorithm", "abstract": "Moire interferometry is a powerful technique for high sensitivity in-plane deformation contouring. However, from an engineering viewpoint, the derivatives of displacement, i.e., strain, are the desired parameter. Thus there is a need to differentiate the displacement field. Optical and digital methods have been proposed for this differentiation. Optical methods provide contours that still need to be quantified, while digital methods suffer from drawbacks inherent in the digital differentiation process. We describe a novel approach to strain segmentation for the moire pattern using a multichannel Gabor filter. 
Appropriate filter design allows for user-specific segmentation, which is essential in engineering design and analysis", "fulltext": "", "keywords": "spatial filters;engineering design;algorithm;strain contouring;strain segmentation;differentiation;image segmentation;displacement;moire interferometry;digital differentiation process;high sensitivity in-plane deformation contouring;engineering analysis;multichannel gabor filter;filter design;displacement field;optical methods;digital methods;user-specific segmentation;gabor filters"} +{"name": "test_2013", "title": "Novel denoising algorithm for obtaining a superresolved position estimation", "abstract": "We present a new algorithm that uses the randomness of the noise pattern to achieve high positioning accuracy by applying a modified averaging operation. Using the suggested approach, the noise sensitivity of the positioning accuracy can be significantly reduced. This improved algorithm can enhance the performance of tracking systems used for military as well as civil applications. The concept is demonstrated theoretically as well as by optical experiment", "fulltext": "", "keywords": "civil applications;noise sensitivity;superresolved position estimation;noise pattern randomness;optical experiment;modified averaging operation;denoising algorithm;military applications;tracking systems;high positioning accuracy"} +{"name": "test_2014", "title": "Adaptive filtering for noise reduction in hue saturation intensity color space", "abstract": "Even though the hue saturation intensity (HSI) color model has been widely used in color image processing and analysis, the conversion formulas from the RGB color model to HSI are nonlinear and complicated in comparison with the conversion formulas of other color models. When an RGB image is degraded by random Gaussian noise, this nonlinearity leads to a nonuniform noise distribution in HSI, making accurate image analysis more difficult. We have analyzed the noise characteristics of the HSI color model and developed an adaptive spatial filtering method to reduce the magnitude of noise and the nonuniformity of noise variance in the HSI color space. With this adaptive filtering method, the filter kernel for each pixel is dynamically adjusted, depending on the values of intensity and saturation. In our experiments we have filtered the saturation and hue components and generated edge maps from color gradients. We have found that by using the adaptive filtering method, the minimum error rate in edge detection improves by approximately 15%", "fulltext": "", "keywords": "saturation;adaptive spatial filtering method;rgb color model;filter kernel;hue saturation intensity color space;random gaussian noise;color gradients;accurate image analysis;intensity;pixel;noise reduction;color image processing;nonuniformity;nonuniform noise distribution;generated edge maps;edge detection;adaptive filtering;hsi color space;noise variance;color image analysis;minimum error rate"} +{"name": "test_2015", "title": "Optical recognition of three-dimensional objects with scale invariance using a", "abstract": "classical convergent correlator We present a real-time method for recognizing three-dimensional (3-D) objects with scale invariance. The 3-D information of the objects is codified in deformed fringe patterns using the Fourier transform profilometry technique and is correlated using a classical convergent correlator. 
The scale invariance property is achieved using two different approaches: the Mellin radial harmonic decomposition and the logarithmic radial harmonic filter. Thus, the method is invariant to changes in the scale of the 3-D target within a defined interval of scale factors. Experimental results show the utility of the proposed method", "fulltext": "", "keywords": "scale invariance;mellin radial harmonic decomposition;logarithmic radial harmonic filter;scale factors;3-d information;scale invariance property;3d object recognition;classical convergent correlator;deformed fringe patterns;real-time method;fourier transform profilometry technique;optical recognition;invariant"} +{"name": "test_2016", "title": "Fully automatic algorithm for region of interest location in camera calibration", "abstract": "We present an automatic method for region of interest (ROI) location in camera calibration used in computer vision inspection. An intelligent ROI location algorithm based on the Radon transform is developed to automate the calibration process. The algorithm remains robust even if the anchor target has a notable rotation angle in the target plane. This method functions well even if the anchor target is not carefully positioned. Several improvement methods are studied to avoid the algorithm's huge time/space consumption problem. The algorithm runs about 100 times faster if these improvement methods are applied. Using this method, fully automatic camera calibration is achieved without human interactive ROI specification. Experiments show that this algorithm can help to calibrate the intrinsic parameters of the zoom lens and the camera parameters quickly and automatically", "fulltext": "", "keywords": "roi location algorithm;fully automatic camera calibration;human interactive specification;region of interest location;computer vision inspection;interest location;camera calibration;intrinsic parameters;zoom lens;fully automatic algorithm;radon transform;rotation angle;time/space consumption problem;camera parameters;calibration process"} +{"name": "test_2017", "title": "Autofocus system for microscope", "abstract": "A technique called the eccentric light beam approach is developed for microscope autofocusing, offering high resolution, a wide focusing range, and compact construction. The principle is described. The theoretical formula deduced for the eccentric light beam approach can be applied not only to an object lens whose objective plane is just at the focal plane, but also to an object lens whose objective plane is not at the focal plane. The experimental setup uses a semiconductor laser device as the light source. The laser beam that enters the microscope is eccentric with respect to the main light axis. A defocused signal is acquired by a symmetrical silicon photocell from the change of the reflected light position, extracted by differential amplification, and processed by a microprocessor. Then the electric signal is power-amplified and drives a dc motor, which moves a fine working platform to bring the microscope into focus automatically. The result of the experiments shows a +or-0.1- mu m precision of autofocusing for a range of +or-500- mu m defocusing. 
The system has high reliability and can meet the requirements of various accurate micro measurement systems", "fulltext": "", "keywords": "object lens;micro measurement systems;semiconductor laser;eccentric light beam approach;main light axis;objective plane;high reliability;dc motor;reflected light position;differential amplification;microscope autofocusing;microprocessor;symmetrical silicon photocell;fine working platform;autofocus system;defocused signal;power-amplified electric signal"} +{"name": "test_2018", "title": "Design and implementation of a 3-D mapping system for highly irregular shaped", "abstract": "objects with application to semiconductor manufacturing The basic technology for a robotic system is developed to automate the packing of polycrystalline silicon nuggets into a fragile fused silica crucible in Czochralski (melt pulling) semiconductor wafer production. The highly irregular shapes of the nuggets and the packing constraints make this a difficult and challenging task. It requires the delicate manipulation and packing of highly irregular polycrystalline silicon nuggets into a fragile fused silica crucible. For this application, a dual optical 3-D surface mapping system that uses active laser triangulation has been developed and successfully tested. One part of the system measures the geometry profile of a nugget being packed and the other the profile of the nuggets already in the crucible. A resolution of 1 mm with 15-kHz sampling frequency is achieved. Data from the system are used by the packing algorithm, which determines optimal nugget placement. The key contribution is to describe the design and implementation of an efficient and robust 3-D imaging system to map highly irregular shaped objects using conventional components in the context of real commercial manufacturing processes", "fulltext": "", "keywords": "fragile fused silica crucible;sampling frequency;3d mapping system;highly irregular shaped objects;polycrystalline silicon nuggets;highly irregular polycrystalline silicon nuggets;irregular shaped objects;optimal nugget placement;active laser triangulation;robust 3-d imaging system;packing algorithm;semiconductor manufacturing;robotic system;commercial manufacturing processes;dual optical 3d surface mapping system;czochralski semiconductor wafer production"} +{"name": "test_2019", "title": "Effective moving cast shadow detection for monocular color traffic image", "abstract": "sequences For an accurate scene analysis using monocular color traffic image sequences, a robust segmentation of moving vehicles from the stationary background is generally required. However, the presence of moving cast shadow may lead to an inaccurate vehicle segmentation and, as a result, to further erroneous scene analysis. We propose an effective method for the detection of moving cast shadow. By observing the characteristics of cast shadow in the luminance, chrominance, gradient density, and geometry domains, a combined probability map, called a shadow confidence score (SCS), is obtained. From the edge map of the input image, each edge pixel is examined to determine whether it belongs to the vehicle region based on its neighboring SCSs. The cast shadow is identified as those regions with high SCSs, which are outside the convex hull of the selected vehicle edge pixels.
The proposed method is tested on 100 vehicle images taken under different lighting conditions (sunny and cloudy), viewing angles (roadside and overhead), vehicle sizes (small, medium, and large), and colors (similar to the road and not). The results indicate that an average error rate of around 14% is obtained while the lowest error rate is around 3% for large vehicles", "fulltext": "", "keywords": "moving vehicles;image segmentation;geometry domains;luminance;effective moving cast shadow detection;gradient density;vehicle images;lighting conditions;viewing angles;combined probability map;cloudy;robust segmentation;input image;accurate scene analysis;convex hull;erroneous scene analysis;shadow confidence score;inaccurate vehicle segmentation;average error rate;sunny;chrominance;cast shadow;monocular color traffic image sequences;moving cast shadow;vehicle sizes;selected vehicle edge pixels;stationary background"} +{"name": "test_202", "title": "Estimation of error in curvature computation on multi-scale free-form surfaces", "abstract": "A novel technique for multi-scale curvature computation on a free-form 3-D surface is presented. This is achieved by convolving local parametrisations of the surface with 2-D Gaussian filters iteratively. In our technique, semigeodesic coordinates are constructed at each vertex of the mesh. Smoothing results are shown for 3-D surfaces with different shapes, indicating that surface noise is eliminated and surface details are removed gradually. A number of evolution properties of 3-D surfaces are described. Next, the surface Gaussian and mean curvature values are estimated accurately at multiple scales which are then mapped to colours and displayed directly on the surface. The performance of the technique when selecting different directions as an arbitrary direction for the geodesic at each vertex is also presented. The results indicate that the error observed for the estimation of Gaussian and mean curvatures is quite low after only one iteration. Furthermore, as the surface is smoothed iteratively, the error is further reduced. The results also show that the estimation error of Gaussian curvature is less than that of mean curvature. Our experiments demonstrate that estimation of smoothed surface curvatures is very accurate and not affected by the arbitrary direction of the first geodesic line when constructing semigeodesic coordinates. Our technique is independent of the underlying triangulation and is also more efficient than volumetric diffusion techniques since 2-D rather than 3-D convolutions are employed. Finally, the method presented here is a generalisation of the Curvature Scale Space method for 2-D contours. The CSS method has outperformed comparable techniques within the MPEG-7 evaluation framework. As a result, it has been selected for inclusion in the MPEG-7 package of standards", "fulltext": "", "keywords": "mean curvature values;semigeodesic coordinates;2d gaussian filters;mpeg-7 evaluation framework;curvature scale space method;surface gaussian values;multi-scale curvature computation;volumetric diffusion techniques;evolution properties;free-form 3d surface;convolutions;surface noise;underlying triangulation;local parametrisations"} +{"name": "test_2020", "title": "Restoration of broadband imagery steered with a liquid-crystal optical phased", "abstract": "array In many imaging applications, it is highly desirable to replace mechanical beam-steering components (i.e., mirrors and gimbals) with a nonmechanical device.
One such device is a nematic liquid crystal optical phased array (LCOPA). An LCOPA can implement a blazed phase grating to steer the incident light. However, when a phase grating is used in a broadband imaging system, two adverse effects can occur. First, dispersion will cause different incident wavelengths arriving at the same angle to be steered to different output angles, causing chromatic aberrations in the image plane. Second, the device will steer energy not only to the first diffraction order, but to others as well. This multiple-order effect results in multiple copies of the scene appearing in the image plane. We describe a digital image restoration technique designed to overcome these degradations. The proposed postprocessing technique is based on a Wiener deconvolution filter. The technique, however, is applicable only to scenes containing objects with approximately constant reflectivities over the spectral region of interest. Experimental results are presented to demonstrate the effectiveness of this technique", "fulltext": "", "keywords": "liquid-crystal optical phased array steering;mirrors;output angles;multiple copies;mechanical beam-steering components;digital image restoration technique;gimbals;halogen lamp;nematic liquid crystal optical phased array;first diffraction order;broadband imagery;nonmechanical device;approximately constant reflectivities;incident wavelengths;chromatic aberrations;optical phased array;dispersion;wiener deconvolution filter;broadband imaging system;incident light steering;multiple-order effect;blazed phase grating;image plane;imaging applications;spectral region of interest;postprocessing technique"} +{"name": "test_2021", "title": "One-step digit-set-restricted modified signed-digit adder using an incoherent", "abstract": "correlator based on a shared content-addressable memory An efficient one-step digit-set-restricted modified signed-digit (MSD) adder based on symbolic substitution is presented. In this technique, carry propagation is avoided by introducing reference digits to restrict the intermediate carry and sum digits to {1,0} and {0,1}, respectively. The proposed technique requires significantly fewer minterms and simplifies system complexity compared to the reported one-step MSD addition techniques. An incoherent correlator based on an optoelectronic shared content-addressable memory processor is suggested to perform the addition operation. In this technique, only one set of minterms needs to be stored, independent of the operand length", "fulltext": "", "keywords": "shared content-addressable memory;incoherent correlator;intermediate carry;one-step digit-set-restricted modified signed-digit adder;system complexity;optoelectronic shared content-addressable memory processor;sum digits;reference digits;addition operation;operand length;minterms;symbolic substitution"} +{"name": "test_2022", "title": "Two-step integral imaging for orthoscopic three-dimensional imaging with", "abstract": "improved viewing resolution We present a two-step integral imaging system to obtain 3-D orthoscopic real images. 
By adopting a nonstationary micro-optics technique, we demonstrate experimentally the potential usefulness of two-step integral imaging", "fulltext": "", "keywords": "pickup lenslet array;display device;two-step integral imaging system;lclv;liquid crystal light valve;3-d orthoscopic real images;two-step integral imaging;nonstationary micro-optics technique;3-d image reconstruction;improved viewing resolution"} +{"name": "test_2023", "title": "Diffraction limit for a circular mask with a periodic rectangular apertures", "abstract": "array An imaging system based on a mask with periodic apertures is adopted very widely and plays a leading role in modern technology for uses such as pinhole cameras, coded imaging systems, optical information processing, etc., because of its high resolution, its infinite depth of focus, and its usefulness over a broad frequency spectrum ranging from visible light to X-rays and gamma rays. However, the masks with periodic apertures investigated in the literature are limited only to far-field diffraction and do not take the shift of apertures within the mask into consideration. Therefore the derivation of the far-field diffraction for a single aperture cannot be applied to a mask with periodic apertures. Although a far-field diffraction formula modified for a multiaperture mask has been proposed in the past, the analysis remains too complicated to offer practical guidance for mask design. We study a circular mask with periodic rectangular apertures and develop an easier way to interpret it. First, the near-field diffraction intensity of a circular aperture is calculated by means of Lommel's function. Then the convolution of the circular mask diffraction with periodic rectangular apertures is put together, and we can present a simple mathematical tool to analyze the mask properties, including the intensity distribution, blurring aberration, and the criterion for defining far- or near-field diffraction. This concept can also be expanded to analyze different types of masks with arbitrarily shaped apertures", "fulltext": "", "keywords": "multiaperture mask;mask;circular mask;convolution;periodic rectangular apertures;optical information processing;far-field diffraction;single aperture;periodic rectangular apertures array;near-field diffraction;infinite depth of focus;visible light;x rays;circular mask diffraction;coded imaging systems;broad frequency spectra;periodic apertures;far-field diffraction formula;diffraction limit;gamma rays;high resolution;pinhole cameras;arbitrarily shaped apertures"} +{"name": "test_2024", "title": "Binocular model for figure-ground segmentation in translucent and occluding", "abstract": "images A Fourier-based solution to the problem of figure-ground segmentation in short baseline binocular image pairs is presented. Each image is modeled as an additive composite of two component images that exhibit a spatial shift due to the binocular parallax. The segmentation is accomplished by decoupling each Fourier component in one of the resultant additive images into its two constituent phasors, allocating each to its appropriate object-specific spectrum, and then reconstructing the foreground and background using the inverse Fourier transform. It is shown that the foreground and background shifts can be computed from the differences of the magnitudes and phases of the Fourier transform of the binocular image pair.
While the model is based on translucent objects, it also works with occluding objects", "fulltext": "", "keywords": "fourier-based solution;occluding objects;image segmentation;binocular image pair;images;foreground;binocular parallax;object-specific spectrum;component images;phasors;spatial shift;translucent images;short baseline binocular image pairs;inverse fourier transform;fourier component decoupling;occluding images;background;binocular model;figure-ground segmentation;translucent objects"} +{"name": "test_2025", "title": "Multispectral color image capture using a liquid crystal tunable filter", "abstract": "We describe the experimental setup of a multispectral color image acquisition system consisting of a professional monochrome CCD camera and a tunable filter in which the spectral transmittance can be controlled electronically. We perform a spectral characterization of the acquisition system taking into account the acquisition noise. To convert the camera output signals to device-independent color data, two main approaches are proposed and evaluated. One consists in applying regression methods to convert from the K camera outputs to a device-independent color space such as CIEXYZ or CIELAB. Another method is based on a spectral model of the acquisition system. By inverting the model using a principal eigenvector approach, we estimate the spectral reflectance of each pixel of the imaged surface", "fulltext": "", "keywords": "acquisition system;spectral characterization;principal eigenvector approach;camera output signals;spectral transmittance;cielab;spectral reflectance;multispectral color image acquisition system;device-independent color data;camera outputs;monochrome ccd camera;ciexyz;pixel;imaged surface;acquisition noise;regression methods;liquid crystal tunable filter;spectral model;multispectral color image capture;independent color space;tunable filter"} +{"name": "test_2026", "title": "Iterative regularized least-mean mixed-norm image restoration", "abstract": "We develop a regularized mixed-norm image restoration algorithm to deal with various types of noise. A mixed-norm functional is introduced, which combines the least mean square (LMS) and the least mean fourth (LMF) functionals, as well as a smoothing functional. Two regularization parameters are introduced: one to determine the relative importance of the LMS and LMF functionals, which is a function of the kurtosis, and another to determine the relative importance of the smoothing functional. The two parameters are chosen in such a way that the proposed functional is convex, so that a unique minimizer exists. An iterative algorithm is utilized for obtaining the solution, and its convergence is analyzed. The novelty of the proposed algorithm is that no knowledge of the noise distribution is required, and the relative contributions of the LMS, the LMF, and the smoothing functionals are adjusted based on the partially restored image. 
Experimental results demonstrate the effectiveness of the proposed algorithm", "fulltext": "", "keywords": "least mean fourth functionals;least mean square functionals;partially restored image;iterative algorithm;noise distribution;noise;kurtosis;convex functional;mixed-norm functional;regularization parameters;convergence;smoothing functional;unique minimizer;iterative regularized least-mean mixed-norm image restoration"} +{"name": "test_2027", "title": "Motion estimation using modified dynamic programming", "abstract": "A new method for computing precise estimates of the motion vector field of moving objects in a sequence of images is proposed. Correspondence vector-field computation is formulated as a matching optimization problem for multiple dynamic images. The proposed method is a heuristic modification of dynamic programming applied to the 2-D optimization problem. Motion-vector-field estimates using real movie images demonstrate good performance of the algorithm in terms of dynamic motion analysis", "fulltext": "", "keywords": "algorithm;heuristic modification;image sequence;motion vector field estimates;modified dynamic programming;2-d optimization problem;motion estimation;dynamic motion analysis;real movie images;precise estimates;vector-field computation;matching optimization problem;moving objects;dynamic programming;motion vector field;multiple dynamic images"} +{"name": "test_2028", "title": "Centroid detection based on optical correlation", "abstract": "We propose three correlation-based methods to simultaneously detect the centroids of multiple objects in an input scene. The first method is based on the modulus of the moment function, the second method is based on squaring the moment function, and the third method works with a single intensity filter. These methods are invariant to changes in the position, orientation, and scale of the object and result in good noise-smoothing performance. We use spatial light modulators (SLMs) to directly input the image and the filter information for these approaches. We present results showing simulations from different approaches and provide comparisons between optical-correlation- and digital-moment-based methods. Experimental results corresponding to an optical correlator using SLMs for the centroid detection are also presented", "fulltext": "", "keywords": "multiple objects;orientation;spatial light modulators;centroids;position;input scene;moment function squaring;digital-moment-based methods;single intensity filter;optical correlation;correlation-based methods;noise-smoothing performance;scale;optical correlator;centroid detection;moment function modulus"} +{"name": "test_2029", "title": "Block truncation image bit plane coding", "abstract": "Block truncation coding (BTC) is a successful image compression technique due to its simple and fast computation. The bit rate is fixed at 2.0 bits/pixel, and its performance is moderate in terms of compression ratio compared to other compression schemes such as discrete cosine transform (DCT), vector quantization (VQ), wavelet transform coding (WTC), etc. Two kinds of overhead are required for BTC coding: the bit plane and the quantization values. A new technique is presented to reduce the bit plane overhead. Conventional bit plane overhead is 1.0 bits/pixel; we decrease it to 0.734 bits/pixel while maintaining the same decoded quality as absolute moment BTC (AMBTC) does for the "Lena" image.
Compared to other published bit plane coding strategies, the proposed method outperforms all of the existing methods", "fulltext": "", "keywords": "bit rate;performance;image compression technique;image bit plane coding;ambtc;lena image;decoded quality;quantization values;bit plane overhead;absolute moment btc;compression ratio;block truncation coding"} +{"name": "test_203", "title": "Plenoptic image editing", "abstract": "This paper presents a new class of interactive image editing operations designed to maintain consistency between multiple images of a physical 3D scene. The distinguishing feature of these operations is that edits to any one image propagate automatically to all other images as if the (unknown) 3D scene had itself been modified. The modified scene can then be viewed interactively from any other camera viewpoint and under different scene illuminations. The approach is useful first as a power-assist that enables a user to quickly modify many images by editing just a few, and second as a means for constructing and editing image-based scene representations by manipulating a set of photographs. The approach works by extending operations like image painting, scissoring, and morphing so that they alter a scene's plenoptic function in a physically-consistent way, thereby affecting scene appearance from all viewpoints simultaneously. A key element in realizing these operations is a new volumetric decomposition technique for reconstructing a scene's plenoptic function from an incomplete set of camera viewpoints", "fulltext": "", "keywords": "plenoptic image editing;morphing;multiple images;image-based scene representations;volumetric decomposition technique;interactive image editing operations;physical 3d scene;camera viewpoint;plenoptic function;modified scene;scissoring;image painting"} +{"name": "test_2030", "title": "Elimination of zero-order diffraction in digital holography", "abstract": "A simple method to suppress the zero-order diffraction in the reconstructed image of digital holography is presented. In this method, the Laplacian of a detected hologram is used instead of the hologram itself for numerical reconstruction by computing the discrete Fresnel integral. This method can significantly improve the image quality and give better resolution and higher accuracy of the reconstructed image. The main advantages of this method are its simplicity in experimental requirements and convenience in data processing", "fulltext": "", "keywords": "data processing;image processing;zero-order diffraction suppression;reconstructed image;detected hologram;image quality;discrete fresnel integral;numerical image reconstruction;laplacian;image resolution;digital holography;accuracy"} +{"name": "test_2031", "title": "Efficient two-level image thresholding method based on Bayesian formulation and", "abstract": "the maximum entropy principle An efficient method for two-level thresholding is proposed based on the Bayes formula and the maximum entropy principle, in which no assumptions about the image histogram are made. An alternative criterion is derived based on maximizing entropy and used for speeding up the searching algorithm. Five forms of conditional probability distributions-simple, linear, parabola concave, parabola convex, and S-function-are employed and compared to each other for optimal threshold determination. The effect of precision on optimal threshold determination is discussed and a trade-off precision ε = 0.001 is selected experimentally.
Our experiments demonstrate that the proposed method achieves a significant improvement in speed, from 26 to 57 times faster than the exhaustive search method", "fulltext": "", "keywords": "two-level image thresholding method;trade-off precision;image segmentation;bayesian formulation;entropy;parabola concave;image thresholding;parabola convex;maximum entropy principle;optimal threshold determination;searching algorithm;conditional probability distributions;s-function;image histogram"} +{"name": "test_2032", "title": "Adaptive digital watermarking using fuzzy logic techniques", "abstract": "Digital watermarking has been proposed for copyright protection in our digital society. We propose an adaptive digital watermarking scheme based on the human visual system model and a fuzzy logic technique. The fuzzy logic approach is employed to obtain the different strengths and lengths of a watermark from the local characteristics of the image in our proposed scheme. In our experiments, this scheme provides a more robust and imperceptible watermark", "fulltext": "", "keywords": "image processing;local characteristics;copyright protection;human visual system model;adaptive digital watermarking;digital society;imperceptible watermark;robust watermark;fuzzy logic techniques"} +{"name": "test_2033", "title": "Optical encoding of color three-dimensional correlation", "abstract": "Three-dimensional (3D) correlation of color images, considering the color distribution as the third dimension, has been shown to be useful for color pattern recognition tasks. Nevertheless, 3D correlation cannot be directly performed on an optical correlator, which can only process two-dimensional (2D) signals. We propose a method to encode 3D functions onto 2D ones in such a way that the Fourier transform and correlation of these signals, which can be optically performed, encode the 3D Fourier transform and correlation of the 3D signals. The theory for the encoding is given and experimental results obtained in an optical correlator are shown", "fulltext": "", "keywords": "color distribution;3d correlation;color three-dimensional correlation;color pattern recognition tasks;color images;3d fourier transform;3d function encoding;fourier transform;optical encoding;optical correlator"} +{"name": "test_2035", "title": "Search for efficient solutions of multi-criterion problems by target-level", "abstract": "method The target-level method is considered for solving continuous multi-criterion maximization problems. In the first step, the decision-maker specifies a target-level point (the desired criterion values); then in the set of vector evaluations we seek points that are closest to the target point in the Chebyshev metric. The vector evaluations obtained in this way are in general weakly efficient. To identify the efficient evaluations, the second step maximizes the sum of the criteria on the set generated in step 1. We prove the relationship between the evaluations and decisions obtained by the proposed procedure, on the one hand, and the efficient (weakly efficient) evaluations and decisions, on the other hand.
If the Edgeworth-Pareto hull of the set of vector evaluations is convex, the set of efficient vector evaluations can be approximated by the proposed method", "fulltext": "", "keywords": "target-level point;continuous multi-criterion maximization problems;target-level method;chebyshev metric;edgeworth-pareto hull;multi-criterion problems"} +{"name": "test_2036", "title": "Computer processing of data on mental impairments during the acute period of", "abstract": "concussion The article presents results of computer processing of experimental information obtained from patients during the acute period of concussion. A number of computational procedures are described", "fulltext": "", "keywords": "computer processing;acute period of concussion;computational procedures;mental impairments"} +{"name": "test_2037", "title": "Regularization of linear regression problems", "abstract": "The study considers robust estimation of linear regression parameters by the regularization method, the pseudoinverse method, and the Bayesian method allowing for correlations and errors in the data. Regularizing algorithms are constructed and their relationship with pseudoinversion, the Bayesian approach, and BLUE is investigated", "fulltext": "", "keywords": "linear regression parameters;bayesian method;pseudoinversion;bayesian approach;blue;pseudoinverse method;linear regression problems regularization;robust estimation"} +{"name": "test_2038", "title": "Choice from a three-element set: some lessons of the 2000 presidential campaign", "abstract": "in the United States We consider the behavior of four choice rules - plurality voting, approval voting, Borda count, and self-consistent choice - when applied to choose the best option from a three-element set. It is assumed that the two main options are preferred by a large majority of the voters, while the third option gets a very small number of votes and influences the election outcome only when the two main options receive a close number of votes. When used to rate the main options, Borda count and self-consistent choice contain terms that allow both for the \"strength of preferences\" of the voters and the rating of the main candidates by voters who vote for the third option. In this way, it becomes possible to determine more reliably the winner when plurality voting or approval voting produce close results", "fulltext": "", "keywords": "three-element set;borda count;self-consistent choice;approval voting;2000 presidential campaign;plurality voting"} +{"name": "test_2039", "title": "An inverse problem for a model of a hierarchical structure", "abstract": "We consider the inverse problem for the identification of the coefficient in a parabolic equation. The model is applied to describe the functioning of a hierarchical structure; it is also relevant for heat-conduction theory. Unique solvability of the inverse problem is proved", "fulltext": "", "keywords": "unique solvability;heat-conduction theory;parabolic equation;hierarchical structure;inverse problem"} +{"name": "test_204", "title": "Self-calibration from image derivatives", "abstract": "This study investigates the problem of estimating camera calibration parameters from image motion fields induced by a rigidly moving camera with unknown parameters, where the image formation is modeled with a linear pinhole-camera model. The equations obtained show the flow to be separated into a component due to the translation and the calibration parameters and a component due to the rotation and the calibration parameters. 
A set of parameters encoding the latter component is linearly related to the flow, and from these parameters the calibration can be determined. However, as for discrete motion, in general it is not possible to decouple image measurements obtained from only two frames into translational and rotational components. Geometrically, the ambiguity takes the form of a part of the rotational component being parallel to the translational component, and thus the scene can be reconstructed only up to a projective transformation. In general, for full calibration at least four successive image frames are necessary, with the 3D rotation changing between the measurements. The geometric analysis gives rise to a direct self-calibration method that avoids computation of optical flow or point correspondences and uses only normal flow measurements. New constraints on the smoothness of the surfaces in view are formulated to relate structure and motion directly to image derivatives, and on the basis of these constraints the transformation of the viewing geometry between consecutive images is estimated. The calibration parameters are then estimated from the rotational components of several flow fields. As the proposed technique neither requires a special set-up nor needs exact correspondence, it is potentially useful for the calibration of active vision systems which have to acquire knowledge about their intrinsic parameters while they perform other tasks, or as a tool for analyzing image sequences in large video databases", "fulltext": "", "keywords": "direct self-calibration method;image measurements;active vision systems;rotational components;rigidly moving camera;normal flow measurements;linear pinhole-camera model;camera calibration parameters;translational components;calibration parameters;image motion fields;image sequences;depth distortion;point correspondences;large video databases;optical flow;image formation"} +{"name": "test_2040", "title": "Inverse problems for a mathematical model of ion exchange in a compressible ion", "abstract": "exchanger A mathematical model of ion exchange is considered, allowing for ion exchanger compression in the process of ion exchange. Two inverse problems are investigated for this model, unique solvability is proved, and numerical solution methods are proposed. The efficiency of the proposed methods is demonstrated by a numerical experiment", "fulltext": "", "keywords": "unique solvability;numerical solution methods;mathematical model;ion exchanger compression;inverse problems;compressible ion exchanger;ion exchange"} +{"name": "test_2041", "title": "Application of multiprocessor systems for computation of jets", "abstract": "The article describes the implementation of methods for numerical solution of gas-dynamic problems on a wide class of multiprocessor systems, conventionally characterized as "cluster" systems. A standard data-transfer interface - the so-called message passing interface - is used for parallelization of application algorithms among processors. Simulation of jets escaping into a low-pressure region is chosen as a computational example", "fulltext": "", "keywords": "cluster systems;gas-dynamic problems;message passing interface;low-pressure region;computation of jets;multiprocessor systems;data-transfer interface"} +{"name": "test_2042", "title": "Hybrid simulation of space plasmas: models with massless fluid representation", "abstract": "of electrons. IV. Kelvin-Helmholtz instability For pt.III. see Prikl. Mat. Informatika, MAKS Press, no. 4, p.
5-56 (2000). This is a survey of the literature on hybrid simulation of the Kelvin-Helmholtz instability. We start with a brief review of the theory: the simplest model of the instability - a transition layer in the form of a tangential discontinuity; compressibility of the medium; finite size of the velocity shear region; pressure anisotropy. We then describe the electromagnetic hybrid model (ions as particles and electrons as a massless fluid) and the main numerical schemes. We review the studies on two-dimensional and three-dimensional hybrid simulation of the process of particle mixing across the magnetopause shear layer driven by the onset of a Kelvin-Helmholtz instability. The article concludes with a survey of literature on hybrid simulation of the Kelvin-Helmholtz instability in finite-size objects: jets moving across the magnetic field in the middle of the field reversal layer; interaction between a magnetized plasma flow and a cylindrical plasma source with zero magnetic field of its own", "fulltext": "", "keywords": "magnetopause shear layer;space plasmas;field reversal layer;cylindrical plasma source;massless fluid representation;tangential discontinuity;magnetized plasma flow;hybrid simulation;electromagnetic hybrid model;three-dimensional hybrid simulation;pressure anisotropy;kelvin-helmholtz instability;transition layer"} +{"name": "test_2043", "title": "Limits for computational electromagnetics codes imposed by computer", "abstract": "architecture The algorithmic complexity of the innermost loops that determine the complexity of algorithms in computational electromagnetics (CEM) codes is analyzed according to their operation count and the impact of the underlying computer hardware. As memory chips are much slower than arithmetic processors, codes that involve a high data movement compared to the number of arithmetic operations are executed comparatively slowly. Hence, matrix-matrix multiplications are much faster than matrix-vector multiplications. It is seen that, to judge which algorithm executes faster, it is not sufficient to compare only the complexity; the actual performance of the algorithms must also be compared. Implications involve FDTD loops, LU factorizations, and iterative solvers for dense matrices. Run times on two reference platforms, namely an Athlon 900 MHz and an HP PA 8600 processor, verify the findings", "fulltext": "", "keywords": "lu factorizations;operation count;computational electromagnetics codes;cem codes;iterative solvers;computer architecture;dense matrices;innermost loops;data movement;matrix-vector multiplications;computer hardware;algorithmic complexity;fdtd loops;memory chips;matrix-matrix multiplications"} +{"name": "test_2044", "title": "Three-dimensional geometrical optics code for indoor propagation", "abstract": "This paper presents a program, GO 3D, for computing the fields of a transmitter in an indoor environment using geometrical optics. The program uses an "image tree" data structure to construct the images needed to compute all the rays carrying fields above a preset "threshold" value, no matter how many reflections are needed. The paper briefly describes the input file required to define wall construction, the floor plan, the transmitter, and the receiver locations.
A case study consisting of a long corridor with a small room on one side is used to demonstrate the features of the GO 3D program", "fulltext": "", "keywords": "transmitter;ray tracing;receiver locations;3d geometrical optics code;image construction;wall construction;indoor propagation;data visualisation;image tree data structure;floor plan;three-dimensional geometrical optics"} +{"name": "test_2045", "title": "Building a better game through dynamic programming: a Flip analysis", "abstract": "Flip is a solitaire board game produced by craft woodworkers. We analyze Flip and suggest modifications to the rules to make the game more marketable. In addition to being an interesting application of dynamic programming, this case shows the use of operations research in managerial decision making", "fulltext": "", "keywords": "flip analysis;craft woodworkers;operations research;managerial decision making;dynamic programming;solitaire board game"} +{"name": "test_2046", "title": "Designing and delivering a university course - a process (or operations)", "abstract": "management perspective With over 30 years of academic experience in both engineering and management faculties, involving trial and error experimentation in teaching as well as reading relevant literature and observing other instructors in action, the author has accumulated a number of ideas regarding the preparation and delivery of a university course that should be of interest to other instructors. This should be particularly the case for those individuals who have had little or no teaching experience (e.g. those whose graduate education was recently completed at research-oriented institutions providing little guidance with respect to teaching). A particular perspective is used to convey the ideas, namely one of viewing the preparation and delivery of a course as two major processes that should provide outputs or outcomes that are of value to a number of customers, in particular, students", "fulltext": "", "keywords": "research-oriented institutions;university course delivery;management faculties;engineering faculties;academic experience;management perspective"} +{"name": "test_2047", "title": "A generalized PERT/CPM implementation in a spreadsheet", "abstract": "This paper describes the implementation of the traditional PERT/CPM algorithm for finding the critical path in a project network in a spreadsheet. The problem is of importance due to the recent shift of attention to using the spreadsheet environment as a vehicle for delivering management science/operations research (MS/OR) techniques to end-users", "fulltext": "", "keywords": "spreadsheet;generalized pert/cpm implementation;critical path;ms/or techniques"} +{"name": "test_2048", "title": "An object-oriented version of SIMLIB (a simple simulation package)", "abstract": "This paper introduces an object-oriented version of SIMLIB (an easy-to-understand discrete-event simulation package). The object-oriented version is preferable to the original procedural language versions of SIMLIB in that it is easier to understand and teach simulation from an object point of view. A single-server queue simulation is demonstrated using the object-oriented SIMLIB", "fulltext": "", "keywords": "teach simulation;discrete-event simulation;object-oriented version;simlib"} +{"name": "test_2049", "title": "The maximum possible EVPI", "abstract": "In this paper we calculate the maximum expected value of perfect information (EVPI) for any probability distribution for the states of the world.
This maximum EVPI is an upper bound for the EVPI with given probabilities and thus an upper bound for any partial information about the states of the world", "fulltext": "", "keywords": "operations research;probability distribution;decision analysis;optimisation;management science;expected value of perfect information"} +{"name": "test_205", "title": "Geotensity: combining motion and lighting for 3D surface reconstruction", "abstract": "This paper is about automatically reconstructing the full 3D surface of an object observed in motion by a single static camera. Based on the two paradigms, structure from motion and linear intensity subspaces, we introduce the geotensity constraint that governs the relationship between four or more images of a moving object. We show that it is possible in theory to solve for 3D Lambertian surface structure for the case of a single point light source and propose that a solution exists for an arbitrary number of point light sources. The surface may or may not be textured. We then give an example of automatic surface reconstruction of a face under a point light source using arbitrary unknown object motion and a single fixed camera", "fulltext": "", "keywords": "automatic surface reconstruction;linear intensity subspaces;arbitrary number of point light sources;single point light source;single static camera;point light source;3d lambertian surface structure;structure-from-motion;linear image subspaces;full 3d surface;geotensity constraint"} +{"name": "test_2050", "title": "Who Wants To Be A Millionaire(R): The classroom edition", "abstract": "This paper introduces a version of the internationally popular television game show Who Wants To Be A Millionaire(R) that has been created for use in the classroom using Microsoft PowerPoint(R). A suggested framework for its classroom use is presented, instructions on operating and editing the classroom version of Who Wants To Be A Millionaire(R) are provided, and sample feedback from students who have played the classroom version of Who Wants To Be A Millionaire(R) is offered", "fulltext": "", "keywords": "who wants to be a millionaire(r);student contestants;classroom version;classroom;undergraduate business students"} +{"name": "test_2051", "title": "Who wants to see a $million error?", "abstract": "Inspired by the popular television show "Who Wants to Be a Millionaire?", this case discusses the monetary decisions contestants face on a game consisting of 15 increasingly difficult multiple choice questions. Since the game continues as long as a contestant answers correctly, this case, at its core, is one of sequential decision analysis, amenable to analysis via stochastic dynamic programming. The case is also suitable for a course dealing with single decision analysis, allowing for discussion of utility theory and Bayesian probability revision. In developing a story line for the case, the author has sprinkled in much background material on probability and statistics.
This material is placed in a historical context, illuminating some of the influential scholars involved in the development of these subjects as well as the birth of operations research and the management sciences", "fulltext": "", "keywords": "game theory;operations research;stochastic dynamic programming;decision analysis;statistics;probabilistic models;educational course"} +{"name": "test_2052", "title": "Blitzograms - interactive histograms", "abstract": "As computers become ever faster, more and more procedures that were once viewed as iterative will continue to become instantaneous. The blitzogram is the application of this trend to histograms, which the author hopes will lead to a better tacit understanding of probability distributions among both students and managers. And this is not just an academic exercise. Commercial Monte Carlo simulation packages like @RISK and Crystal Ball, and my INSIGHT.xla are widely available", "fulltext": "", "keywords": "histograms;blitzogram;operations research;mba;probability distributions;statistics;management science"} +{"name": "test_2053", "title": "Teaching management science with spreadsheets: From decision models to decision", "abstract": "support The 1990s were a decade of enormous change for management science (MS) educators. While the outlook at the beginning of the decade was somewhat bleak, the renaissance in MS education brought about by the use of spreadsheets as the primary delivery vehicle for quantitative modeling techniques has resulted in a much brighter future. This paper takes inventory of the current state of MS education and suggests some promising new directions in the area of decision support systems for MS educators to consider for the future", "fulltext": "", "keywords": "ms education;decision support systems;management science;spreadsheets;quantitative modeling"} +{"name": "test_2054", "title": "Teaching modeling in management science", "abstract": "This essay discusses how we can most effectively teach Management Science to students in MBA or similar programs who will be, at best, part-time practitioners of these arts. I take as a working hypothesis the radical proposition that the heart of Management Science itself is not the impressive array of tools that have been built up over the years (optimization, simulation, decision analysis, queuing, and so on) but rather the art of reasoning logically with formal models. I believe it is necessary with this group of students to teach basic modeling skills, and in fact it is only when such students have these basic skills as a foundation that they are prepared to acquire the more sophisticated skills needed to employ Management Science. In this paper I present a hierarchy of modeling skills, from numeracy skills through sophisticated Management Science skills, as a framework within which to plan courses for the occasional practitioner", "fulltext": "", "keywords": "formal models;decision analysis;numeracy skills;management science;modeling"} +{"name": "test_2055", "title": "Causes of the decline of the business school management science course", "abstract": "The business school management science course is suffering serious decline. The traditional model- and algorithm-based course fails to meet the needs of MBA programs and students. Poor student mathematical preparation is a reality, and is not an acceptable justification for poor teaching outcomes. 
Management science Ph.D.s are often poorly prepared to teach in a general management program, having more experience and interest in algorithms than management. The management science profession as a whole has focused its attention on algorithms and a narrow subset of management problems for which they are most applicable. In contrast, MBAs rarely encounter problems that are suitable for straightforward application of management science tools, living instead in a world where problems are ill-defined, data is scarce, time is short, politics is dominant, and rational "decision makers" are non-existent. The root cause of the profession's failure to address these issues seems to be (in Russell Ackoff's words) a habit of professional introversion that caused the profession to be uninterested in what MBAs really do on the job and how management science can help them", "fulltext": "", "keywords": "mba programs;profession;business school management science course;management science;mba students"} +{"name": "test_2056", "title": "Gifts to a science academic librarian", "abstract": "Gifts, by their altruistic nature, perfectly fit into the environment of universities and academic libraries. As a university's community and general public continue to donate materials, libraries accept donations willingly, both in-kind and monetary. Eight steps of gift processing are listed in the paper. Positive and negative aspects of gift acceptance are discussed. Gifts bring value for academic libraries. Gifts can be considered additional routes to contribute to library collections without direct purchases, options to add money to the library budget, and the cement of social relationships. But, unfortunately, large donations are time-consuming, labor-intensive and costly to process. Great amounts of staff time and processing space are two main negative aspects that cause concern and put the value of gift acceptance under consideration by librarians. Some strategies in handling gifts are recommended. To be effective, academic science librarians need to approach gifts as an investment. Librarians are not to be forced by moral and public notions and should be able to make professional decisions in evaluating proposed collections", "fulltext": "", "keywords": "library collections;science academic librarian;donations;research libraries;budget;gift processing;acquisitions;professional decisions;gift books;staff time;academic libraries"} +{"name": "test_2057", "title": "Four factors influencing the fair market value of out-of-print books. 2", "abstract": "For pt.1 see ibid., p.71-8 (2002). Data from the fifty-six titles examined qualitatively in the Patterson study are examined quantitatively. In addition to the four factors of edition, condition, dust jacket, and autograph that were hypothesized to influence the value of a book, four other factors for which information was available in the data were examined", "fulltext": "", "keywords": "fair market value;economics;out-of-print books;quantitative analysis;pricing;publisher"} +{"name": "test_2058", "title": "Four factors influencing the fair market value of out-of-print books.1", "abstract": "Four factors (edition, condition, dust jacket, and autograph) that are hypothesized to influence the value of books are identified and linked to basic economic principles, which are explained.
A sample of fifty-six titles is qualitatively examined to test the hypothesis", "fulltext": "", "keywords": "out-of-print books;economic principles;pricing;fair market value"} +{"name": "test_2059", "title": "Acquiring materials in the history of science, technology, and medicine", "abstract": "This article provides detailed advice on acquiring new, out-of-print, and rare materials in the history of science, technology, and medicine for the beginner in these fields. The focus is on the policy formation, basic reference tools, and methods of collection development and acquisitions that are the necessary basis for success in this endeavor", "fulltext": "", "keywords": "medicine;policy formation;science;out-of-print books;technology;library acquisitions;special collections;collection development;basic reference tools;rare materials"} +{"name": "test_206", "title": "Information architecture: looking ahead", "abstract": "It may be a bit strange to consider where the field of information architecture (IA) is headed. After all, many would argue that it's too new to be considered as a field at all, or that it is mislabeled, and by no means is there a widely accepted definition of what information architecture actually is. Practicing information architects probably number in the thousands, and this vibrant group is already building various forms of communal infrastructure, ranging from an IA journal and a self-organizing \"library\" of resources to a passel of local professional groups and degree-granting academic programs. So the profession has achieved a beachhead that will enable it to stabilize and perhaps even grow during these difficult times", "fulltext": "", "keywords": "information architecture;communal infrastructure;degree-granting academic programs;information architects;local professional groups"} +{"name": "test_2060", "title": "Decisions, decisions, decisions: a tale of special collections in the small", "abstract": "academic library A case study of a special collections department in a small academic library and how its collections have been acquired and developed over the years is described. It looks at the changes that have occurred in the academic environment and what effect, if any, these changes may have had on the department and how it has adapted to them. It raises questions about development and acquisitions policies and procedures", "fulltext": "", "keywords": "case study;acquisitions policies;out-of-print books;university library;special collections;small academic library"} +{"name": "test_2061", "title": "Acquisitions in the James Ford Bell Library", "abstract": "This article presents basic acquisitions philosophy and approaches in a noted special collection, with commentary on \"just saying no\" and on how the electronic revolution has changed the acquisition of special collections materials", "fulltext": "", "keywords": "out-of-print books;james ford bell library;university library;special collections;electronic revolution;library acquisitions philosophy"} +{"name": "test_2062", "title": "Underground poetry, collecting poetry, and the librarian", "abstract": "A powerful encounter with underground poetry and its important role in poetry, literature, and culture is discussed. The acquisitions difficulties encountered in the unique publishing world of underground poetry are introduced. 
Strategies for acquiring underground poetry for library collections are proposed, including total immersion and local focus, with accompanying action", "fulltext": "", "keywords": "library collections;literature;culture;underground poetry;librarian;out-of-print books;publishing;special collections"} +{"name": "test_2063", "title": "On emotion and bounded rationality: reply to Hanoch", "abstract": "The author refers to the comment made by Hanoch (see ibid. vol.49 (2000)) on his model of bounded rationality and the role of the Yerkes-Dodson law and emotional arousal in it. The author points out that Hanoch's comment, however, conspicuously fails to challenge - much less contradict - the central hypothesis of his paper. In addition, several of Hanoch's criticisms are based on a wrong characterization of the positions", "fulltext": "", "keywords": "bounded rationality;psychology;yerkes-dodson law;emotion;decision-making"} +{"name": "test_2064", "title": "The effects of emotions on bounded rationality: a comment on Kaufman", "abstract": "The objective of Bruce Kaufman's article (1999), "Emotional arousal as a source of bounded rationality", is to present an additional source of bounded rationality, one that is due not to cognitive constraints but to high emotional arousal. In doing so, Kaufman is following a long tradition of thinkers who have contrasted emotion with reason, claiming, for the most part, that emotions are a violent force hindering rational thinking. This paper aims to challenge Kaufman's unidimensional idea regarding the connection between high emotional arousal and decision making", "fulltext": "", "keywords": "decision making;rational thinking;bounded rationality;psychology;yerkes-dodson law;emotion"} +{"name": "test_2065", "title": "Emotion and self-control", "abstract": "A biology-based model of choice is used to examine time-inconsistent preferences and the problem of self-control. Emotion is shown to be the biological substrate of choice, in that emotional systems assign value to 'goods' in the environment and also facilitate the learning of expectations regarding alternative options for acquiring those goods. A third major function of the emotional choice systems is motivation. Self-control is shown to be the result of a problem with the inhibition of the motive force of emotion, where this inhibition is necessary for higher level deliberation", "fulltext": "", "keywords": "learning;time-inconsistent preferences;emotional choice systems;emotion;choice model;inhibition;self-control"} +{"name": "test_2066", "title": "Product and process innovations in the life cycle of an industry", "abstract": "Filson (2001) uses industry-level data on firm numbers, price, quantity and quality along with an equilibrium model of industry evolution to estimate the nature and effects of quality and cost improvements in the personal computer industry and four other new industries. This paper studies the personal computer industry in more detail and shows that the model explains some peculiar patterns that cannot be explained by previous life-cycle models.
The model estimates are evaluated using historical studies of the evolution of the personal computer industry, and patterns that require further model development are described", "fulltext": "", "keywords": "equilibrium model;industry evolution;microelectronics;technological change;industry dynamics;life-cycle models;production cost;pc industry;personal computer market"} +{"name": "test_2067", "title": "A comparison of the discounted utility model and hyperbolic discounting models", "abstract": "in the case of social and private intertemporal preferences for health Whilst there is substantial evidence that hyperbolic discounting models describe intertemporal preferences for monetary outcomes better than the discounted utility (DU) model, there is only very limited evidence in the context of health outcomes. This study elicits private and social intertemporal preferences for non-fatal changes in health. Specific functional forms of the DU model and three hyperbolic models are fitted. The results show that the stationarity axiom is violated, and that the hyperbolic models fit the data better than the DU model. Intertemporal preferences for private and social decisions are found to be very similar", "fulltext": "", "keywords": "social decisions;intertemporal preferences;health outcomes;discounted utility model;hyperbolic discounting models;private decisions"} +{"name": "test_2068", "title": "Modeling the labor market as an evolving institution: model ARTEMIS", "abstract": "A stylized French labor market is modeled as an endogenously evolving institution. Boundedly rational firms and individuals strive to decrease the cost or increase utility. The labor market is coordinated by a search process and decentralized setting of hiring standards, but intermediaries can speed up matching. The model reproduces the dynamics of the gross flows and spectacular changes in mobility patterns of some demographic groups when the oil crisis in the 1970s occurred, notably the sudden decline of integration into good jobs. The internal labor markets of large firms are shown to increase unemployment if the secondary (temporary or bad) jobs do not exist", "fulltext": "", "keywords": "endogenously evolving institution;simulation model;mobility patterns;endogenous intermediary;spectacular changes;french labor market;artemis model;jobs;demographic groups"} +{"name": "test_2069", "title": "The ultimate control group", "abstract": "Empirical research on the organization of firms requires that firms be classified on the basis of their control structures. This should be done in a way that can potentially be made operational. It is easy to identify the ultimate controller of a hierarchical organization, and the literature has largely focused on this case. However, many organizational structures mix hierarchy with collective choice procedures such as voting, or use circular structures under which superiors are accountable to their subordinates. The author develops some analytic machinery that can be used to map the authority structures of such organizations, and shows that under mild restrictions there is a well-defined ultimate control group.
The results are consistent with intuitions about the nature of control in familiar economic settings", "fulltext": "", "keywords": "control rights;ultimate control group;authority structures;firm organization;committees;organizational structures;hierarchical organization"} +{"name": "test_207", "title": "Information architecture for bilingual Web sites", "abstract": "Creating an information architecture for a bilingual Web site presents particular challenges beyond those that exist for single and multilanguage sites. This article reports work in progress on the development of a content-based bilingual Web site to facilitate the sharing of resources and information between Speech and Language Therapists. The development of the information architecture is based on a combination of two aspects: an abstract structural analysis of existing bilingual Web designs focusing on the presentation of bilingual material, and a bilingual card-sorting activity conducted with potential users. Issues for bilingual developments are discussed, and some observations are made regarding the use of card-sorting activities", "fulltext": "", "keywords": "speech therapists;information architecture;content-based bilingual web site;bilingual developments;bilingual card-sorting activity;language therapists;world wide web"} +{"name": "test_2070", "title": "Modularity in technology and organization", "abstract": "The paper is an attempt to raid both the literature on modular design and the literature on property rights to create the outlines of a modularity theory of the firm. Such a theory will look at firms, and other organizations, in terms of the partitioning of rights-understood as protected spheres of authority-among cooperating parties. It will assert that organizations reflect nonmodular structures, that is, structures in which decision rights, rights of alienation, and residual claims to income do not all reside in the same hands", "fulltext": "", "keywords": "modularity;organization;decision rights;authority;technology;property rights;nonmodular structures;transaction costs;cooperating parties;rights of alienation;partitioning of rights"} +{"name": "test_208", "title": "Designing a new urban Internet", "abstract": "The parallel between designing a Web site and the construction of a building is a familiar one, but how often do we think of the Internet as having parks and streets? It would be absurd to say that the Internet could ever take the place of real, livable communities; however, it is safe to say that the context for using the Internet is on a path of change. As the Internet evolves beyond a simple linkage of disparate Web sites and applications, the challenge for Information Architects is establishing a process by which to structure, organize, and design networked environments. The principles that guide New Urbanism can offer much insight into networked electronic environment design. 
At the core of every New Urbanism principle is the idea of \"wholeness\"-of making sure that neighborhoods and communities are knit together in a way that supports civic activities, economic development, efficient ecosystems, aesthetic beauty, and human interaction", "fulltext": "", "keywords": "internet;networked environments;networked electronic environment design;private-public sector cooperation;global information networks;web site;information architects;communities"} +{"name": "test_2081", "title": "Three-dimensional optimum design of the cooling lines of injection moulds based", "abstract": "on boundary element design sensitivity analysis A three-dimensional numerical simulation using the boundary element method is proposed, which can predict the cavity temperature distributions in the cooling stage of injection moulding. Then, choosing the radii and positions of cooling lines as design variables, the boundary integral sensitivity formulations are deduced. For the optimum design of cooling lines, the squared difference between the objective temperature and temperature of the cavity is taken as the objective function. Based on the optimization techniques with design sensitivity analysis, an iterative algorithm to reach the minimum value of the objective function is introduced, which leads to the optimum design of cooling lines at the same time", "fulltext": "", "keywords": "cavity temperature distributions;iterative algorithm;boundary integral sensitivity analysis;objective function;optimization;heat conduction;injection moulding;3d numerical simulation;cooling stage;boundary element method"} +{"name": "test_2082", "title": "Managing safety and strategic stocks to improve materials requirements planning", "abstract": "performance This paper provides a methodology for managing safety and strategic stocks in materials requirements planning (MRP) environments to face uncertainty in market demand. A set of recommended guidelines suggests where to position, how to dimension and when to replenish both safety and strategic stocks. Trade-offs between stock positioning and dimensioning and between stock positioning and replenishment order triggering are outlined. The study also reveals that most of the decisions are system specific, so that they should be evaluated in a quantitative manner through simulation. A case study is reported, where the benefits from adopting the new proposed methodology lie in achieving the target service level even under peak demand conditions, with the value of safety stocks as a whole growing only by about 20 per cent", "fulltext": "", "keywords": "mrp;safety stocks;service level;peak demand;strategic stocks;inventory management;market demand;stock replenishment;variance control;materials requirements planning"} +{"name": "test_2083", "title": "Innovative manufacture of impulse turbine blades for wave energy power", "abstract": "conversion An innovative approach to the manufacture of impulse turbine blades using rapid prototyping, fused deposition modelling (FDM), is presented. These blades were designed and manufactured by the Wave Energy Research Team (WERT) at the University of Limerick for the experimental analysis of a 0.6 m impulse turbine with fixed guide vanes for wave energy power conversion. The computer aided design/manufacture (CAD/CAM) package Pro-Engineer 2000i was used for three-dimensional solid modelling of the individual blades. A detailed finite element analysis of the blades under centrifugal loads was performed using Pro-Mechanica. 
Based on this analysis and FDM machine capabilities, blades were redesigned. Finally, Pro-E data were transferred to an FDM machine for the manufacture of turbine blades. The objective of this paper is to present the innovative method used to design, modify and manufacture blades in a time- and cost-effective manner using a concurrent engineering approach", "fulltext": "", "keywords": "solid modelling;impulse turbine blades;fused deposition modelling;cad/cam;finite element analysis;manufacturing;concurrent engineering;rapid prototyping;wave energy power conversion;university of limerick"} +{"name": "test_2084", "title": "Evaluation of combined dispatching and routeing strategies for a flexible", "abstract": "manufacturing system This paper deals with the evaluation of combined dispatching and routeing strategies on the performance of a flexible manufacturing system. Three routeing policies - no alternative routings, alternative routeing dynamics and alternative routeing plans - are considered with four dispatching rules with finite buffer capacity. In addition, the effect of changing part mix ratios is also discussed. The performance measures considered are makespan, average machine utilization, average flow time and average delay at local input buffers. Simulation results indicate that the alternative routings dynamic policy gives the best results in three performance measures except for average delay at local input buffers. Further, the effect of changing part mix ratios is not significant", "fulltext": "", "keywords": "dispatching rules;average flow time;alternative routings;fms;part mix ratios;finite buffer capacity;flexible manufacturing system"} +{"name": "test_2085", "title": "An intelligent fuzzy decision system for a flexible manufacturing system with", "abstract": "multi-decision points This paper describes an intelligent fuzzy decision support system for real-time scheduling and dispatching of parts in a flexible manufacturing system (FMS), with alternative routing possibilities for all parts. A fuzzy logic approach is developed to improve the system performance by considering multiple performance measures and at multiple decision points. The characteristics of the system status, instead of parts, are fed back to assign priority to the parts waiting to be processed. A simulation model is developed and it is shown that the proposed intelligent fuzzy decision support system keeps all performance measures at a good level. The proposed intelligent system is a promising tool for dealing with scheduling FMSs, in contrast to traditional rules", "fulltext": "", "keywords": "intelligent decision support system;fuzzy logic;simulation;fms;scheduling;multiple decision points;real-time system;flexible manufacturing system"} +{"name": "test_2086", "title": "A design to cost system for innovative product development", "abstract": "Presents a prototype object-oriented and rule-based system for product cost modelling and design for automation at an early design stage. The developed system comprises a computer aided design (CAD) solid modelling system, a material selection module, a knowledge-based system (KBS), a process optimization module, a design for assembly module, a cost estimation module and a user interface. Two manufacturing processes, namely machining and injection moulding processes, were considered in the developed system. 
The main function of the system, besides estimating the product cost, is to generate initial process planning, including the generation and selection of machining processes, their sequence and their machining parameters, and to recommend the most economical assembly technique for a product and provide design improvement suggestions based on a design feasibility technique. In addition, a feature-by-feature cost estimation report is generated using the proposed system to highlight the features of high manufacturing cost. Two case studies were used to validate the developed system", "fulltext": "", "keywords": "innovative product development;material selection module;product cost modelling;process planning;computer aided design solid modelling system;knowledge-based system;process optimization module;design for automation;feature-by-feature cost estimation report;design for assembly module;object-oriented programming;injection moulding;concurrent engineering;user interface;design to cost system;object-oriented rule-based system;machining;fuzzy logic;cost estimation module"} +{"name": "test_2087", "title": "Re-examining the machining frictional boundary conditions using fractals", "abstract": "Presents experimental evidence for the existence of non-Euclidean contact geometry at the tool-chip interface in the machining of aluminium alloy, which challenges conventional assumptions. The geometry of contact at the tool rake face is modelled using fractals and a dimension is computed for its description. The variation in the fractal dimension with the cutting speed is explored", "fulltext": "", "keywords": "contact geometry;machining frictional boundary conditions;fractals;tool rake face;al;tool-chip interface;noneuclidean contact geometry;cutting speed;aluminium alloy"} +{"name": "test_2088", "title": "Layer-based machining: recent development and support structure design", "abstract": "There is growing interest in additive and subtractive shaping theories that are synthesized to integrate the layered manufacturing process and material removal process. Layer-based machining has emerged as a promising method for integrated additive and subtractive shaping theory. In the paper, major layer-based machining systems are reviewed and compared according to characteristics of stock layers, numerical control machining configurations, stacking operations, input format and raw materials. Support structure, a major issue in machining-based systems which has seldom been addressed in previous research, is investigated in the paper with considerations of four situations: floating overhang, cantilever, vaulted overhang and ceiling. Except for the floating overhang where a support structure should not be overlooked, the necessity for support structures for the other three situations is determined by stress and deflection analysis. 
This is demonstrated by the machining of a large castle model", "fulltext": "", "keywords": "ceiling;numerical control machining configurations;subtractive shaping theories;additive shaping theories;stacking operations;material removal process;cantilever;vaulted overhang;stock layers;raw materials;support structure design;layer-based machining;stress;input format;deflection analysis;floating overhang;layered manufacturing process"} +{"name": "test_2089", "title": "World's biggest battery helps to stabilise Alaska", "abstract": "In this paper, the author describes a battery energy storage system which is under construction to provide voltage compensation in support of Alaska's 138 kV Northern Intertie", "fulltext": "", "keywords": "77 mw;power system stabilisation;voltage compensation;battery energy storage system;138 kv;interconnected power systems;usa"} +{"name": "test_209", "title": "Information interaction: providing a framework for information architecture", "abstract": "Information interaction is the process that people use in interacting with the content of an information system. Information architecture is a blueprint and navigational aid to the content of information-rich systems. As such, information architecture performs an important supporting role in information interactivity. This article elaborates on a model of information interactivity that crosses the \"no-man's land\" between user and computer articulating a model that includes user, content and system, illustrating the context for information architecture", "fulltext": "", "keywords": "information interactivity;information-rich systems;navigational aid;information interaction"} +{"name": "test_2090", "title": "All-optical logic NOR gate using two-cascaded semiconductor optical amplifiers", "abstract": "The authors present a novel all-optical logic NOR gate using two-cascaded semiconductor optical amplifiers (SOAs) in a counterpropagating feedback configuration. This configuration accentuates the gain nonlinearity due to the mutual gain modulation of the two SOAs. The all-optical NOR gate feasibility has been demonstrated delivering an extinction ratio higher than 12 dB over a wide range of wavelengths", "fulltext": "", "keywords": "counterpropagating feedback configuration;extinction ratio;two-cascaded semiconductor optical amplifiers;all-optical logic nor gate;gain nonlinearity;wide wavelength range;mutual gain modulation;soa"} +{"name": "test_2091", "title": "Prospects for quantitative computed tomography imaging in the presence of", "abstract": "foreign metal bodies using statistical image reconstruction X-ray computed tomography (CT) images of patients bearing metal intracavitary applicators or other metal foreign objects exhibit severe artifacts including streaks and aliasing. We have systematically evaluated via computer simulations the impact of scattered radiation, the polyenergetic spectrum, and measurement noise on the performance of three reconstruction algorithms: conventional filtered backprojection (FBP), deterministic iterative deblurring, and a new iterative algorithm, alternating minimization (AM), based on a CT detector model that includes noise, scatter, and polyenergetic spectra. Contrary to the dominant view of the literature, FBP streaking artifacts are due mostly to mismatches between FBP's simplified model of CT detector response and the physical process of signal acquisition. Artifacts on AM images are significantly mitigated as this algorithm substantially reduces detector-model mismatches. 
However, metal artifacts are reduced to acceptable levels only when prior knowledge of the metal object in the patient, including its pose, shape, and attenuation map, is used to constrain AM's iterations. AM image reconstruction, in combination with object-constrained CT to estimate the pose of metal objects in the patient, is a promising approach for effectively mitigating metal artifacts and making quantitative estimation of tissue attenuation coefficients a clinical possibility", "fulltext": "", "keywords": "statistical image reconstruction;iterative algorithm;ct detector model;noise;foreign metal bodies;quantitative computed tomography imaging;alternating minimization;polyenergetic spectra;clinical possibility;scatter;medical diagnostic imaging;deterministic iterative deblurring;filtered backprojection;signal acquisition physical process;brachytherapy;metal artifact reduction;object-constrained ct"} +{"name": "test_2092", "title": "Matching PET and CT scans of the head and neck area: Development of method and", "abstract": "validation Positron emission tomography (PET) provides important information on tumor biology, but lacks detailed anatomical information. Our aim in the present study was to develop and validate an automatic registration method for matching PET and CT scans of the head and neck. Three difficulties in achieving this goal are (1) nonrigid motions of the neck can hamper the use of automatic rigid body transformations; (2) emission scans contain too little anatomical information to apply standard image fusion methods; and (3) no objective way exists to quantify the quality of the match results. These problems are solved as follows: accurate and reproducible positioning of the patient was achieved by using a radiotherapy treatment mask. The proposed method makes use of the transmission rather than the emission scan. To obtain sufficient (anatomical) information for matching, two bed positions for the transmission scan were included in the protocol. A mutual information-based algorithm was used as a registration technique. PET and CT data were obtained in seven patients. Each patient had two CT scans and one PET scan. The datasets were used to estimate the consistency by matching PET to CT_1, CT_1 to CT_2, and CT_2 to PET using the full circle consistency test. It was found that, using our method, a consistency of 4 mm and 1.3 degrees on average could be obtained. The PET voxels used for registration were 5.15 mm, so the errors compared quite favorably with the voxel size. Cropping the images (removing the scanner bed from images) did not improve the consistency of the algorithm. The transmission scan, however, could potentially be reduced to a single position using this approach. 
In conclusion, the presented algorithm and validation technique have several features that are attractive from both theoretical and practical points of view: the method is a user-independent, automatic validation technique for matching CT and PET scans of the head and neck, which gives the opportunity to compare different image enhancements", "fulltext": "", "keywords": "automatic registration method;mutual information-based algorithm;positron emission tomography scans;user-independent automatic validation technique;standard image fusion methods;radiotherapy treatment mask;errors;automatic rigid body transformations;head;anatomical information;registration technique;tumor biology;full circle consistency test;image enhancements;transmission scan;bed positions;neck;scanner bed;computerised tomography scans;patients;nonrigid motions"} +{"name": "test_2093", "title": "Fresh tracks [food processing]", "abstract": "Bar code labels and wireless terminals linked to a centralized database accurately track meat products from receiving to customers for Farmland Foods", "fulltext": "", "keywords": "farmland foods;food processing;bar code labels;intermec technologies;automatic data capture;wireless terminals"} +{"name": "test_2094", "title": "Statistical inference with partial prior information based on a Gauss-type", "abstract": "inequality Potter and Anderson (1983) have developed a Bayesian decision procedure requiring the specification of a class of prior distributions restricted to have a minimal probability content for a given subset of the parameter space. They do not, however, provide a method for the selection of that subset. We show how a generalization of Gauss' inequality can be used to determine the relevant parameter subset", "fulltext": "", "keywords": "bayesian decision procedure;prior-to-posterior sensitivity;gauss inequality;minimal probability content;prior distributions;partial prior information;parameter space"} +{"name": "test_2095", "title": "Global stability of the attracting set of an enzyme-catalysed reaction system", "abstract": "The essential feature of enzymatic reactions is a nonlinear dependency of reaction rate on metabolite concentration taking the form of saturation kinetics. Recently, it has been shown that this feature is associated with the phenomenon of \"loss of system coordination\" (Liu, 1999). In this paper, we study a system of ordinary differential equations representing a branched biochemical system of enzyme-mediated reactions. We show that this system can become very sensitive to changes in certain maximum enzyme activities. In particular, we show that the system exhibits three distinct responses: a unique, globally-stable steady-state, large amplitude oscillations, and asymptotically unbounded solutions, with the transition between these states being almost instantaneous. It is shown that the appearance of large amplitude, stable limit cycles occurs due to a \"false\" bifurcation or canard explosion. The subsequent disappearance of limit cycles corresponds to the collapse of the domain of attraction of the attracting set for the system and occurs due to a global bifurcation in the flow, namely, a saddle connection. Subsequently, almost all nonnegative data become unbounded under the action of the dynamical system and correspond exactly to loss of system coordination. 
We discuss the relevance of these results to the possible consequences of modulating such systems", "fulltext": "", "keywords": "saturation kinetics;stable limit cycles;bifurcation;biochemical system;nonlinear dependency;saddle connection;ordinary differential equations;metabolite concentration;enzyme-mediated reactions;enzymatic reactions"} +{"name": "test_2096", "title": "A spatial rainfall simulator for crop production modeling in Southern Africa", "abstract": "This paper describes a methodology for simulating rainfall in dekads across a set of spatial units in areas where long-term meteorological records are available for a small number of sites only. The work forms part of a larger simulation model of the food system in a district of Zimbabwe, which includes a crop production component for yields of maize, small grains and groundnuts. Only a limited number of meteorological stations are available within or surrounding the district that have long time series of rainfall records. Preliminary analysis of rainfall data for these stations suggested that intra-seasonal temporal correlation was negligible, but that rainfall at any given station was correlated with rainfall at neighbouring stations. This spatial correlation structure can be modeled using a multivariate normal distribution consisting of 30 related variables, representing dekadly rainfall in each of the 30 wards. For each ward, log-transformed rainfall for each of the 36 dekads in the year was characterized by a mean and standard deviation, which were interpolated from surrounding meteorological stations. A covariance matrix derived from a distance measure was then used to represent the spatial correlation between wards. Sets of random numbers were then drawn from this distribution to simulate rainfall across the wards in any given dekad. Cross-validation of estimated rainfall parameters against observed parameters for the one meteorological station within the district suggests that the interpolation process works well. The methodology developed is useful in situations where long-term climatic records are scarce and where rainfall shows pronounced spatial correlation, but negligible temporal correlation", "fulltext": "", "keywords": "covariance matrix;southern africa;parameter estimation;rainfall data;rainfall records;crop production modeling;spatial correlation;simulating rainfall;multivariate normal distribution;zimbabwe"} +{"name": "test_2097", "title": "An algorithm to generate all spanning trees with flow", "abstract": "Spanning tree enumeration in undirected graphs is an important issue and task in many problems encountered in computer network and circuit analysis. This paper discusses the spanning tree with flow for the case that there are flow requirements between each node pair. An algorithm based on minimal paths (MPs) is proposed to generate all spanning trees without flow. The proposed algorithm is a structured approach, which splits the system into structural MPs first, and also all steps in it are easy to follow", "fulltext": "", "keywords": "undirected graphs;minimal paths;spanning trees;computer network analysis;circuit analysis"} +{"name": "test_2098", "title": "Nonlinear systems arising from nonisothermal, non-Newtonian Hele-Shaw flows in", "abstract": "the presence of body forces and sources In this paper, we first give a formal derivation of several systems of equations for injection moulding. This is done starting from the basic equations for nonisothermal, non-Newtonian flows in a three-dimensional domain. 
We derive systems for both (T^0, p^0) and (T^1, p^1) in the presence of body forces and sources. We find that body forces and sources have a nonlinear effect on the systems. We also derive a nonlinear \"Darcy law\". Our formulation includes not only the pressure gradient, but also body forces and sources, which play the role of a nonlinearity. Later, we prove the existence of weak solutions to certain boundary value problems and initial-boundary value problems associated with the resulting equations for (T^0, p^0) but in a more general mathematical setting", "fulltext": "", "keywords": "hele-shaw flows;darcy law;injection moulding;sources;boundary value problems;nonlinear systems;body forces"} +{"name": "test_2099", "title": "Analyzing the potential of a firm: an operations research approach", "abstract": "An approach to analyzing the potential of a firm, which is understood as the firm's ability to provide goods and/or services to be supplied to a marketplace under restrictions imposed by a business environment in which the firm functions, is proposed. The approach is based on using linear inequalities and, generally, mixed variables in modelling this ability for a broad spectrum of industrial, transportation, agricultural, and other types of firms and allows one to formulate problems of analyzing the potential of a firm as linear programming problems or mixed programming problems with linear constraints. This approach generalizes a previous one which was proposed for a narrower class of models, and allows one to effectively employ widely available software for solving practical problems of the considered kind, especially for firms described by large scale models of mathematical programming", "fulltext": "", "keywords": "linear programming;large-scale models;or;operations research;agricultural firms;firm potential analysis;linear inequalities;mixed programming;mathematical programming;industrial firms;transportation firms"} +{"name": "test_21", "title": "Discrete output feedback sliding mode control of second order systems - a moving switching line approach", "abstract": "The sliding mode control systems (SMCS) for which the switching variable is designed independent of the initial conditions are known to be sensitive to parameter variations and extraneous disturbances during the reaching phase. For second order systems this drawback is eliminated by using the moving switching line technique where the switching line is initially designed to pass the initial conditions and is subsequently moved towards a predetermined switching line. In this paper, we make use of the above idea of moving switching line together with the reaching law approach to design a discrete output feedback sliding mode control. The main contributions of this work are that system states are not required, since only the output samples are used for designing the controller, and that by using the moving switching line a low-sensitivity system is obtained through shortening the reaching phase. 
Simulation results show that the fast output sampling feedback guarantees sliding motion similar to that obtained using state feedback", "fulltext": "", "keywords": "discrete output feedback;fast output sampling feedback;state feedback;moving switching line;switching variable;sliding mode control;parameter variations"} +{"name": "test_210", "title": "When a better interface and easy navigation aren't enough: examining the", "abstract": "information architecture in a law enforcement agency An information architecture that allows users to easily navigate through a system and quickly recover from mistakes is often defined as a highly usable system. But usability in systems design goes beyond a good interface and efficient navigation. In this article we describe two database systems in a law enforcement agency. One system is a legacy, text-based system with cumbersome navigation (RMS); the newer system is a graphical user interface with simplified navigation (CopNet). It is hypothesized that law enforcement users will evaluate CopNet higher than RMS, but experts of the older system will evaluate it higher than others will. We conducted two user studies. One study examined what users thought of RMS and CopNet, and compared RMS experts' evaluations with those of nonexperts. We found that all users evaluated CopNet as more effective, easier to use, and easier to navigate than RMS, and this was especially noticeable for users who were not experts with the older system. The second, follow-up study examined use behavior after CopNet was deployed some time later. The findings revealed that evaluations of CopNet were not associated with its use. If the newer system had a better interface and was easier to navigate than the older, legacy system, why were law enforcement personnel reluctant to switch? We discuss reasons why switching to a new system is difficult, especially for those who are most adept at using the older system. Implications for system design and usability are also discussed", "fulltext": "", "keywords": "law enforcement agency;information architecture;graphical user interface;law enforcement users;copnet;rms;legacy text-based system;simplified navigation"} +{"name": "test_2100", "title": "Optimization of planning an advertising campaign of goods and services", "abstract": "A generalization of the mathematical model and operations research problems formulated on its basis, which were presented by Belenky (2001) in the framework of an approach to planning an advertising campaign of goods and services, is considered, and corresponding nonlinear programming problems with linear constraints are formulated", "fulltext": "", "keywords": "or;operations research;optimization;nonlinear programming;advertising campaign planning"} +{"name": "test_2101", "title": "All-optical XOR gate using semiconductor optical amplifiers without additional", "abstract": "input beam The novel design of an all-optical XOR gate by using cross-gain modulation of semiconductor optical amplifiers has been suggested and demonstrated successfully at 10 Gb/s. Boolean AB' and A'B of the two input signals A and B have been obtained and combined to achieve the all-optical XOR gate. 
No additional input beam, such as a clock signal or continuous wave light, is used in this new design, although such a beam is required in other all-optical XOR gates", "fulltext": "", "keywords": "all-optical-xor gate;10 gbit/s;cross-gain modulation;design;boolean logic;semiconductor optical amplifiers"} +{"name": "test_2102", "title": "Trust in online advice", "abstract": "Many people are now influenced by the information and advice they find on the Internet, much of it of dubious quality. This article describes two studies concerned with those factors capable of influencing people's response to online advice. The first study is a qualitative account of a group of house-hunters attempting to find worthwhile information online. The second study describes a survey of more than 2,500 people who had actively sought advice over the Internet. A framework for understanding trust in online advice is proposed in which first impressions are distinguished from more detailed evaluations. Good Web design can influence the first process, but three key factors-source credibility, personalization, and predictability-are shown to predict whether people actually follow the advice given", "fulltext": "", "keywords": "online mortgage advice;internet;survey;source credibility;house buying advice;personalization;predictability;online advice trust;e-commerce;web design"} +{"name": "test_2103", "title": "The social impact of Internet gambling", "abstract": "Technology has always played a role in the development of gambling practices and continues to provide new market opportunities. One of the fastest growing areas is that of Internet gambling. The effect of such technologies should not be accepted uncritically, particularly as there may be areas of potential concern based on what is known about problem gambling offline. This article has three aims. First, it overviews some of the main social concerns about the rise of Internet gambling. Second, it looks at the limited research that has been carried out in this area. Third, it examines whether Internet gambling is doubly addictive, given research that suggests that the Internet can be addictive itself. It is concluded that technological developments in Internet gambling will increase the potential for problem gambling globally, but that many of the ideas and speculations outlined in this article need to be addressed further by large-scale empirical studies", "fulltext": "", "keywords": "technological developments;market opportunities;psychology;electronic cash;internet gambling;social impact;addiction"} +{"name": "test_2104", "title": "Computer-mediated communication and remote management: integration or", "abstract": "isolation? The use of intranets and e-mails to communicate with remote staff is increasing rapidly within organizations. For many companies this is viewed as a speedy and cost-effective way of keeping in contact with staff and ensuring their continuing commitment to company goals. This article highlights the problems experienced by staff when managers use intranets and e-mails in an inappropriate fashion for these purposes. Issues of remoteness and isolation are discussed, along with the reports of frustration and disidentification experienced. However, it will be shown that when used appropriately, communication using these technologies can facilitate shared understanding and help remote staff to view their company as alive and exciting. 
Theoretical aspects are highlighted and the implications of these findings are discussed", "fulltext": "", "keywords": "remote staff;organizations;remote management;companies;cost-effective;e-mails;managers;computer-mediated communication;remoteness;intranets"} +{"name": "test_2105", "title": "Collective action in the age of the Internet: mass communication and online", "abstract": "mobilization This article examines how the Internet transforms collective action. Current practices on the Web bear witness to thriving collective action ranging from persuasive to confrontational, individual to collective, undertakings. Even more influential than direct calls for action is the indirect mobilizing influence of the Internet's powers of mass communication, which is boosted by an antiauthoritarian ideology on the Web. Theoretically, collective action through the otherwise socially isolating computer is possible because people rely on internalized group memberships and social identities to achieve social involvement. Empirical evidence from an online survey among environmental activists and nonactivists confirms that online action is considered an equivalent alternative to offline action by activists and nonactivists alike. However, the Internet may slightly alter the motives underlying collective action and thereby alter the nature of collective action and social movements. Perhaps more fundamental is the reverse influence that successful collective action will have on the nature and function of the Internet", "fulltext": "", "keywords": "internet;mass communication;online mobilization;politics;collective action;group memberships;world wide web;online survey;antiauthoritarian ideology;social identities;anonymity"} +{"name": "test_2106", "title": "Explanations for the perpetration of and reactions to deception in a virtual", "abstract": "community Cases of identity deception on the Internet are not uncommon. Several cases of a revealed identity deception have been reported in the media. The authors examine a case of deception in an online community composed primarily of information technology professionals. In this case, an established community member (DF) invented a character (Nowheremom) whom he fell in love with and who was eventually killed in a tragic accident. When other members of the community eventually began to question Nowheremom's actual identity, DF admitted that he invented her. The discussion board was flooded with reactions to DF's revelation. The authors propose several explanations for the perpetration of identity deception, including psychiatric illness, identity play, and expressions of true self. They also analyze the reactions of community members and propose three related explanations (social identity, deviance, and norm violation) to account for their reactions. It is argued that virtual communities' reactions to such threatening events provide invaluable clues for the study of group processes on the Internet", "fulltext": "", "keywords": "internet;information technology professionals;bulletin boards;virtual community;psychology;identity deception;online community;web sites;social processes;group processes;psychiatric illness"} +{"name": "test_2107", "title": "The effects of asynchronous computer-mediated group interaction on group", "abstract": "processes This article reports a study undertaken to investigate some of the social psychological processes underlying computer-supported group discussion in natural computer-mediated contexts. 
Based on the concept of deindividuation, it was hypothesized that personal identifiability and group identity would be important factors that affect the perceptions and behavior of members of computer-mediated groups. The degree of personal identifiability and the strength of group identity were manipulated across groups of geographically dispersed computer users who took part in e-mail discussions during a 2-week period. The results do not support the association between deindividuation and uninhibited behavior cited in much previous research. Instead, the data provide some support for a social identity perspective of computer-mediated communication, which explains the higher levels of uninhibited behavior in identifiable computer-mediated groups. However, predictions based on social identity theory regarding group polarization and group cohesion were not supported. Possible explanations for this are discussed and further research is suggested to resolve these discrepancies", "fulltext": "", "keywords": "group identity;internet;social identity theory;group cohesion;group polarization;psychology;personal identifiability;deindividuation;asynchronous computer-mediated group interaction;social issues;group processes;e-mail discussions;geographically dispersed computer users"} +{"name": "test_2108", "title": "Online longitudinal survey research: viability and participation", "abstract": "This article explores the viability of conducting longitudinal survey research using the Internet in samples exposed to trauma. A questionnaire battery assessing psychological adjustment following adverse life experiences was posted online. Participants who signed up to take part in the longitudinal aspect of the study were contacted 3 and 6 months after initial participation to complete the second and third waves of the research. Issues of data screening and sample attrition rates are considered and the demographic profiles and questionnaire scores of those who did and did not take part in the study during successive time points are compared. The results demonstrate that it is possible to conduct repeated measures survey research online and that the similarity in characteristics between those who do and do not take part during successive time points mirrors that found in traditional pencil-and-paper trauma surveys", "fulltext": "", "keywords": "internet;sample attrition rates;trauma;data screening;online longitudinal survey research;demographic profiles;questionnaire;world wide web;psychological adjustment;psychology research"} +{"name": "test_2109", "title": "Internet-based psychological experimenting: five dos and five don'ts", "abstract": "Internet-based psychological experimenting is presented as a method that needs careful consideration of a number of issues-from potential data corruption to revealing confidential information about participants. Ten issues are grouped into five areas of actions to be taken when developing an Internet experiment (dos) and five errors to be avoided (don'ts). Dos include: (a) utilizing dropout as a dependent variable, (b) the use of dropout to detect motivational confounding, (c) placement of questions for personal information, (d) using a collection of techniques, and (e) using Internet-based tools. 
Don'ts are about: (a) unprotected directories, (b) public access to confidential data, (c) revealing the experiment's structure, (d) ignoring the Internet's technical variance, and (e) improper use of form elements", "fulltext": "", "keywords": "motivational confounding;web experiment;online research techniques;data corruption;psychology;dropout;unprotected directories;data confidentiality;internet-based psychological experimenting;personal information"} +{"name": "test_211", "title": "Pervasive computing goes to work: interfacing to the enterprise", "abstract": "The paperless office is an idea whose time has come, and come, and come again. To see how pervasive computing applications might bring some substance to this dream, the author spoke recently with key managers and technologists at McKesson Corporation (San Francisco), a healthcare supplier, service, and technology company with US$50 billion in sales last year, and also at AvantGo (Hayward, Calif.), a provider of mobile infrastructure software and services. For the past several years, McKesson has used mobility middleware developed by AvantGo to deploy major supply chain applications with thousands of pervasive clients and multiple servers that replace existing paper-based tracking systems. According to McKesson's managers, their system greatly reduced errors and associated costs caused by redelivery or loss of valuable products, giving McKesson a solid return on its investment", "fulltext": "", "keywords": "multiple servers;pervasive clients;mobile workers;paperless office;enterprise resource planning;data warehousing"} +{"name": "test_2110", "title": "Psychology and the Internet", "abstract": "This article presents an overview of the way that the Internet is being used to assist psychological research and mediate psychological practice. It shows how psychologists are using the Internet to examine the interactions between people and computers, and highlights some of the ways that this research is important to the design and development of useable and acceptable computer systems. In particular, this introduction reviews the research presented at the International Conference on Psychology and the Internet held in the United Kingdom. The final part introduces the eight articles in this special edition. The articles are representative of the breadth of research being conducted on psychology and the Internet: there are two on methodological issues, three on group processes, one on organizational implications, and two on social implications of Internet use", "fulltext": "", "keywords": "organizational implications;internet;human-computer interactions;usability;psychology;social implications;group processes;online research;methodological issues;psychological research"} +{"name": "test_2111", "title": "Extended depth-of-focus imaging of chlorophyll fluorescence from intact leaves", "abstract": "Imaging dynamic changes in chlorophyll a fluorescence provides a valuable means with which to examine localised changes in photosynthetic function. Microscope-based systems provide excellent spatial resolution which allows the response of individual cells to be measured. However, such systems have a restricted depth of focus and, as leaves are inherently uneven, only a small proportion of each image at any given focal plane is in focus. 
In this report we describe the development of algorithms, specifically adapted for imaging chlorophyll fluorescence and photosynthetic function in living plant cells, which allow extended-focus images to be reconstructed from images taken in different focal planes. We describe how these procedures can be used to reconstruct images of chlorophyll fluorescence and calculated photosynthetic parameters, as well as producing a map of leaf topology. The robustness of this procedure is demonstrated using leaves from a number of different plant species", "fulltext": "", "keywords": "intact leaves;minimum fluorescence yield;microscope-based systems;leaf topology map;extended-focus images reconstruction;calculated photosynthetic parameters;numerical aperture;chlorophyll fluorescence;charge-coupled device;primary quinone acceptor;individual cells response;biophysical research technique;extended depth-of-focus imaging;spatial resolution;maximum fluorescence yield;variable fluorescence;plant species;algorithms development"} +{"name": "test_2112", "title": "Allan variance and fractal Brownian motion", "abstract": "Noise filtering is the subject of a voluminous literature in radio engineering. The methods of filtering require knowledge of the frequency response, which is usually unknown. D.W. Allan (see Proc. IEEE, vol.54, no.2, p.221-30, 1966; IEEE Trans. Instr. Measur., vol.IM-36, p.646-54, 1987) proposed a simple method of determining the interval between equally accurate observations which does without this information. In this method, the variances of the increments of noise and signal are equal, so that, in observations with a greater step, the variations caused by noise are smaller than those caused by the signal. This method is the standard accepted by the USA metrology community. The present paper is devoted to a statistical analysis of the Allan method and acquisition of additional information", "fulltext": "", "keywords": "allan variance;frequency response;white noise;fractal brownian motion;noise filtering;statistical analysis;usa metrology community;radio engineering"} +{"name": "test_2113", "title": "Ideal sliding mode in the problems of convex optimization", "abstract": "The characteristics of the sliding mode that appears with using continuous convex-programming algorithms based on the exact penalty functions were discussed. For the case under study, the ideal sliding mode was shown to occur in the absence of infinite number of switchings", "fulltext": "", "keywords": "continuous convex-programming algorithms;convex optimization;exact penalty functions;ideal sliding mode"} +{"name": "test_2114", "title": "Automation of the recovery of efficiency of complex structure systems", "abstract": "Basic features are set forth of the method for automation of the serviceability recovery of systems of complex structures in real time without the interruption of operation. Specific features of the method are revealed in an important example of the system of control of hardware components of ships", "fulltext": "", "keywords": "ships;complex structure systems;efficiency recovery;hardware components;serviceability recovery"} +{"name": "test_2115", "title": "Control of combustion processes in an internal combustion engine by", "abstract": "low-temperature plasma A new method of operation of internal combustion engines enhances power and reduces fuel consumption and exhaust toxicity. 
Low-temperature plasma control combines working processes of thermal engines and steam machines into a single process", "fulltext": "", "keywords": "exhaust toxicity;combustion processes;internal combustion engine;thermal engines;steam machines;fuel consumption;low-temperature plasma;working processes"} +{"name": "test_2116", "title": "Optimization of the characteristics of computational processes in scalable", "abstract": "resources The scalability of resources is taken to mean the possibility of changing in advance the dynamic characteristics of computational processes obtained for a certain basic set of processors and the communication medium, in an effort to optimize the dynamics of software applications. A method is put forward for the generation of optimal strategies, i.e., sets of versions of program execution, on the basis of a vector criterion. The method is relevant for the effective use of resources of computational clusters and metacomputational media and also for dynamic control of processes in real time on the basis of static scaling", "fulltext": "", "keywords": "computational clusters;vector criterion;communication medium;dynamic characteristics;metacomputational media;static scaling;software applications;dynamic control;scalable resources;computational processes;optimal strategies"} +{"name": "test_2117", "title": "The p-p rearrangement and failure-tolerance of double p-ary multirings and", "abstract": "generalized hypercubes It is shown that an arbitrary grouped p-element permutation can be implemented in a conflict-free way through the commutation of channels on the double p-ary multiring or the double p-ary hypercube. It is revealed that in arbitrary single-element permutations, these commutators display the property of the (p-1)-nodal failure-tolerance and the generalized hypercube displays in addition the property of the (p-1)-channel failure-tolerance", "fulltext": "", "keywords": "conflict-free implementation;p-p rearrangement;double p-ary multirings;p-element permutation;commutators;single-element permutations;failure-tolerance;generalized hypercubes"} +{"name": "test_2118", "title": "Solutions for cooperative games", "abstract": "A new concept of the characteristic function is defined. It matches cooperative games far better than the classical characteristic function and is useful in reducing the number of decisions that can be used as the unique solution of a game", "fulltext": "", "keywords": "decisions;characteristic function;cooperative games;transferrable utility;unique solution"} +{"name": "test_2119", "title": "Location of transport nets on a heterogeneous territory", "abstract": "The location of transport routes on a heterogeneous territory is studied. The network joins a given set of terminal points and a certain number of additional (branch) points. The problem is formulated, and properties of the optimal solution for a tree-like network and the number of branch points are studied. A stepwise optimization algorithm for a 
network with given adjacency matrix based on an algorithm for constructing minimal-cost routes is designed", "fulltext": "", "keywords": "adjacency matrix;transport routes;stepwise optimization algorithm;terminal points;branch points;heterogeneous territory;tree-like network;transport nets"} +{"name": "test_212", "title": "Knowledge management-capturing the skills of key performers in the power", "abstract": "industry The growing pressure to reduce the cost of electrical power in recent years has resulted in an enormous \"brain-drain\" within the power industry. A novel approach has been developed by Eskom to capture these skills before they are lost and to incorporate these into a computer-based programme called \"knowledge management\"", "fulltext": "", "keywords": "brain-drain;eskom;key performers;skills capture;knowledge management;computer-based programme;personnel management;south africa;power industry"} +{"name": "test_2120", "title": "Control in active systems based on criteria and motivation", "abstract": "For active systems where the principal varies the agents' goal functions by adding to them appropriately weighted goal functions of other agents or a balanced system of inter-agent transfers, the paper formulated and solved the problems of control based on criteria and motivation. Linear active systems were considered by way of example", "fulltext": "", "keywords": "linear active systems;motivation-based control;inter-agent transfers;criteria-based control;goal functions"} +{"name": "test_2121", "title": "Flexibility analysis of complex technical systems under uncertainty", "abstract": "An important problem in designing technical systems under partial uncertainty of the initial physical, chemical, and technological data is the determination of a design in which the technical system is flexible, i.e., its control system is capable of guaranteeing that the constraints hold even under changes in external and internal factors and application of fuzzy mathematical models in its design. Three flexibility problems, viz., the flexibility of a technical system of given structure, structural flexibility of a technical system, and the optimal design guaranteeing the flexibility of a technical system, are studied. Two approaches to these problems are elaborated. Results of a computation experiment are given", "fulltext": "", "keywords": "structural flexibility;partial uncertainty;flexibility analysis;control system;optimal design;fuzzy mathematical models;complex technical systems"} +{"name": "test_2122", "title": "A fuzzy logic adaptation circuit for control systems of deformable space", "abstract": "vehicles: its design A fuzzy-logic adaptation algorithm is designed for adjusting the discreteness period of a control system for ensuring the stability and quality of control process with regard to the elastic structural vibrations of a deformable space vehicle. 
Its performance is verified by digital modeling of a discrete control system with two objects", "fulltext": "", "keywords": "digital modeling;control systems;discreteness period;fuzzy logic adaptation circuit;stability;elastic structural vibrations;deformable space vehicles"} +{"name": "test_2123", "title": "\"Hidden convexity\" of finite-dimensional stationary linear discrete-time", "abstract": "systems under conical constraints New properties of finite-dimensional linear discrete-time systems under conical control constraints that are similar to the \"hidden convexity\" of continuous-time systems are studied", "fulltext": "", "keywords": "finite-dimensional stationary linear discrete-time systems;hidden convexity;conical constraints;control constraint"} +{"name": "test_2124", "title": "The set of stable polynomials of linear discrete systems: its geometry", "abstract": "The multidimensional stability domain of linear discrete systems is studied. Its configuration is determined from the parameters of its intersection with coordinate axes, coordinate planes, and certain auxiliary planes. Counterexamples for the discrete variant of the Kharitonov theorem are given", "fulltext": "", "keywords": "kharitonov theorem;stable polynomials;linear discrete systems;characteristic polynomial;geometry;multidimensional stability domain"} +{"name": "test_2125", "title": "Stochastic systems with a random jump in phase trajectory: stability of their", "abstract": "motions The probabilistic stability of the perturbed motion of a system with parameters under the action of a general Markov process is studied. The phase vector is assumed to experience random jumps when the structure of the system suffers random jumps. Such a situation is encountered, for example, in the motion of a solid with random jumps in its mass. The mean-square stability of random-structure linear systems and stability of nonlinear systems in the first approximation are studied. The applied approach is helpful in studying the asymptotic probabilistic stability and mean-square exponential stability of stochastic systems through the stability of the respective deterministic systems", "fulltext": "", "keywords": "mean-square exponential stability;stochastic systems;general markov process;asymptotic probabilistic stability;phase trajectory;random jump"} +{"name": "test_2126", "title": "A nonlinear time-optimal control problem", "abstract": "Sufficient conditions for the existence of an optimal control in a time-optimal control problem with fixed ends for a smooth nonlinear control system are formulated. The properties of this system for characterizing the optimal control switching points are studied", "fulltext": "", "keywords": "nonlinear time-optimal control problem;sufficient existence conditions;optimal control switching points;smooth nonlinear control system"} +{"name": "test_2127", "title": "System embedding. Polynomial equations", "abstract": "The class of solutions of the polynomial equations including their generalizations in the form of the Bezout matrix identities was constructed analytically using the technology of constructive system embedding. The structure of a solution depends on the number of steps of the Euclidean algorithm and is obtained explicitly by appropriate substitutions. 
+{"name": "test_2128", "title": "An optimal control algorithm based on reachability set approximation and linearization", "abstract": "The terminal functional of a general control system is refined by studying an analogous problem for a variational system and regularization. A sequential refinement method is designed by combining the local approximation of the reachability set and reduction. The corresponding algorithm has relaxation properties. An illustrative example is given", "fulltext": "", "keywords": "determinate systems;variational system;reachability set approximation;local approximation;optimal control algorithm;regularization;relaxation properties;sequential refinement method;linearization;terminal functional"}
+{"name": "test_2129", "title": "A universal decomposition of the integration range for exponential functions", "abstract": "The problem of determining the independent constants for decomposition of the integration range of exponential functions was solved on the basis of a similar approach to polynomials. The constants obtained enable one to decompose the integration range into two parts so that the integrals over them are equal independently of the function parameters. For the nontrigonometrical polynomials of even functions, an alternative approach was presented", "fulltext": "", "keywords": "polynomials;integration range universal decomposition;nontrigonometrical polynomials;exponential functions;even functions;integration range decomposition"}
+{"name": "test_213", "title": "An application of fuzzy linear regression to the information technology in Turkey", "abstract": "Fuzzy set theory deals with the vagueness of human thought. A major contribution of fuzzy set theory is its capability of representing vague knowledge. Fuzzy set theory is very practical when sufficient and reliable data is not available. Information technology (IT) is the acquisition, processing, storage and dissemination of information in all its forms (auditory, pictorial, textual and numerical) through a combination of computers, telecommunication, networks and electronic devices. IT includes matters concerned with the furtherance of computer science and technology, design, development, installation and implementation of information systems and applications. In the paper, assuming that there are n independent variables and the regression function is linear, the possible levels of information technology (the sale levels of computer equipment) in Turkey are forecasted by using fuzzy linear regression. The independent variables assumed are the import level and the export level of computer equipment", "fulltext": "", "keywords": "information technology;electronic devices;it;telecommunication;computer equipment export level;turkey;vague knowledge representation;fuzzy linear regression;computer science;regression function;computers;information systems;computer technology"}
+{"name": "test_2130", "title": "Synchronizing experiments with linear interval systems", "abstract": "Concerns generalized control problems without exact information. A method of constructing a minimal synchronizing sequence for a linear interval system over the field of real numbers is developed. This problem is reduced to a system of linear inequalities", "fulltext": "", "keywords": "generalized control problems;controllability;linear interval systems;linear inequalities;synchronizing experiments;real numbers;minimal synchronizing sequence construction"}
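Since the test_2130 record reduces the search for a minimal synchronizing sequence to a system of linear inequalities, a small feasibility sketch may clarify the reduction. The scalar system, interval bounds, tolerance, and vertex enumeration below are assumptions of mine, not details from the abstract: for each fixed parameter and initial state the final state is affine in the inputs, so each candidate length T gives a linear-programming feasibility check, and scanning T upward returns the shortest feasible sequence within a horizon.

```python
# Minimal sketch: x_{t+1} = a*x_t + u_t with interval parameter a in [a_lo, a_hi]
# and unknown initial state x_0 in [x_lo, x_hi]. We seek the shortest input
# sequence driving the final state within eps of a target for every extreme
# (a, x_0) pair; each fixed pair makes x_T affine in u, hence linear inequalities.
from itertools import product
from scipy.optimize import linprog
import numpy as np

def synchronizing_input(a_iv, x0_iv, target=0.0, eps=0.05, T_max=6):
    for T in range(1, T_max + 1):
        A_ub, b_ub = [], []
        for a, x0 in product(a_iv, x0_iv):            # vertex systems
            c = np.array([a ** (T - 1 - k) for k in range(T)])
            free = (a ** T) * x0                       # x_T = free + c.dot(u)
            A_ub.append(c);  b_ub.append(target + eps - free)    # upper bound
            A_ub.append(-c); b_ub.append(free - (target - eps))  # lower bound
        res = linprog(np.zeros(T), A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                      bounds=[(None, None)] * T, method="highs")
        if res.success:
            return T, res.x                            # shortest feasible length
    return None

print(synchronizing_input(a_iv=(0.1, 0.2), x0_iv=(-1.0, 1.0)))
```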
+{"name": "test_2131", "title": "Diagnosis of the technical state of heat systems", "abstract": "A step-by-step approach to the diagnosis of the technical state of heat systems is stated. The class of physical defects is supplemented by the behavioral defects of objects, which are related to the disturbance of the modes of their operation. The implementation of the approach is illustrated by an example of the solution of a specific problem of the diagnosis of a closed heat consumption system", "fulltext": "", "keywords": "step-by-step diagnosis;operational mode disturbance;closed heat consumption system diagnosis;heat system technical state diagnosis"}
+{"name": "test_2132", "title": "Fault-tolerant computer-aided control systems with multiversion-threshold", "abstract": "adaptation: adaptation methods, reliability estimation, and choice of an architecture For multiversion majority-redundant computer-aided control systems, systematization of adaptation methods that are stable to hardware and software failures, a method for estimating their reliability from an event graph model, and a method for selecting a standard architecture with regard for reliability requirements are studied", "fulltext": "", "keywords": "architecture;event graph model;fault-tolerant computer-aided control systems;multiversion majority-redundant computer-aided control systems;reliability estimation;software failure stability;multiversion-threshold adaptation;hardware failure stability"}
+{"name": "test_2133", "title": "Nonlockability in multirings and hypercubes at serial transmission of data", "abstract": "blocks For the multiring and hypercube, a method of conflictless realization of an arbitrary permutation of \"large\" data items that can be divided into many \"smaller\" data blocks was considered, and its high efficiency was demonstrated", "fulltext": "", "keywords": "multirings;multiprocessor computer systems;nonlockability;data block serial transmission;hypercubes"}
+{"name": "test_2134", "title": "Linear models of circuits based on the multivalued components", "abstract": "Linearization and planarization of the circuit models is pivotal to the submicron technologies. On the other hand, the characteristics of the VLSI circuits can be sometimes improved by using the multivalued components. It was shown that any l-level circuit based on the multivalued components is representable as an algebraic model based on l linear arithmetic polynomials mapped correspondingly into l decision diagrams that are linear and planar by nature. Complexity of representing a circuit as the linear decision diagram was estimated as O(G) with G for the number of multivalued components in the circuit. The results of testing the LinearDesignMV algorithm on circuits of more than 8000 LGSynth 93 multivalued components were presented", "fulltext": "", "keywords": "circuit representation complexity;linear planar decision diagrams;vlsi circuits;lineardesignmv algorithm;linear circuit model;planarization;lgsynth 93 multivalued components;linear arithmetic polynomials;linearization;submicron technologies"}
+{"name": "test_2135", "title": "A new approach to the problem of structural identification. II", "abstract": "The subject under discussion is a new approach to the problem of structural identification, which relies on the recognition of a decisive role of the human factor in the process of structural identification. Potential possibilities of the suggested approach are illustrated by the statement of a new mathematical problem of structural identification", "fulltext": "", "keywords": "human factor;mathematical equations;structural identification;decision-maker"}
+{"name": "test_2136", "title": "A method of determining a sequence of the best solutions to the problems of", "abstract": "optimization on finite sets and the problem of network reconstruction A method of determining a sequence of the best solutions to the problems of optimization on finite sets was proposed. Its complexity was estimated by a polynomial of the dimension of problem input, given number of sequence terms, and complexity of completing the design of the original extremal problem. The technique developed was applied to the typical problem of network reconstruction with the aim of increasing its throughput under restricted reconstruction costs", "fulltext": "", "keywords": "optimization;best solutions;complexity;network reconstruction;finite sets"}
+{"name": "test_2137", "title": "Stabilization of a linear object by frequency-modulated pulsed signals", "abstract": "A control system consisting of an unstable continuous linear part and a pulse-frequency modulator in the feedback circuit is studied. Conditions for the boundedness of the solutions of the system under any initial data are determined", "fulltext": "", "keywords": "linear stationary object;stabilization;frequency-modulated pulsed signals;control system;discrete systems;feedback circuit;solution boundedness"}
+{"name": "test_2138", "title": "Reachability sets of a class of multistep control processes: their design", "abstract": "An upper estimate and an iterative \"restriction\" algorithm for the reachability set for determining the optimal control for a class of multistep control processes are designed", "fulltext": "", "keywords": "optimal control;multistep control processes;reachability sets;upper estimate;iterative restriction algorithm;discrete systems"}
+{"name": "test_2139", "title": "Generalized confidence sets for a statistically indeterminate random vector", "abstract": "A problem is considered for the construction of confidence sets for a random vector, the information on distribution parameters of which is incomplete. To obtain exact estimates and a detailed analysis of the problem, the notion is introduced of a generalized confidence set for a statistically indeterminate random vector. Properties of generalized confidence sets are studied. It is shown that the standard method of estimation, which relies on the unification of confidence sets, leads in many cases to wider confidence estimates. For a normally distributed random vector with an inaccurately known mean value, generalized confidence sets are built tip and the dependence of sizes of a generalized confidence set on the forms and parameters of a set of possible mean values is examined", "fulltext": "", "keywords": "generalized confidence sets;normally distributed random vector;distribution parameters;statistically indeterminate random vector"}
+{"name": "test_214", "title": "Evolution of the high-end computing market in the USA", "abstract": "This paper focuses on the technological change in the high-end computing market. The discussion combines historical analysis with strategic analysis to provide a framework to analyse a key component of the computer industry. This analysis begins from the perspective of government research and development spending; then examines the confusion around the evolution of the high-end computing market in the context of standard theories of technology strategy and new product innovation. Rather than the high-end market being 'dead', one should view the market as changing due to increased capability and competition from the low-end personal computer market. The high-end market is also responding to new product innovation from the introduction of new parallel computing architectures. In the conclusion, key leverage points in the market are identified and the trends in high-end computing are highlighted with implications", "fulltext": "", "keywords": "low-end personal computer market;supercomputing;new product innovation;parallel computing architectures;historical analysis;government research;strategic analysis;computer industry;competition;development spending;high-end computing market evolution;usa;technology strategy"}
+{"name": "test_2140", "title": "Strong active solution in non-cooperative games", "abstract": "For the non-cooperative games and the problems of accepting or rejecting a proposal, a new notion of equilibrium was proposed, its place among the known basic equilibria was established, and its application to the static and dynamic game problems was demonstrated", "fulltext": "", "keywords": "static game problems;noncooperative games;strong active solution;dynamic game problems"}
+{"name": "test_2141", "title": "System embedding. Control with reduced observer", "abstract": "Two interrelated problems-design of the reduced observer of plant state separately and together with its control system-were considered from the standpoint of designing the multivariable linear systems from the desired matrix transfer functions. The matrix equations defining the entire constructive class of solutions of the posed problems were obtained using the system embedding technology. As was demonstrated, control based on the reduced observer is capable to provide the desired response to the control input, as well as the response to the nonzero initial conditions, only for the directly measurable part of the components of the state vector. An illustrative example was presented", "fulltext": "", "keywords": "reduced observer control;multivariable linear systems;reduced plant state observer design;matrix transfer functions;state vector;system embedding"}
+{"name": "test_2142", "title": "Spectral characteristics of the linear systems over a bounded time interval", "abstract": "Consideration was given to the spectral characteristics of the linear dynamic systems over a bounded time interval. Singular characteristics of standard dynamic blocks, transcendental characteristic equations, and partial spectra of the singular functions were studied. Relationship between the spectra under study and the classical frequency characteristic was demonstrated", "fulltext": "", "keywords": "standard dynamic blocks;linear dynamic systems;partial spectra;spectral characteristics;bounded time interval;frequency characteristic;singular functions;transcendental characteristic equations;singular characteristics"}
+{"name": "test_2143", "title": "Quantum computing with solids", "abstract": "Science and technology could be revolutionized by quantum computers, but building them from solid-state devices will not be easy. The author outlines the challenges in scaling up the technology from lab experiments to practical devices", "fulltext": "", "keywords": "solid-state devices;quantum computers"}
+{"name": "test_2144", "title": "The perils of privacy", "abstract": "The recent string of failures among dotcom companies has heightened fears of privacy abuse. What should happen to the names and addresses on a customer list if these details were obtained under a privacy policy which specified no disclosure to any third party? Should the personal data in the list be deemed to be an asset of a failing company which can be transferred to any future (third party) purchaser for its purposes? Or should the privacy policy take precedence over the commercial concerns of the purchaser?", "fulltext": "", "keywords": "customer list;disclosure;privacy abuse;privacy policy"}
+{"name": "test_2145", "title": "Enterprise in focus at NetSec 2002", "abstract": "NetSec 2002 took place in San Francisco, amid industry reflection on the balance to be struck between combatting cyber-terrorism and safeguarding civil liberties post-9.11. The author reports on the punditry and the pedagogy at the CSI event, focusing on security in the enterprise", "fulltext": "", "keywords": "csi;netsec 2002;enterprise security"}
+{"name": "test_2146", "title": "Trusted...or...trustworthy: the search for a new paradigm for computer and", "abstract": "network security This paper sets out a number of major questions and challenges which include: (a) just what is meant by `trusted' or `trustworthy' systems after 20 years of experience, or more likely, lack of business level experience, with the 'trusted computer system' criteria anyway; (b) does anyone really care about the adoption of international standards for computer system security evaluation by IT product and system manufacturers and suppliers (IS 15408) and, if so, how does it all relate to business risk management anyway (IS 17799); (c) with the explosion of adoption of the microcomputer and personal computer some 20 years ago, has the industry abandoned all that it learnt about security during the `mainframe era'; or - `whatever happened to MULTICS' and its lessons; (d) has education kept up with security requirements by industry and government alike in the need for safe and secure operation of large scale and networked information systems on national and international bases, particularly where Web or Internet-based information services are being proposed as the major `next best thing' in the IT industry; (e) has the `fourth generation' of computer professionals inherited the spirit of information systems management and control that resided by necessity with the last `generation', the professionals who developed and created the applications for shared mainframe and minicomputer systems?", "fulltext": "", "keywords": "fourth generation computer professionals;network security;multics;web;trustworthy systems;internet-based information services;computer security;business risk management;information systems control;it manufacturers;microcomputer;large scale information systems;is 15408;trusted systems;information systems management;is 17799;education;international standards;personal computer"}
+{"name": "test_2147", "title": "Much ado about nothing: Win32.Perrun", "abstract": "JPEG files do not contain any executable code and it is impossible to infect such files. The author takes a look at the details surrounding the Win32.Perrun virus and make clear exactly what it does. The main virus feature is its ability to affect JPEG image files (compressed graphic images) and to spread via affected JPEG files. The virus affects, or modifies, or alters JPEG files but does not \"infect\" them", "fulltext": "", "keywords": "jpeg files;compressed graphic images;win32.perrun;virus"}
+{"name": "test_2148", "title": "Information security policy - what do international information security", "abstract": "standards say? One of the most important information security controls, is the information security policy. This vital direction-giving document is, however, not always easy to develop and the authors thereof battle with questions such as what constitutes a policy. This results in the policy authors turning to existing sources for guidance. One of these sources is the various international information security standards. These standards are a good starting point for determining what the information security policy should consist of, but should not be relied upon exclusively for guidance. Firstly, they are not comprehensive in their coverage and furthermore, tending to rather address the processes needed for successfully implementing the information security policy. It is far more important the information security policy must fit in with the organisation's culture and must therefore be developed with this in mind", "fulltext": "", "keywords": "information security policy;international information security standards"}
+{"name": "test_2149", "title": "Security crisis management - the basics", "abstract": "Of the more pervasive problems in any kind of security event is how the security event is managed from the inception to the end. There's a lot written about how to manage a specific incident or how to deal with a point problem such as a firewall log, but little tends to be written about how to deal with the management of a security event as part of corporate crisis management. This article discusses the basics of security crisis management and of the logical steps required to ensure that a security crisis does not get out of hand", "fulltext": "", "keywords": "security crisis management;firewall log;security event;corporate crisis management"}
+{"name": "test_215", "title": "A conceptual framework for evaluation of information technology investments", "abstract": "The decision to acquire a new information technology poses a number of serious evaluation and selection problems to technology managers, because the new system must not only meet current information requirements of the organisation, but also the needs for future expansion. Tangible and intangible benefits factors, as well as risks factors, must be identified and evaluated. The paper provides a review of ten major evaluation categories and available models, which fall under each category, showing their advantages and disadvantages in handling the above difficulties. This paper describes strategic implications involved in the selection decision, and the inherent difficulties in: (1) choosing or developing a model, (2) obtaining realistic inputs for the model, and (3) making tradeoffs among the conflicting factors. It proposes a conceptual framework to help the decision maker in choosing the most appropriate methodology in the evaluation process. It also offers a new model, called GAHP, for the evaluation problem combining integer goal linear programming and analytic hierarchy process (AHP) in a single hybrid multiple objective multi-criteria model. A goal programming methodology, with zero-one integer variables and mixed integer constraints, is used to set goal target values against which information technology alternatives are evaluated and selected. AHP is used to structure the evaluation process providing pairwise comparison mechanisms to quantify subjective, nonmonetary, intangible benefits and risks factors, in deriving data for the model. A case illustration is provided showing how GAHP can be formulated and solved", "fulltext": "", "keywords": "risks factors;evaluation categories;decision maker;intangible benefits;goal programming methodology;analytic hierarchy process;tradeoffs;information requirements;group decision process;information technology investments;technology managers;hybrid multiple objective multi-criteria model;information technology alternatives;pairwise comparison mechanisms;goal target values;mixed integer constraints;nonmonetary benefits;selection decision;zero-one integer variables"}
+{"name": "test_2150", "title": "Data storage: re-format. Closely tracking a fast-moving sector", "abstract": "In the past few years the data center market has changed dramatically, forcing many companies into consolidation or bankruptcy. Gone are the days when companies raised millions of dollars to acquire large industrial buildings and transform them into glittering, high-tech palaces filled with the latest telecommunication and data technology. Whereas manufacturers of communication technology deliver the racked equipment in these, often mission-critical, facilities, ABB focuses mainly on the building infrastructure. Besides the very important redundant power supply, ABB also provides the redundant air conditioning and the security system", "fulltext": "", "keywords": "building infrastructure;security system;engineering management;data centers;commissioning;redundant power supply;mission-critical facilities;installation;building management;abb;project management;redundant air conditioning"}
+{"name": "test_2151", "title": "Industrial/sup IT/ for performance buildings", "abstract": "ABB has taken a close look at how buildings are used and has come up with a radical solution for the technical infrastructure that places the end-user's processes at the center and integrates all the building's systems around their needs. The new solution is based on the realization that tasks like setting up an office meeting, registering a hotel guest or moving a patient in a hospital, can all benefit from the same Industrial IT concepts employed by ABB to optimize manufacturing, for example in the automotive industry", "fulltext": "", "keywords": "building systems integration;building management system;technical infrastructure;industrial/sup it/;abb;industrial it concepts"}
+{"name": "test_2152", "title": "Virtual engineering office: a state-of-the-art platform for engineering", "abstract": "collaboration A sales force in Latin America, the design department in Europe, and production in Asia? Arrangements of this kind are the new business reality for today's global manufacturing companies. But how are such global operations to be effectively coordinated? ABB's answer was to develop and implement a new platform for high-performance, real-time collaboration. Globally distributed engineering teams can now work together, regardless of time, location or the CAD system they use, making ABB easier to do business with, for customers as well as suppliers", "fulltext": "", "keywords": "cad system;globally distributed engineering teams;business;engineering collaboration platform;state-of-the-art;abb;virtual engineering office;global manufacturing companies"}
+{"name": "test_2153", "title": "Post-haste. 100th robotic containerization system installed in US mail sorting", "abstract": "center Spot welding, machine tending, material handling, picking, packing, painting, palletizing, assembly...the list of tasks being performed by ABB robots keeps on growing. Adding to this portfolio is a new robot containerization system (RCS) that ABB developed specifically for the United States Postal Service (USPS). The RCS has brought new levels of speed, accuracy, efficiency and productivity to the process of sorting and containerizing mail and packages. Recently, the 100th ABB RCS was installed at the USPS processing and distribution center in Columbus, Ohio", "fulltext": "", "keywords": "mail sorting center;mail sorting;usa;robotic containerization system;abb robots;packages sorting;united states postal service"}
+{"name": "test_2154", "title": "Optimize/sup IT/ robot condition monitoring tool", "abstract": "As robots have gained more and more 'humanlike' capability, users have looked increasingly to their builders for ways to measure the critical variables-the robotic equivalent of a physical check-up-in order to monitor their condition and schedule maintenance more effectively. This is all the more essential considering the tremendous pressure there is to improve productivity in today's global markets. Developed for ABB robots with an S4-family controller and based on the company's broad process know-how, Optimize/sup IT/ robot condition monitoring offers maintenance routines with embedded checklists that give a clear indication of a robot's operating condition. It performs semi-automatic measurements that support engineers during trouble-shooting and enable action to be taken to prevent unplanned stops. By comparing these measurements with reference data, negative trends can be detected early and potential breakdowns predicted. Armed with all these features, Optimize/sup IT/ robot condition monitoring provides the ideal basis for reliability-centered maintenance (RCM) for robots", "fulltext": "", "keywords": "condition monitoring;maintenance scheduling;semi-automatic measurements;reliability-centered maintenance;s4-family controller;optimize/sup it/ robot condition monitoring tool;abb robots"}
+{"name": "test_2156", "title": "Pane relief. Robotic solutions for car windshield assembly", "abstract": "Just looking through a car's windshield doesn't give us much reason to wonder about how it's made. The idea that special manufacturing expertise might be required can hardly occur to anyone, but that's exactly what is needed to ensure crystal-clear visibility, not to mention a perfect fit every time one is pressed into place on a car production line. Comprising two thin glass sheets joined by a vinyl interlayer, windshields are assembled-usually manually-to very precise product and environmental specifications. To make sure this is done as perfectly as possible, the industry invests heavily in the equipment used for their fabrication. ABB has now developed a robot-based Compact Assembling System for the automatic assembly of laminated windshields that speeds up production and increases cost efficiency", "fulltext": "", "keywords": "cost efficiency;car production line;compact assembling system;abb;manufacturing expertise;production;car windshield assembly robots;laminated windshields assembly automation"}
+{"name": "test_2157", "title": "Shaping the future. BendWizard: a tool for off-line programming of robotic", "abstract": "tending systems Setting up a robot to make metal cabinets or cases for desktop computers can be a complex operation. For instance, one expert might be required to carry out a feasibility study, and then another to actually program the robot. Understandably, the need for so much expertise, and the time that's required, generally limits the usefulness of automation to high-volume production. Workshops producing parts in batches smaller than 50 or so, or which rely heavily on semiskilled operators, are therefore often discouraged from investing in automation, and so miss out on its many advantages. What is needed is a software tool that operators without special knowledge of robotics, or with no more than rudimentary CAD skills, can use. One which allows easy offline programming and simulation of the work cell on a PC", "fulltext": "", "keywords": "work cell simulation;feasibility study;workshops;bendwizard offline programming tool;cad skills;metal cabinets;robotic tending systems;desktop computer cases;high-volume production"}
+{"name": "test_2158", "title": "Press shop. Industrial IT solutions for the press shop", "abstract": "Globalization of the world's markets is challenging the traditional limits of manufacturing efficiency. The competitive advantage belongs to those who understand the new requirements and opportunities, and who commit to integrated solutions that span the value chain all the way from demand to production. ABB's automation and IT expertise and the process know-how gained from its long involvement with the automotive industry, have been brought together in new, state-of-the-art software solutions for press shops. Integrated into Industrial IT architecture, they allow the full potential of the shops to be realized, with advantages at every step in the supply chain", "fulltext": "", "keywords": "car manufacturing business;supply chain;industrial it solutions;automation;state-of-the-art;software solutions;press shops;market globalisation;manufacturing efficiency"}
+{"name": "test_2159", "title": "Real-time enterprise solutions for discrete manufacturing and consumer goods", "abstract": "Customer satisfaction and a focus on core competencies have dominated the thinking of a whole host of industries in recent years. However, one outcome, the outsourcing of noncore activities, has made the production of goods-from order entry to final delivery-more and more complex. Suppliers, subsuppliers, producers and customers are therefore busy adopting a new, more collaborative approach. This is mainly taking the form of order-driven planning and scheduling of production, but it is also being steered by a need to reduce inventories and working capital as well as a desire to increase throughput and optimize production", "fulltext": "", "keywords": "customer satisfaction;order-driven planning;discrete manufacturing;consumer goods;real-time enterprise solutions;production scheduling;working capital reduction;inventories reduction;core competencies"}
+{"name": "test_216", "title": "Extinction cross sections of realistic raindrops: data-bank established using", "abstract": "T-matrix method and nonlinear fitting technique A new computer program is developed based on the T-matrix method to generate a large number of total (extinction) cross sections (TCS) values of the realistic raindrops that are deformed due to a balance of the forces that act on a drop failing under gravity, and were described in shape by Pruppacher and Pitter (1971). These data for various dimensions of the raindrops (mean effective radius from 0 to 3.25 mm), frequencies (10 to 80 GHz), (horizontal and vertical) polarizations, and temperatures (0, 10 and 20 degrees C) are stored to establish a data bank. Furthermore, a curve fitting technique, i.e., interpolation of order 3, is implemented for the TCS values in the data bank. Therefore, the interpolated TCS results can be obtained readily from the interpolation process with negligible or even null computational time and efforts. Error analysis is carried out to show the high accuracy of the present analysis and applicability of the interpolation. At three operating frequencies of 15, 21.225, and 38 GHz locally used in Singapore, some new TCS values are obtained from the new fast and efficient interpolation with a good accuracy", "fulltext": "", "keywords": "singapore;extinction cross sections;computer program;error analysis;t-matrix method;10 to 80 ghz;38 ghz;shf;realistic raindrops;electromagnetic wave scattering;em wave scattering;21.225 ghz;operating frequencies;mean effective radius;data-bank;temperature;gravity;0 to 3.25 mm;15 ghz;10 c;20 c;horizontal polarization;interpolation;0 c;ehf;nonlinear curve fitting technique;vertical polarization;total cross sections"}
+{"name": "test_2160", "title": "Challenges and trends in discrete manufacturing", "abstract": "Over 50 years ago, the 100,000 workers at Ford's Rouge automobile factory turned out 1200 cars per day. Nowadays, Ford's plant on that same site still produces 800 cars each day but with just 3000 workers. Similar stories abound in the manufacturing industries; technology revolution and evolution; a shift from vertical integration, better business and production practices and improved industrial relations-all have changed manufacturing beyond recognition. So what are the current challenges and trends in manufacturing? Certainly, the relentless advance of technology will continue, as will user pressure for more customized design or improved environmental friendliness. Some trends are already with us and more, as yet indiscernible, will come. But one major, fundamental shift now resounding throughout industry is the way in which information involving every single aspect of the manufacturing process is being integrated into one seamless system", "fulltext": "", "keywords": "discrete manufacturing;technology revolution;challenges;automobile factory;seamless manufacturing process;industrial relations;trends;production practices;technology evolution;business practices"}
+{"name": "test_2161", "title": "On the Beth properties of some intuitionistic modal logics", "abstract": "Let L be one of the intuitionistic modal logics. As in the classical modal case, we define two different forms of the Beth property for L, which are denoted by B1 and B2; in this paper we study the relation among B1, B2 and the interpolation properties C1 and C2. It turns out that C1 implies B1, but contrary to the boolean case, is not equivalent to B1. It is shown that B2 and C2 are independent, and moreover it comes out that, in contrast to classical case, there exists an extension of the intuitionistic modal logic of S/sub 4/-type, that has not the property B2. Finally we give two algebraic properties, that characterize respectively B1 and B2", "fulltext": "", "keywords": "intuitionistic modal logics;interpolation properties;beth properties"}
+{"name": "test_2162", "title": "More constructions for Boolean algebras", "abstract": "We construct Boolean algebras with prescribed behaviour concerning depth for the free product of two Boolean algebras over a third, in ZFC using pcf; assuming squares we get results on ultraproducts. We also deal with the family of cardinalities and topological density of homomorphic images of Boolean algebras (you can translate it to topology-on the cardinalities of closed subspaces); and lastly we deal with inequalities between cardinal invariants, mainly d(B)/sup kappa /<|B| implies ind(B)>/sup kappa /V Depth(B)>or=log(|B|)", "fulltext": "", "keywords": "ultraproducts;boolean algebras;free product;prescribed behaviour;zfc;cardinal invariants;homomorphic images"}
+{"name": "test_2163", "title": "IT as a key enabler to law firm competitiveness", "abstract": "Professional services firms have traditionally been able to thrive in virtually any market conditions. They have been consistently successful for several decades without ever needing to reexamine or change their basic operating model. However, gradual but inexorable change in client expectations and the business environment over recent years now means that more of the same is no longer enough. In future, law firms will increasingly need to exploit IT more effectively in order to remain competitive. To do this, they will need to ensure that all their information systems function as an integrated whole and are available to their staff, clients and business partners. The authors set out the lessons to be learned for law firms in the light of the recent PA Consulting survey", "fulltext": "", "keywords": "law firms;professional services firms;information systems;client expectations;business environment"}
+{"name": "test_2164", "title": "Electronic signatures - much ado?", "abstract": "Whilst the market may be having a crisis of confidence regarding the prospects for e-commerce, the EU and the Government continue apace to develop the legal framework. Most recently, this has resulted in the Electronic Signatures Regulations 2002. These Regulations were made on 13 February 2002 and came into force on 8 March 2002. The Regulations implement the European Electronic Signatures Directive (1999/93/EC). Critics may say that the Regulations were implemented too late (they were due to have been implemented by 19 July 2001), with too short a consultation period (25 January 2002 to 12 February 2002) and with an unconvincing case as to what they add to English law (as to which, read on). The author explains the latest development on e-signatures and the significance of Certification Service Providers (CSPs)", "fulltext": "", "keywords": "electronic signatures regulations 2002;e-commerce;legal framework;european electronic signatures directive"}
+{"name": "test_2165", "title": "Naomi Campbell: drugs, distress and the Data Protection Act", "abstract": "In the first case of its kind, Naomi Campbell successfully sued Mirror Group Newspapers for damage and distress caused by breach of the Data Protection Act 1998. Partner N. Wildish and assistant M. Turle of City law firm Field Fisher Waterhouse discuss the case and the legal implications of which online publishers should be aware", "fulltext": "", "keywords": "drugs;naomi campbell;data protection act;online publishers;distress"}
+{"name": "test_2166", "title": "Don't always believe what you Reed [optimisation techniques for Web sites and", "abstract": "trade mark infringement] On 20 May 2002, Mr Justice Pumfrey gave judgment in the case of (1) Reed Executive Plc (2) Reed Solutions Plc versus (1) Reed Business Information Limited (2) Reed Elsevier (UK) Limited (3) totaljobs.com Limited. The case explored for the first time in any detail the extent to which the use of various optimisation techniques for Web sites could give rise to new forms of trade mark infringement and passing off. The author reports on the case and offers his comments", "fulltext": "", "keywords": "reed executive plc;reed solutions plc;trade mark infringement;optimisation techniques;totaljobs.com limited;reed business information limited;passing off;web sites;reed elsevier (uk) limited"}
+{"name": "test_2167", "title": "Finally! some sensible European legislation on software", "abstract": "The European Commission has formally tabled a draft Directive on the Protection by Patents of Computer-Implemented Inventions. The aim of this very important Directive is to harmonise national patent laws relating to inventions using software. It follows an extensive consultation launched by the Commission in October 2000. The impetus behind the Directive was the recognition at EU level of a total lack of unity between the European Patent Office and European national courts in deciding what was or was not deemed patentable when it came to the subject of computer programs", "fulltext": "", "keywords": "computer programs;european patent office;law harmonisation;directive on the protection by patents of computer-implemented inventions;european commission;eu;national patent laws;national courts"}
+{"name": "test_2168", "title": "Cyberobscenity and the ambit of English criminal law", "abstract": "The author looks at a recent case and questions the Court of Appeal's approach. In the author's submission, the Court of Appeal's decision in Perrin was wrong. P published no material in England and Wales, and should not have been convicted of any offence under English law, even if it were proved that he sought to attract English subscribers to his site. That may be an unpalatable conclusion but, if the content of foreign-hosted Internet sites is to be controlled, the only sensible way forward is through international agreement and cooperation. The Council of Europe's Cybercrime Convention provides some indication of the limited areas over which widespread international agreement might be achieved", "fulltext": "", "keywords": "court of appeal;cybercrime convention;council of europe;criminal law;cyberobscenity;england;international agreement;internet sites"}
+{"name": "test_2169", "title": "E-government", "abstract": "The author provides an introduction to the main issues surrounding E-government modernisation and electronic delivery of all public services by 2005. The author makes it clear that E-government is about transformation, not computers and hints at the special legal issues which may arise", "fulltext": "", "keywords": "public services;legal issues;electronic delivery;e-government;modernisation"}
+{"name": "test_217", "title": "Vendor qualifications for IT staff and networking", "abstract": "In some cases, vendor-run accreditation schemes can offer an objective measure of a job applicant's skills, but they do not always indicate the true extent of practical abilities", "fulltext": "", "keywords": "job applicant;vendor-run accreditation schemes;it staff;network administrators;practical abilities"}
+{"name": "test_2170", "title": "Evolution of litigation support systems", "abstract": "For original paper see ibid., vol. 12, no. 6: \"The E-mail of the Species\". The author responds to that paper and argues that printing, scanning and imaging E-mails or other electronic (rather than paper) documents prior to listing and disclosure seems to be unnecessary, not 'proportionate' (from a costs point of view) and not particularly helpful, to either side. He asks how litigation support systems might evolve to help and support the legal team in their task", "fulltext": "", "keywords": "litigation support systems;legal team;e-mail"}
+{"name": "test_2171", "title": "Evicting orang utans from the office [electronic storage of legal files]", "abstract": "Having espoused the principle of the paperless office some time ago, we decided to apply it to our stored files. First we consulted the Law Society rules governing storage of files on electronic media. The next step was for us to draw up a protocol for scanning the files. The benefits of the exercise have been significant. The area previously used for storage has been freed for other use. Files are now available online, instantaneously. When we have needed to send out files to the client or following a change of solicitor, we have been able to do so almost immediately, by E-mail, retaining a copy for our future reference. The files are protected from loss or deterioration, back-up copies having been taken which are stored off site. The complete stored file archive can be put in your pocket (in CD-ROM format) or on a laptop, facilitating remote working", "fulltext": "", "keywords": "legal files;electronic storage;cd-rom;law society rules;paperless office;file scanning;file archive"}
+{"name": "test_2172", "title": "Electronic data exchange for real estate", "abstract": "With HM Land Registry's consultation now underway, no one denies that the property industry is facing a period of unprecedented change. PISCES (Property Information Systems Common Exchange) is a property-focused electronic data exchange standard. The standard is a set of definitions and rules to facilitate electronic transfer of data between key business areas and between different types of software packages that are used regularly by the property industry. It is not itself a piece of software but an enabling technology that allows software providers to prepare solutions within their own packages to transfer data between databases. This provides the attractive prospect of seamless transfer of data within and between systems and organisations", "fulltext": "", "keywords": "property information systems common exchange;pisces;software packages;databases;electronic data exchange;seamless transfer;hm land registry;property industry;standard"}
+{"name": "test_2173", "title": "E-mail and the legal profession", "abstract": "The widespread use of E-mail can be found in all areas of commerce, and the legal profession is one that has embraced this new medium of communication. E-mail is not without its drawbacks, however. Due to the nature of the technologies behind the medium, it is a less secure form of communication than many of those traditionally used by the legal profession, including DX, facsimile, and standard and registered post. There are a number of ways in which E-mails originating from the practice may be protected, including software encryption, hardware encryption and various methods of controlling and administering access to the E-mails", "fulltext": "", "keywords": "access control;e-mail;software encryption;legal profession;hardware encryption;secure communication"}
+{"name": "test_2174", "title": "Spam solution?", "abstract": "The author describes a solution to spam E-mails: disposable E-mail addresses (DEA). Mailshell's free trial Web-based E-mail service allows you, if you start getting spammed on that DEA, just to delete the DEA in Mailshell, and all E-mail thereafter sent to that address will automatically be junked (though you can later restore that address if you want). Mailshell allows any number of DEA", "fulltext": "", "keywords": "mailshell;web-based e-mail;spam e-mails;disposable e-mail addresses"}
+{"name": "test_2175", "title": "7 key tests in choosing your Web site firm", "abstract": "Most legal firms now have a Web site and are starting to evaluate the return on their investment. The paper looks at factors involved when choosing a firm to help set up or improve a Web site. (1) Look for a company that combines technical skills and business experience. (2) Look for a company that offers excellent customer service. (3) Check that the Web site firm is committed to developing and proactively updating the Web site. (4) Make sure the firm has a proven track record and a good portfolio. (5) Look for a company with both a breadth as well as depth of skills. (6) Make sure the firm can deliver work on target, in budget and to specification. (7) Ensure that you will enjoy working and feel comfortable with the Web site firm staff", "fulltext": "", "keywords": "legal firms;proactive updating;business experience;customer service;technical skills;return on investment;web site"}
+{"name": "test_2176", "title": "Why your Web strategy is, err, wrong", "abstract": "An awkward look at a few standard views from the author, who thinks that most people have got it, err, wrong. Like every other investment, when the time comes to sign the contract, the question that should be asked is not whether it is a good investment, but whether it is the best investment the firm can make with the money. the author argues that he would be surprised if any law firm Web site he has seen yet would jump that particular hurdle", "fulltext": "", "keywords": "law firm web site;web strategy"}
+{"name": "test_2177", "title": "A humanist's legacy in medical informatics: visions and accomplishments of", "abstract": "Professor Jean-Raoul Scherrer The objective is to report on the work of Prof. Jean-Raoul Scherrer, and show how his humanist vision, medical skills and scientific background have enabled and shaped the development of medical informatics over the last 30 years. Starting with the mainframe-based patient-centred hospital information system DIOGENE in the 70s, Prof. Scherrer developed, implemented and evolved innovative concepts of man-machine interfaces, distributed and federated environments, leading the way with information systems that obstinately focused on the support of care providers and patients. Through a rigorous design of terminologies and ontologies, the DIOGENE data would then serve as a basis for the development of clinical research, data mining, and lead to innovative natural language processing techniques. In parallel, Prof. Scherrer supported the development of medical image management, ranging from a distributed picture archiving and communication systems (PACS) to molecular imaging of protein electrophoreses. Recognizing the need for improving the quality and trustworthiness of medical information of the Web, Prof. Scherrer created the Health-On-the Net (HON) foundation. These achievements, made possible thanks to his visionary mind, deep humanism, creativity, generosity and determination, have made of Prof. Scherrer a true pioneer and leader of the human-centered, patient-oriented application of information technology for improving healthcare", "fulltext": "", "keywords": "data mining;internet;medical informatics;medical image management;professor jean-raoul scherrer;distributed systems;natural language processing;federated systems;mainframe based patient centered hospital information system;diogene system;pacs;man-machine interfaces"}
+{"name": "test_2178", "title": "Medicine in the 21 st century: global problems, global solutions", "abstract": "The objectives are to discuss application areas of information, technology in medicine and health care on the occasion of the opening of the Private Universitat fur Medizinische Informatik and Technik Tirol/University for Health Informatics and Technology Tyrol (LIMIT) at Innsbruck, Tyrol, Austria. Important application areas of information technology in medicine and health are appropriate individual access to medical knowledge, new engineering developments such as new radiant imaging methods and the implantable pacemaker/defibrillator devices, mathematical modeling for understanding the workings of the human body, the computer-based patient record, as well as new knowledge in molecular biology, human genetics, and biotechnology. Challenges and responsibilities for medical informatics research include medical data privacy and intellectual property rights inherent in the content of the information systems", "fulltext": "", "keywords": "radiant imaging methods;medicine;information technology;health care;biotechnology;engineering developments;human genetics;computer-based patient record;intellectual property rights;individual medical knowledge access;mathematical modeling;medical data privacy;human body;implantable pacemaker devices;implantable defibrillator devices;information systems;molecular biology;medical informatics research"}
+{"name": "test_2179", "title": "Guidelines, the Internet, and personal health: insights from the Canadian", "abstract": "HEALNet experience The objectives are to summarize the insights gained in collaborative research in a Canadian Network of Centres of Excellence, devoted to the promotion of evidence-based practice, and to relate this experience to Internet support of health promotion and consumer health informatics. A subjective review of insights is undertaken. Work directed the development of systems incorporating guidelines, care maps, etc., for use by professionals met with limited acceptance. Evidence-based tools for health care consumers are a desirable complement but require radically different content and delivery modes. In addition to evidence-based material offered by professionals, a wide array of Internet-based products and services provided by consumers for consumers emerged and proved a beneficial complement. The consumer-driven products and services provided via the Internet are a potentially important and beneficial complement of traditional health services. They affect the health consumer-provider roles and require changes in healthcare practices", "fulltext": "", "keywords": "consumer health informatics;evidence-based practice;personal health;health consumer-provider roles;canadian network of centres of excellence;collaborative research;health promotion;internet support"}
+{"name": "test_218", "title": "ISCSI poised to lower SAN costs", "abstract": "IT managers building storage area networks or expanding their capacity may be able to save money by using iSCSI and IP systems rather than Fibre Channel technologies", "fulltext": "", "keywords": "iscsi;ip systems;san costs;storage area networks"}
+{"name": "test_2180", "title": "Standard protocol for exchange of health-checkup data based on SGML: the", "abstract": "Health-checkup Data Markup Language (HDML) The objectives are to develop a health/medical data interchange model for efficient electronic exchange of data among health-checkup facilities. A Health-checkup Data Markup Language (HDML) was developed on the basis of the Standard Generalized Markup Language (SGML), and a feasibility study carried out, involving data exchange between two health checkup facilities. The structure of HDML is described. The transfer of numerical lab data, summary findings and health status assessment was successful. HDML is an improvement to laboratory data exchange. Further work has to address the exchange of qualitative and textual data", "fulltext": "", "keywords": "health status assessment;data interchange model;sgml;numerical lab data;health-checkup data markup language;health checkup data exchange;summary findings"}
+{"name": "test_2181", "title": "Development of a health guidance support system for lifestyle improvement", "abstract": "The objective is to provide automated advice for lifestyle adjustment based on an assessment of the results of a questionnaire and medical examination or health checkup data. A system was developed that gathers data based on questions regarding weight gain, exercise, smoking, sleep, eating habits, salt intake, animal fat intake, snacks, alcohol, and oral hygiene, body mass index, resting blood pressure, fasting blood sugar, total cholesterol, triglycerides, uric acid and liver function tests. Based on the relationships between the lifestyle data and the health checkup data, a health assessment sheet was generated for persons being allocated to a multiple-risk factor syndrome group. Health assessment and useful advice for lifestyle improvement were automatically extracted with the system, toward the high risk group for life style related diseases. The system is operational. In comparison with conventional, limited advice methods, we developed a practical system that defined the necessity for lifestyle improvement more clearly, and made giving advice easier", "fulltext": "", "keywords": "liver function tests;oral hygiene;sleep;salt intake;alcohol;snacks;fasting blood sugar;health checkup data;health guidance support system;smoking;eating habits;medical examination;lifestyle improvement;total cholesterol;weight gain;body mass index;uric acid;animal fat intake;questionnaire;resting blood pressure;triglycerides;exercise"}
+{"name": "test_2182", "title": "Organization design: The continuing influence of information technology", "abstract": "Drawing from an information processing perspective, this paper examines how information technology (IT) has been a catalyst in the development of new forms of organizational structures. The article draws a historical linkage between the relative stability of an organization's task environment starting after the Second World War to the present environmental instability that now characterizes many industries. Specifically, the authors suggest that advances in IT have enabled managers to adapt existing forms and create new models for organizational design that better fit requirements of an unstable environment. Time has seemingly borne out this hypothesis as the bureaucratic structure evolved to the matrix to the network and now to the emerging shadow structure. IT has gone from a support mechanism to a substitute for organizational structures in the form of the shadow structure. The article suggests that the evolving and expanding role of IT will continue for organizations that face unstable environments", "fulltext": "", "keywords": "environmental instability;information technology;information processing perspective;organization task environment;organization design;organizational structures"}
+{"name": "test_2183", "title": "Knowledge-based structures and organisational commitment", "abstract": "Organisational commitment, the emotional attachment of an employee to the employing organisation, has attracted a substantial body of literature, relating the concept to various antecedents, including organisational structure, and to a range of consequences, including financially important performance factors such as productivity and staff turnover. The new areas of knowledge management and learning organisations offer substantial promise as imperatives for the organisation of business enterprises. As organisations in the contemporary environment adopt knowledge-based structures to improve their competitive position, there is value in examining these structures against other performance related factors. Theoretical knowledge-based structures put forward by R. Miles et al. (1997) and J. Quinn et al. (1996) and an existing implementation are examined to determine common features inherent in these approaches. These features are posited as a typical form and their impact on organisational commitment and hence on individual and organisational performance is examined", "fulltext": "", "keywords": "knowledge-based structures;performance factors;emotional attachment;productivity;staff turnover;organisational commitment;earning organisations"}
+{"name": "test_2184", "title": "The evolution of information systems: Their impact on organizations and", "abstract": "structures Information systems and organization structures have been highly interconnected with each other. Over the years, information systems architectures as well as organization structures have evolved from centralized to more decentralized forms. This research looks at the evolution of both information systems and organization structures. In the process, it looks into the impact of computers on organizations, and examines the ways organization structures have changed, in association with changes in information system architectures. It also suggests logical linkages between information system architectures and their \"fit\" with certain organization structures and strategies. It concludes with some implications for emerging and future organizational forms, and provides a quick review of the effect of the Internet on small businesses traditionally using stand-alone computers", "fulltext": "", "keywords": "information systems evolution;information system architectures"}
+{"name": "test_2185", "title": "In search of a general enterprise model", "abstract": "Many organisations, particularly SMEs, are reluctant to invest time and money in models to support decision making. Such reluctance could be overcome if a model could be used for several purposes rather than using a traditional \"single perspective\" model. This requires the development of a \"general enterprise model\" (GEM), which can be applied to a wide range of problem domains with unlimited scope. Current enterprise modelling frameworks only deal effectively with nondynamic modelling issues whilst dynamic modelling issues have traditionally only been addressed at the operational level. Although the majority of research in this area relates to manufacturing companies, the framework for a GEM must be equally applicable to service and public sector organisations. The paper identifies five key design issues that need to be considered when constructing a GEM. A framework for such a GEM is presented based on a \"plug and play\" methodology and demonstrated by a simple case study", "fulltext": "", "keywords": "problem domains;case study;enterprise modelling frameworks;decision making;public sector organisations;dynamic modelling issues;business process re-engineering;single perspective model;general enterprise model;operational level;gem;smes;service sector organisations;plug and play methodology"}
+{"name": "test_2186", "title": "Strategies for high throughput, templated zeolite synthesis", "abstract": "The design and redesign of high throughput experiments for zeolite synthesis are addressed. A model that relates materials function to the chemical composition of the zeolite and the structure directing agent is introduced. Using this model, several Monte Carlo-like design protocols are evaluated. Multi-round protocols are bound to be effective, and strategies that use a priori information about the structure-directing libraries are found to be the best", "fulltext": "", "keywords": "figure of merit;materials function;material discovery;a priori information;phase-dependent random gaussian variables;catalytic activity;reflecting boundary conditions;metropolis-type method;templated zeolite synthesis;random energy model;catalytic selectivity;organo-cation template molecules;ligand libraries;structure directing agent;chemical composition;voronoi diagram;monte carlo-like design protocols;multi-round protocols;combinatorial methods;high throughput strategies;small molecule design"}
+{"name": "test_2187", "title": "Variable structure intelligent control for PM synchronous servo motor drive", "abstract": "The variable structure control (VSC) of discrete time systems based on intelligent control is presented in this paper. A novel approach is proposed for the state estimation. A linear observer is firstly designed. Then a neural network is used for compensating uncertainty. The parameter of the VSC scheme is adjusted online by a neural network. Practical operating results from a PM synchronous motor (PMSM) illustrate the effectiveness and practicability of the proposed approach", "fulltext": "", "keywords": "uncertainty compensation;state estimation;neural network;control design;pm synchronous servo motor drive;variable structure intelligent control;discrete time systems;control performance;linear observer"}
+{"name": "test_2188", "title": "A nonlinear modulation strategy for hybrid AC/DC power systems", "abstract": "A nonlinear control strategy to improve transient stability of a multi-machine AC power system with several DC links terminated in the presence of large disturbances is presented. The approach proposed in this paper is based on differential geometric theory, and the HVDC systems are taken as a variable admittance connected at the inverter or rectifier AC bus. After deriving the analytical description of the relationship between the variable admittance and active power flows of each generator, the traditional generator dynamic equations can thus be expressed with the variable admittance of HVDC systems as an additional state variable and changed to an affine form, which is suitable for global linearization method being used to determine its control variable. An important feature of the proposed method is that, the modulated DC power is an adaptive and non-linear function of AC system states, and it can be realized by local feedback and less transmitted data from, adjacent generators. The design procedure is tested on a dual-infeed hybrid AC/DC system", "fulltext": "", "keywords": "rectifier ac bus;local feedback;hvdc systems;active power flows;generator dynamic equations;affine form;dual-infeed hybrid ac/dc system;inverter;adjacent generators;multi-machine ac power system;global linearization method;variable admittance;differential geometric theory;hybrid ac/dc power systems;nonlinear control strategy;dc links;nonlinear modulation strategy;transient stability"}
+{"name": "test_2189", "title": "Mobile computing \"Killer app\" competition", "abstract": "Design competitions offer students an excellent way to gain hands-on experience in engineering and computer science courses. The University of Florida, in partnership with Motorola, has held two mobile computing design competitions. In Spring and Fall 2001, students in Abdelsalam Helal's Mobile Computing class designed killer apps for a Motorola smart phone", "fulltext": "", "keywords": "smart phone;mobile computing;motorola;design competitions"}
+{"name": "test_219", "title": "Firewall card shields data", "abstract": "The SlotShield 3000 firewall on a PCI card saves power and space, but might not offer enough security for large networks", "fulltext": "", "keywords": "pci card;slotshield 3000 firewall;security;large networks"}
+{"name": "test_2190", "title": "Standards for service discovery and delivery", "abstract": "For the past five years, competing industries and standards developers have been hotly pursuing automatic configuration, now coined the broader term service discovery. Jini, Universal Plug and Play (UPnP), Salutation, and Service Location Protocol are among the front-runners in this new race. However, choosing service discovery as the topic of the hour goes beyond the need for plug-and-play solutions or support for the SOHO (small office/home office) user. Service discovery's potential in mobile and pervasive computing environments motivated my choice", "fulltext": "", "keywords": "service discovery;service location protocol;salutation;pervasive computing;mobile computing;jini;universal plug and play"}
+{"name": "test_2191", "title": "The role of speech input in wearable computing", "abstract": "Speech recognition seems like an attractive input mechanism for wearable computers, and as we saw in this magazine's first issue, several companies are promoting products that use limited speech interfaces for specific tasks. However, we must overcome several challenges to using speech recognition in more general contexts, and interface designers must be wary of applying the technology to situations where speech is inappropriate", "fulltext": "", "keywords": "wearable computing;speech recognition;wearable computer;speech interfaces;background noise;mobile speech recognition;speech recognizers;speech input"}
+{"name": "test_2192", "title": "The ubiquitous provisioning of internet services to portable devices", "abstract": "Advances in mobile telecommunications and device miniaturization call for providing both standard and novel location- and context-dependent Internet services to mobile clients. Mobile agents are dynamic, asynchronous, and autonomous, making the MA programming paradigm suitable for developing novel middleware for mobility-enabled services", "fulltext": "", "keywords": "internet services;mobility-enabled services;mobile agents;device miniaturization;middleware;mobile telecommunications;mobile clients"}
+{"name": "test_2193", "title": "Integrating virtual and physical context to support knowledge workers", "abstract": "The Kimura system augments and integrates independent tools into a pervasive computing system that monitors a user's interactions with the computer, an electronic whiteboard, and a variety of networked peripheral devices and data sources", "fulltext": "", "keywords": "kimura system;electronic whiteboard;pervasive computing;knowledge workers;data sources;networked peripheral devices"}
+{"name": "test_2194", "title": "Data management in location-dependent information services", "abstract": "Location-dependent information services have great promise for mobile and pervasive computing environments. They can provide local and nonlocal news, weather, and traffic reports as well as directory services. Before they can be implemented on a large scale, however, several research issues must be addressed", "fulltext": "", "keywords": "news;traffic reports;pervasive computing;mobile computing;data management;wireless networks;weather;location-dependent information services;directory services"}
+{"name": "test_2195", "title": "Modeling privacy control in context-aware systems", "abstract": "Significant complexity issues challenge designers of context-aware systems with privacy control. Information spaces provide a way to organize information, resources, and services around important privacy-relevant contextual factors. In this article, we describe a theoretical model for privacy control in context-aware systems based on a core abstraction of information spaces. We have previously focused on deriving socially based privacy objectives in pervasive computing environments. Building on Ravi Sandhu's four-layer OM-AM (objectives, models, architectures, and mechanisms) idea, we aim to use information spaces to construct a model for privacy control that supports our socially based privacy objectives. We also discuss how we can introduce decentralization, a desirable property for many pervasive computing systems, into our information space model, using unified privacy tagging", "fulltext": "", "keywords": "privacy control;privacy;pervasive computing;smart office;context-aware systems"}
+{"name": "test_2196", "title": "ConChat: a context-aware chat program", "abstract": "ConChat is a context-aware chat program that enriches electronic communication by providing contextual information and resolving potential semantic conflicts between users.ConChat uses contextual information to improve electronic communication. Using contextual cues, users can infer during a conversation what the other person is doing and what is happening in his or her immediate surroundings. For example, if a user learns that the other person is talking with somebody else or is involved in some urgent activity, he or she knows to expect a slower response. Conversely, if the user learns that the other person is sitting in a meeting directly related to the conversation, he or she then knows to respond more quickly. Also, by informing users about the other person's context and tagging potentially ambiguous chat messages, ConChat explores how context can improve electronic communication by reducing semantic conflicts", "fulltext": "", "keywords": "semantic conflicts;context-aware chat program;conchat;contextual cues;contextual information"}
+{"name": "test_2197", "title": "A context-aware decision engine for content adaptation", "abstract": "Building a good content adaptation service for mobile devices poses many challenges. To meet these challenges, this quality-of-service-aware decision engine automatically negotiates for the appropriate adaptation decision for synthesizing an optimal content version", "fulltext": "", "keywords": "content adaptation;decision engine;quality-of-service-aware;mobile devices;optimal content version;adaptation decision"}
+{"name": "test_2198", "title": "Reconfigurable context-sensitive middleware for pervasive computing", "abstract": "Context-sensitive applications need data from sensors, devices, and user actions, and might need ad hoc communication support to dynamically discover new devices and engage in spontaneous information exchange. Reconfigurable Context-Sensitive Middleware facilitates the development and runtime operations of context-sensitive pervasive computing software", "fulltext": "", "keywords": "pervasive computing;context-sensitive applications;middleware;context-sensitive pervasive computing;reconfigurable context-sensitive middleware"}
+{"name": "test_2199", "title": "Activity and location recognition using wearable sensors", "abstract": "Using measured acceleration and angular velocity data gathered through inexpensive, wearable sensors, this dead-reckoning method can determine a user's location, detect transitions between preselected locations, and recognize and classify sitting, standing, and walking behaviors. Experiments demonstrate the proposed method's effectiveness", "fulltext": "", "keywords": "preselected locations;angular velocity;walking;sitting;measured acceleration;dead-reckoning method;user's location;transitions;standing;wearable sensors"}
+{"name": "test_22", "title": "Analyzing the benefits of 300 mm conveyor-based AMHS", "abstract": "While the need for automation in 300 mm fabs is not debated, the form and performance of such automation is still in question. Software simulation that compares conveyor-based continuous flow transport technology to conventional car-based wafer-lot delivery has detailed delivery time and throughput advantages to the former", "fulltext": "", "keywords": "delivery time;software simulation;car-based wafer-lot delivery;semiconductor fab;automated material handling system;throughput;wafer processing;300 mm;conveyor-based continuous flow transport technology"}
+{"name": "test_220", "title": "How to avoid merger pitfalls", "abstract": "Paul Diamond of consultancy KPMG explains why careful IT asset management is crucial to the success of mergers", "fulltext": "", "keywords": "it asset management;mergers;kpmg;consultancy"}
+{"name": "test_2200", "title": "Labscape: a smart environment for the cell biology laboratory", "abstract": "Labscape is a smart environment that we designed to improve the experience of people who work in a cell biology laboratory. Our goal in creating it was to simplify, laboratory work by making information available where it is needed and by collecting and organizing data where and when it is created into a formal representation that others can understand and process. By helping biologists produce a more complete record of their work with less effort, Labscape is designed to foster improved collaboration in conjunction with increased individual efficiency and satisfaction. A user-driven system, although technologically conservative, embraces a central goal of ubiquitous computing: to enhance the ability to perform domain tasks through fluid interaction with computational resources. Smart environments could soon replace the pen and paper commonly used in the laboratory setting", "fulltext": "", "keywords": "ubiquitous computing;labscape;smart environment;biochemical procedure;experimental technologies;laboratory work;cell biology"}
+{"name": "test_223", "title": "Broadcasts keep staff in picture [intranets]", "abstract": "Mark Hawkins, chief operating officer at UK-based streaming media specialist Twofourtv, explains how firms can benefit by linking their corporate intranets to broadcasting technology", "fulltext": "", "keywords": "corporate intranets;broadcasting technology;streaming media;twofourtv"}
+{"name": "test_224", "title": "Java portability put to the test", "abstract": "Sun Microsystems' recently launched Java Verification Program aims to enable companies to assess the cross-platform portability of applications written in Java, and to help software vendors ensure that their solutions can run in heterogenous J2EE application server environments", "fulltext": "", "keywords": "java verification program;cross-platform portability;sun microsystems"}
+{"name": "test_225", "title": "The eyes have it [hotel security]", "abstract": "CCTV systems can help lodging establishments accomplish a range of objectives, from deterring criminals to observing staff interactions with clientele. But pitfalls can arise if the CCTV system has not been properly integrated into the overall hotel security plan. CCTV system designs at new hotel properties are often too sophisticated, too complicated, and too costly, and do not take into consideration the security realities of site management. These problems arise when the professionals designing or installing the system, including architects, construction engineers, integrators, and consultants, are not familiar with a hotel's operating strategies or security standards", "fulltext": "", "keywords": "hotel security;site management;operating strategies;cctv system"}
+{"name": "test_226", "title": "Online masquerade: whose e-mail is it?", "abstract": "E-mails carrying viruses like the recent Klez worm use deceptively simple techniques and known vulnerabilities to spread from one computer to another with ease", "fulltext": "", "keywords": "viruses;vulnerabilities;e-mail;klez worm"}
+{"name": "test_227", "title": "Relativistic constraints on the distinguishability of orthogonal quantum states", "abstract": "The constraints imposed by special relativity on the distinguishability of quantum states are discussed. An explicit expression relating the probability of an error in distinguishing two orthogonal single-photon states to their structure, the time t at which a measurement starts, and the interval of time T elapsed from the start of the measurement until the time at which the outcome is obtained by an observer is given as an example", "fulltext": "", "keywords": "nonrelativistic quantum information theory;special relativity;relativistic constraints;quantum communication channels;time interval;observer;quantum-state distinguishability;orthogonal single-photon states;orthogonal quantum states"}
+{"name": "test_228", "title": "Rapid microwell polymerase chain reaction with subsequent ultrathin-layer gel", "abstract": "electrophoresis of DNA Large-scale genotyping, mapping and expression profiling require affordable, fully automated high-throughput devices enabling rapid, high-performance analysis using minute quantities of reagents. In this paper, we describe a new combination of microwell polymerase chain reaction (PCR) based DNA amplification technique with automated ultrathin-layer gel electrophoresis analysis of the resulting products. This technique decreases the reagent consumption (total reaction volume 0.75-1 mu L), the time requirement of the PCR (15-20 min) and subsequent ultrathin-layer gel electrophoresis based fragment analysis (5 min) by automating the current manual procedure and reducing the human intervention using sample loading robots and computerized real time data analysis. Small aliquots (0.2 mu L) of the submicroliter size PCR reaction were transferred onto loading membranes and analyzed by ultrathin-layer gel electrophoresis which is a novel, high-performance and automated microseparation technique. This system employs integrated scanning laser-induced fluorescence-avalanche photodiode detection and combines the advantages of conventional slab and capillary gel electrophoresis. Visualization of the DNA fragments was accomplished by \"in migratio\" complexation with ethidium bromide during the electrophoresis process also enabling real time imaging and data analysis", "fulltext": "", "keywords": "large-scale genotyping;integrated scanning lif apd detection;automated microseparation;real time imaging;expression profiling;reagent consumption;complexation with ethidium bromide;ultrathin-layer gel electrophoresis;automated electrophoresis analysis;rapid microwell polymerase chain reaction;dna amplification;rapid high-performance analysis;computerized real time data analysis;sample loading robots"}
+{"name": "test_229", "title": "Simple minds [health care IT]", "abstract": "A few things done properly, and soon, is the short-term strategy for the UK NHS IT programme. Can it deliver this time?", "fulltext": "", "keywords": "strategy;uk nhs it programme;health care"}
+{"name": "test_23", "title": "Absorption of long waves by nonresonant parametric microstructures", "abstract": "Using simple acoustical and mechanical models, we consider the conceptual possibility of designing an active absorbing (nonreflecting) coating in the form of a thin layer with small-scale stratification and fast time modulation of parameters. Algorithms for space-time modulation of the controlled-layer structure are studied in detail for a one-dimensional boundary-value problem. These algorithms do not require wave-field measurements, which eliminates the self-excitation problem that is characteristic of active systems. The majority of the considered algorithms of parametric control transform the low-frequency incident wave to high-frequency waves of the technological band for which the waveguiding medium inside the layer is assumed to be opaque (absorbing). The efficient use conditions are found for all the algorithms. It is shown that the absorbing layer can be as thin as desired with respect to the minimum spatial scale of the incident wave and ensures efficient absorption in a wide frequency interval (starting from zero frequency) that is bounded from above only by a finite space-time resolution of the parameter-control operations. The structure of a three-dimensional parametric \"'black\" coating whose efficiency is independent of the angle of incidence of an incoming wave is developed on the basis of the studied one-dimensional problems. The general solution of the problem of diffraction of incident waves from such a coating is obtained. This solution is analyzed in detail for the case of a disk-shaped element", "fulltext": "", "keywords": "fast time modulation;angle of incidence;one-dimensional problems;low-frequency incident wave;acoustical models;small-scale stratification;mechanical models;parametric control;high-frequency waves;controlled-layer structure;absorbing layer;space-time modulation;active absorbing coating;disk-shaped element;one-dimensional boundary-value problem;thin layer;diffraction;waveguiding medium;nonreflecting coating"}
+{"name": "test_230", "title": "2002 in-house fulfillment systems report [publishing]", "abstract": "CM's 13th annual survey of in-house fulfillment system suppliers brings you up to date on the current capabilities of the leading publication software packages", "fulltext": "", "keywords": "survey;in-house fulfillment system;publication software packages;suppliers"}
+{"name": "test_231", "title": "Writing the fulfillment RFP [publishing]", "abstract": "For the uninitiated, writing a request for proposal can seem both mysterious and daunting. Here's a format that will make you look like a pro the first time out", "fulltext": "", "keywords": "fulfillment;request for proposal;publisher"}
+{"name": "test_232", "title": "Library services today and tomorrow: lessons from iLumina, a digital library", "abstract": "for creating and sharing teaching resources This article is based on the emerging experience associated with a digital library of instructional resources, iLumina, in which the contributors of resources and the users of those resources are the same-an open community of instructors in science, mathematics, engineering, and technology. Moreover, it is not the resources, most of which will be distributed across the Internet, but metadata about the resources that is the focus of the central iLumina repository and its support services for resource contributors and users. The distributed iLumina library is a community-sharing library for repurposing and adding value to potentially useful, mostly non-commercial instructional resources that are typically more granular in nature than commercially developed course materials. The experience of developing iLumina is raising a range of issues that have nothing to do with the place and time characteristics of the instructional context in which iLumina instructional resources are created or used. The issues instead have their locus in the democratization of both the professional roles of librarians and the quality assurance mechanisms associated with traditional peer review", "fulltext": "", "keywords": "internet;user issues;interoperability;metadata;quality assurance;information resources;teaching resource sharing;professional roles;reusable software;community-sharing library;digital library;librarians;library automation;ilumina;peer review;distributed systems;standards;academic library"}
+{"name": "test_233", "title": "The Canadian National Site Licensing Project", "abstract": "In January 2000, a consortium of 64 universities in Canada signed a historic inter-institutional agreement that launched the Canadian National Site Licensing Project (CNSLP), a three-year pilot project aimed at bolstering the research and innovation capacity of the country's universities. CNSLP tests the feasibility of licensing, on a national scale, electronic versions of scholarly publications; in its initial phases the project is focused on full-text electronic journals and research databases in science, engineering, health and environmental disciplines. This article provides an overview of the CNSLP initiative, summarizes organizational and licensing accomplishments to date, and offers preliminary observations on challenges and opportunities for subsequent phases of the project", "fulltext": "", "keywords": "academic libraries;information resources;cnslp;research and innovation;inter-institutional agreement;electronic scholarly publications;canadian national site licensing project;full-text electronic journals;research databases"}
+{"name": "test_234", "title": "The UK's National Electronic Site Licensing Initiative (NESLI)", "abstract": "In 1998 the UK created the National Electronic Site Licensing Initiative (NESLI) to increase and improve access to electronic journals and to negotiate license agreements on behalf of academic libraries. The use of a model license agreement and the success of site licensing is discussed. Highlights from an interim evaluation by the Joint Information Systems Committee (JISC) are noted and key issues and questions arising from the evaluation are identified", "fulltext": "", "keywords": "license agreements;icolc;joint information systems committee;national electronic site licensing initiative;jisc;nesli;usage statistics;academic libraries;electronic journals"}
+{"name": "test_235", "title": "The role of CAUL (Council of Australian Libraries) in consortial purchasing", "abstract": "The Council of Australian University Librarians, constituted in 1965 for the purposes of cooperative action and the sharing of information, assumed the role of consortial purchasing agent in 1996 on behalf of its members and associate organisations in Australia and New Zealand. This role continues to grow in tandem with the burgeoning of electronic publication and the acceptance of publishers of the advantages of dealing with consortia. The needs of the Australian university community overlap significantly with consortia in North America and Europe, but important differences are highlighted", "fulltext": "", "keywords": "australia;europe;information sharing;cooperative action;north america;electronic publication;consortial purchasing;new zealand;council of australian university librarians"}
+{"name": "test_236", "title": "Licensing experiences in the Netherlands", "abstract": "The licensing strategy of university libraries in the Netherlands is closely connected with university policies to develop document servers and to make research publications available on the Web. National agreements have been made with major publishers, such as Elsevier Science and Kluwer Academic, to provide access to a wide range of scientific information and to experiment with new ways of providing information and new business models", "fulltext": "", "keywords": "web;kluwer academic;university libraries;document servers;elsevier science;scientific information;licensing strategy;netherlands;business models;university policies;research publications"}
+{"name": "test_237", "title": "International library consortia: positive starts, promising futures", "abstract": "Library consortia have grown substantially over the past ten years, both within North America and globally. As this resurgent consortial movement has begun to mature, and as publishers and vendors have begun to adapt to consortial purchasing models, consortia have expanded their agendas for action. The movement to globalize consortia is traced (including the development and current work of the International Coalition of Library Consortia-ICOLC). A methodology is explored to classify library consortia by articulating the key factors that affect and distinguish consortia as organizations within three major areas: strategic, tactical, and practical (or managerial) concerns. Common consortial values are examined, and a list of known international library consortia is presented", "fulltext": "", "keywords": "consortial purchasing models;international library consortia"}
+{"name": "test_238", "title": "The Open Archives Initiative: realizing simple and effective digital library", "abstract": "interoperability The Open Archives Initiative (OAI) is dedicated to solving problems of digital library interoperability. Its focus has been on defining simple protocols, most recently for the exchange of metadata from archives. The OAI evolved out of a need to increase access to scholarly publications by supporting the creation of interoperable digital libraries. As a first step towards such interoperability, a metadata harvesting protocol was developed to support the streaming of metadata from one repository to another, ultimately to a provider of user services such as browsing, searching, or annotation. This article provides an overview of the mission, philosophy, and technical framework of the OAI", "fulltext": "", "keywords": "user services;exchange metadata;annotation;digital library interoperability;searching;browsing;metadata harvesting protocol;streaming metadata;open archives initiative;protocols;scholarly publications"}
+{"name": "test_239", "title": "Content standards for electronic books: the OEBF publication structure and the", "abstract": "role of public interest participation In the emerging world of electronic publishing how we create, distribute, and read books will be in a large part determined by an underlying framework of content standards that establishes the range of technological opportunities and constraints for publishing and reading systems. But efforts to develop content standards based on sound engineering models must skillfully negotiate competing and sometimes apparently irreconcilable objectives if they are to produce results relevant to the rapidly changing course of technology. The Open eBook Forum's Publication Structure, an XML-based specification for electronic books, is an example of the sort of timely and innovative problem solving required for successful real-world standards development. As a result of this effort, the electronic book industry will not only happen sooner and on a larger scale than it would have otherwise, but the electronic books it produces will be more functional, more interoperable, and more accessible to all readers. Public interest participants have a critical role in this process", "fulltext": "", "keywords": "electronic publishing;oebf publication structure;public interest participation;content standards;xml-based specification;open ebook forum publication structure;electronic books"}
+{"name": "test_24", "title": "Fuzzy modeling based on generalized conjunction operations", "abstract": "An approach to fuzzy modeling based on the tuning of parametric conjunction operations is proposed. First, some methods for the construction of parametric generalized conjunction operations simpler than the known parametric classes of conjunctions are considered and discussed. Second, several examples of function approximation by fuzzy models, based on the tuning of the parameters of the new conjunction operations, are given and their approximation performances are compared with the approaches based on a tuning of membership functions and other approaches proposed in the literature. It is seen that the tuning of the conjunction operations can be used for obtaining fuzzy models with a sufficiently good performance when the tuning of membership functions is not possible or not desirable", "fulltext": "", "keywords": "fuzzy inference systems;generalized conjunction operations;membership functions;function approximation;t-norm;tuning;fuzzy modeling;approximation performances"}
+{"name": "test_240", "title": "Project Euclid and the role of research libraries in scholarly publishing", "abstract": "Project Euclid, a joint electronic journal publishing initiative of Cornell University Library and Duke University Press is discussed in the broader contexts of the changing patterns of scholarly communication and the publishing scene of mathematics. Specific aspects of the project such as partnerships and the creation of an economic model are presented as well as what it takes to be a publisher. Libraries have gained important and relevant experience through the creation and management of digital libraries, but they need to develop further skills if they want to adopt a new role in the life cycle of scholarly communication", "fulltext": "", "keywords": "partnerships;scholarly communication;project euclid;research libraries;duke university press;joint electronic journal publishing initiative;cornell university library;mathematics;economic model;scholarly publishing"}
+{"name": "test_241", "title": "Perspectives on scholarly online books: the Columbia University Online Books", "abstract": "Evaluation Project The Online Books Evaluation Project at Columbia University studied the potential for scholarly online books from 1995 to 1999. Issues included scholars' interest in using online books, the role they might play in scholarly life, features that scholars and librarians sought in online books, the costs of producing and owning print and online books, and potential marketplace arrangements. Scholars see potential for online books to make their research, learning, and teaching more efficient and effective. Librarians see potential to serve their scholars better. Librarians may face lower costs if they can serve their scholars with online books instead of print books. Publishers may be able to offer scholars greater opportunities to use their books while enhancing their own profitability", "fulltext": "", "keywords": "scholarly online books;learning;costs;marketplace arrangements;columbia university online books evaluation project;research;print books"}
+{"name": "test_242", "title": "The California Digital Library and the eScholarship program", "abstract": "The eScholarship program was launched in 2000 to foster faculty-led innovation in scholarly publishing. An initiative of the University of California (UC) and a program of the California Digital Library, the eScholarship program has stimulated significant interest in its short life. Its modest but visible accomplishments garner praise from many quarters, within and beyond the University of California. In perhaps the best indication of its timeliness and momentum, there are more proposals submitted to eScholarship today than the CDL can manage. This early success is due in part to the sheer power of an idea whose time has come, but also to the unique approach on which CDL was founded and the eScholarship initiative was first launched", "fulltext": "", "keywords": "faculty-led innovation;scholarly publishing;university of california;escholarship program;california digital library"}
+{"name": "test_243", "title": "BioOne: a new model for scholarly publishing", "abstract": "This article describes a unique electronic journal publishing project involving the University of Kansas, the Big 12 Plus Libraries Consortium, the American Institute of Biological Sciences, Allen Press, and SPARC, the Scholarly Publishing and Academic Resources Coalition. This partnership has created BioOne, a database of 40 full-text society journals in the biological and environmental sciences, which was launched in April, 2001. The genesis and development of the project is described and financial, technical, and intellectual property models for the project are discussed. Collaborative strategies for the project are described", "fulltext": "", "keywords": "environmental sciences;bioone full-text society journal database;biological sciences;scholarly publishing and academic resources coalition;allen press;university of kansas;electronic journal publishing project;american institute of biological sciences;scholarly publishing model;big 12 plus libraries consortium;technical models;collaborative strategies;financial models;sparc;intellectual property models"}
+{"name": "test_244", "title": "Symbiosis or alienation: advancing the university press/research library", "abstract": "relationship through electronic scholarly communication University presses and research libraries have a long tradition of collaboration. The rapidly expanding electronic scholarly communication environment offers important new opportunities for cooperation and for innovative new models of publishing. The economics of libraries and scholarly publishers have strained the working relationship and promoted debates on important information policy issues. This article explores the context for advancing the partnership, cites examples of joint efforts in electronic publishing, and presents an action plan for working together", "fulltext": "", "keywords": "electronic publishing;university press/research library relationship;economics;information policy;electronic scholarly communication"}
+{"name": "test_245", "title": "Support vector machines model for classification of thermal error in machine", "abstract": "tools This paper addresses a change in the concept of machine tool thermal error prediction which has been hitherto carried out by directly mapping them with the temperature of critical elements on the machine. The model developed herein using support vector machines, a powerful data-training algorithm, seeks to account for the impact of specific operating conditions, in addition to temperature variation, on the effective prediction of thermal errors. Several experiments were conducted to study the error pattern, which was found to change significantly with variation in operating conditions. This model attempts to classify the error based on operating conditions. Once classified, the error is then predicted based on the temperature states. This paper also briefly describes the concept of the implementation of such a comprehensive model along with an on-line error assessment and calibration system in a PC-based open-architecture controller environment, so that it could be employed in regular production for the purpose of periodic calibration of machine tools", "fulltext": "", "keywords": "thermal error classification;online calibration system;pc-based open-architecture controller environment;support vector machines model;error pattern;data-training algorithm;machine tool thermal error prediction;svm;online error assessment;critical element temperature"}
+{"name": "test_246", "title": "Adaptive and efficient mutual exclusion", "abstract": "The paper presents adaptive algorithms for mutual exclusion using only read and write operations; the performance of the algorithms depends only on the point contention, i.e., the number of processes that are concurrently active during algorithm execution (and not on n, the total number of processes). Our algorithm has O(k) remote step complexity and O(log k) system response time, where k is the point contention. The remote step complexity is the maximal number of steps performed by a process where a wait is counted as one step. The system response time is the time interval between subsequent entries to the critical section, where one time unit is the minimal interval in which every active process performs at least one step. The space complexity of this algorithm is O(N log n), where N is the range of process names. We show how to make the space complexity of our algorithm depend solely on n, while preserving the other performance measures of the algorithm", "fulltext": "", "keywords": "performance measures;point contention;adaptive algorithms;system response time;active process;algorithm execution;adaptive mutual exclusion;minimal interval;space complexity;write operations;read operations;critical section;remote step complexity"}
+{"name": "test_247", "title": "The congenial talking philosophers problem in computer networks", "abstract": "Group mutual exclusion occurs naturally in situations where a resource can be shared by processes of the same group, but not by processes of different groups. For example, suppose data is stored in a CD-jukebox. Then, when a disc is loaded for access, users that need data on the disc can concurrently access the disc, while users that need data on a different disc have to wait until the current disc is unloaded. The design issues for group mutual exclusion have been modeled as the Congenial Talking Philosophers problem, and solutions for shared memory models have been proposed (Y.-J. Young, 2000; P. Keane and M. Moir, 1999). As in ordinary mutual exclusion and many other problems in distributed systems, however, techniques developed for shared memory do not necessarily apply to message passing (and vice versa). We investigate solutions for Congenial Talking Philosophers in computer networks where processes communicate by asynchronous message passing. We first present a solution that is a straightforward adaptation from G. Ricart and A.K. Agrawala's (1981) algorithm for ordinary mutual exclusion. Then we show that the simple modification suffers a severe performance degradation that could cause the system to behave as though only one process of a group can be in the critical section at a time. We then present a more efficient and highly concurrent distributed algorithm for the problem, the first such solution in computer networks", "fulltext": "", "keywords": "resource sharing;group mutual exclusion;asynchronous message passing;congenial talking philosophers problem;shared-memory models;distributed systems;process communication;concurrent distributed algorithm;computer networks;critical section"}
+{"name": "test_248", "title": "Universal dynamic synchronous self-stabilization", "abstract": "We prove the existence of a \"universal\" synchronous self-stabilizing protocol, that is, a protocol that allows a distributed system to stabilize to a desired nonreactive behaviour (as long as a protocol stabilizing to that behaviour exists). Previous proposals required drastic increases in asymmetry and knowledge to work, whereas our protocol does not use any additional knowledge, and does not require more symmetry-breaking conditions than available; thus, it is also stabilizing with respect to dynamic changes in the topology. We prove an optimal quiescence time n + D for a synchronous network of n processors and diameter D; the protocol can be made finite state with a negligible loss in quiescence time. Moreover, an optimal D + 1 protocol is given for the case of unique identifiers. As a consequence, we provide an effective proof technique that allows one to show whether self-stabilization to a certain behaviour is possible under a wide range of models", "fulltext": "", "keywords": "anonymous networks;quiescence time;nonreactive behaviour;graph fibrations;synchronous self-stabilizing protocol;optimal protocol;universal dynamic synchronous self-stabilization;synchronous network;finite state;self-stabilization;distributed system;unique identifiers;dynamic changes;optimal quiescence time;proof technique;topology"}
+{"name": "test_249", "title": "Randomized two-process wait-free test-and-set", "abstract": "We present the first explicit, and currently simplest, randomized algorithm for two-process wait-free test-and-set. It is implemented with two 4-valued single writer single reader atomic variables. A test-and-set takes at most 11 expected elementary steps, while a reset takes exactly 1 elementary step. Based on a finite-state analysis, the proofs of correctness and expected length are compressed into one table", "fulltext": "", "keywords": "symmetry breaking;randomized algorithm;asynchronous distributed protocols;wait-free read/write registers;correctness proofs;fault-tolerance;shared memory;finite-state analysis;expected elementary steps;4-valued single writer single reader atomic variables;randomized two-process wait-free test-and-set"}
+{"name": "test_25", "title": "Identification of evolving fuzzy rule-based models", "abstract": "An approach to identification of evolving fuzzy rule-based (eR) models is proposed. eR models implement a method for the noniterative update of both the rule-base structure and parameters by incremental unsupervised learning. The rule-base evolves by adding more informative rules than those that previously formed the model. In addition, existing rules can be replaced with new rules based on ranking using the informative potential of the data. In this way, the rule-base structure is inherited and updated when new informative data become available, rather than being completely retrained. The adaptive nature of these evolving rule-based models, in combination with the highly transparent and compact form of fuzzy rules, makes them a promising candidate for modeling and control of complex processes, competitive to neural networks. The approach has been tested on a benchmark problem and on an air-conditioning component modeling application using data from an installation serving a real building. The results illustrate the viability and efficiency of the approach", "fulltext": "", "keywords": "rule-base structure;identification;performance analysis;incremental unsupervised learning;fault detection;fault diagnostics;behavior modeling;forecasting;adaptive nonlinear control;evolving fuzzy rule-based models;air-conditioning component modeling;ranking;informative potential;robotics;knowledge extraction;complex processes;fuzzy rules;noniterative update"}
+{"name": "test_250", "title": "Aim for the enterprise: Microsoft Project 2002", "abstract": "A long-time favorite of project managers, Microsoft Project 2002 is making its enterprise debut. Its new Web-based collaboration tools and improved scalability with OLAP support make it much easier to manage multiple Web projects with disparate workgroups and budgets", "fulltext": "", "keywords": "web-based collaboration tools;multiple web project management;budgets;microsoft project 2002;scalability;workgroups;olap support"}
+{"name": "test_251", "title": "Central hub for design assets: Adobe GoLive 6.0", "abstract": "Adobe GoLive is a strong contender for Web authoring and publishing. Version 6.0 features a flexible GUI environment combined with a comprehensive workgroup and collaboration server, plus tight integration with leading design tools", "fulltext": "", "keywords": "web authoring;collaboration server;workgroup environment;workgroup server;gui;web publishing environment;application servers;macromedia swf format;adobe golive 6.0;animation and scripting tool;design-centric dynamic content;java;real;flash;livemotion 2.0"}
+{"name": "test_252", "title": "Reaching for five nines: ActiveWatch and SiteSeer", "abstract": "Every Web admin's dream is achieving the fabled five nines-99.999 percent uptime. To attain such availability, your Web site must be down no more than about five minutes per year. Technologies like RAID, clustering, and load balancing make this easier, but to actually track uptime, maintain auditable records, and discover patterns in failures to prevent downtime in the future, you'll need to set up external monitoring. Because your Internet connection is a key factor in measuring uptime, you must monitor your site from the Internet itself, beyond your firewall. You could monitor with custom software on remote hosts, or you could use one of the two reasonably priced services available: Mercury Interactive's ActiveWatch and Freshwater Software's SiteSeer. (Freshwater Software has been a subsidiary of Mercury Interactive for about a year now.) The two services offer a slightly different mix of features and target different markets. Both services offer availability and performance monitoring from several remote locations, alerts to email or pager, and periodic reports. They differ in what's most easily monitored, and in the way you interact with the services", "fulltext": "", "keywords": "remote locations;external monitoring;periodic reports;failure pattern discovery;performance monitoring;email alerts;mercury interactive activewatch;freshwater software siteseer;availability monitoring;internet connection;auditable records;web site;downtime;uptime tracking;pager alerts"}
+{"name": "test_253", "title": "Accessible streaming content", "abstract": "Make sure your Web site is offering quality service to all your users. The article provides some tips and tactics for making your streaming media accessible. Accessibility of streaming content for people with disabilities is often not part of the spec for multimedia projects, but it certainly affects your quality of service. Most of the resources available on Web accessibility deal with HTML. Fortunately, rich media and streaming content developers have a growing number of experts to turn to for information and assistance. The essentials of providing accessible streaming content are simple: blind and visually impaired people need audio to discern important visual detail and interface elements, while deaf and hard-of-hearing people need text to access sound effects and dialog. Actually implementing these principles is quite a challenge, though. Now due to a relatively new law in the US, known as Section 508, dealing with accessibility issues is becoming an essential part of publishing on the Web", "fulltext": "", "keywords": "content providers;html;disabled users;web site;accessible streaming content;web accessibility;streaming media;sound effects;deaf people;quality service;blind people;interface elements;hard-of-hearing people;streaming content developers;united states;accessibility issues;section 508;visual detail;visually impaired people;web publishing;multimedia projects"}
+{"name": "test_254", "title": "What you get is what you see [Web performance monitoring]", "abstract": "To get the best possible performance from your Web infrastructure, you'll need a complete view. Don't neglect the big picture because you're too busy concentrating on details. The increasing complexity of Web sites and the content they provide has consequently increased the complexity of the infrastructure that supports them. But with some knowledge of networking, a handful of useful tools, and the insight that those tools provide, designing and operating for optimal performance and reliability is within your grasp", "fulltext": "", "keywords": "web infrastructure;web performance;networking;web sites;reliability"}
+{"name": "test_255", "title": "The culture of usability", "abstract": "Now that most of us agree that usability testing is an integral investment in site development, it's time to recognize that the standard approach falls short. It is possible to do less work and get better results while spending less money. By bringing usability testing in-house and breaking tests into more manageable sessions, you can vastly improve your online offering without affecting your profit margin", "fulltext": "", "keywords": "web site;usability testing program"}
+{"name": "test_256", "title": "Debugging Web applications", "abstract": "The author considers how one can save time tracking down bugs in Web-based applications by arming yourself with the right tools and programming practices. A wide variety of debugging tools have been written with Web developers in mind", "fulltext": "", "keywords": "programming;web application debugging tools"}
+{"name": "test_257", "title": "Unsafe at any speed?", "abstract": "While Sun prides itself on Java's secure sandbox programming model, Microsoft takes a looser approach. Its C# language incorporates C-like concepts, including pointers and memory management. But is unsafe code really a boon to programmers, or is it a step backward?", "fulltext": "", "keywords": "sun java secure sandbox programming model;c-like concepts;memory management;pointers;microsoft c# language"}
+{"name": "test_258", "title": "Building digital collections at the OAC: current strategies with a view to", "abstract": "future uses Providing a context for the exploration of user defined virtual collections, the article describes the history and recent development of the Online Archive of California (OAC). Stating that usability and user needs are primary factors in digital resource development, issues explored include collaborations to build digital collections, reliance upon professional standards for description and encoding, system architecture, interface design, the need for user tools, and the role of archivists as interpreters in the digital environment", "fulltext": "", "keywords": "user needs;metadata standards;future uses;user studies;digital collections;encoded archival description;archival descriptive standards;online archive of california;oac;professional standards;user tools;system architecture;interface design;digital environment;user defined virtual collections;best practices;history;digital resource"}
+{"name": "test_259", "title": "Nuts and bolts: implementing descriptive standards to enable virtual", "abstract": "collections To date, online archival information systems have relied heavily on legacy finding aids for data to encode and provide to end users, despite fairly strong indications in the archival literature that such legacy data is problematic even as a mediated access tool. Archivists have only just begun to study the utility of archival descriptive data for end users in unmediated settings such as via the Web. The ability of future archival information systems to respond to the expectations and needs of end users is inextricably linked to archivists getting their collective data house in order. The General International Standard Archival Description (ISAD(G)) offers the profession a place from which to start extricating ourselves from the idiosyncracies of our legacy data and description practices", "fulltext": "", "keywords": "archival literature;general international standard archival description;virtual collections;online archive of california;archival information systems;isad;mediated access tool;descriptive standards;archivists;archival descriptive data;online archival information systems;collective data house;legacy data;oac;end users"}
+{"name": "test_26", "title": "Learning weights for the quasi-weighted means", "abstract": "We study the determination of weights for quasi-weighted means (also called quasi-linear means) when a set of examples is given. We consider first a simple case, the learning of weights for weighted means, and then we extend the approach to the more general case of a quasi-weighted mean. We consider the case of a known arbitrary generator f. The paper finishes considering the use of parametric functions that are suitable when the values to aggregate are measure values or ratio", "fulltext": "", "keywords": "learning;quasi-weighted means;measure values;parametric functions;ratio values;quasi-linear means"}
+{"name": "test_260", "title": "Prospecting virtual collections", "abstract": "Virtual collections are a distinct sub-species of digital collections and digital archives. Archivists and curators as archivists and curators do not construct virtual collections; rather they enable virtual collections through the application of descriptive and other standards. Virtual collections are constructed by end users", "fulltext": "", "keywords": "digitization;digital collections;curators;virtual collections;digital archives;descriptive standards;archivists;end users"}
+{"name": "test_261", "title": "Union outreach - a pilgrim's progress", "abstract": "As the American labor movement continues on its path toward reorganization and rejuvenation, archivists are challenged to ensure that the organizational, political, and cultural changes labor unions are experiencing are fully documented. The article examines the need for labor archivists to reach out actively to unions and the problems they face in getting their message across, not only to union leadership but also to union members. Outreach by labor archivists is vital on three critical fronts: the need to secure union funding in support of labor archival programs; obtaining union cooperation in reviewing and amending obsolete deposit agreements; and coordinating efforts with unions to save the records of closing district and local union offices. Attempting to resolve these outstanding issues, labor archivists are pulled between two distinct institutional cultures (one academic in nature, the other enmeshed in a union bureaucracy) and often have their own labor archival programs compromised by the internal dynamics and politics inherent in administering large academic libraries and unions. If labor archivists are to be successful, they must find their collective voice within the labor movement and establish their relevancy to unions during a period of momentous change and restructuring. Moreover, archivists need to give greater thought to designing and implementing outreach programs that bridge the fundamental \"disconnect\" between union bureaucracies and the rank and file, and unions and the public", "fulltext": "", "keywords": "collective voice;labor archival programs;labor unions;labor archivists;american labor movement;political changes;large academic libraries;institutional cultures;union members;union offices;internal dynamics;union bureaucracy;archivists;obsolete deposit agreements;union funding;union cooperation;cultural changes;union leadership"}
+{"name": "test_262", "title": "The impact of EAD adoption on archival programs: a pilot survey of early", "abstract": "implementers The article reports the results of a survey conducted to assess the impact that the implementation of Encoded Archival Description (EAD) has on archival programs. By gathering data related to the funding, staffing, and evaluation of EAD programs and about institutional goals for EAD implementation, the study explored how EAD has affected the operations of the institutions which are utilizing it and the extent to which EAD has become a part of regular repository functions", "fulltext": "", "keywords": "archival programs;ead implementation;staffing;diffusion of innovation;institutional goals;funding;ead programs;encoded archival description;ead adoption;archival descriptive standards;regular repository functions"}
+{"name": "test_263", "title": "K-12 instruction and digital access to archival materials", "abstract": "Providing K-12 schools with digital access to archival materials can strengthen both student learning and archival practice, although it cannot replace direct physical access to records. The article compares a variety of electronic and nonelectronic projects to promote teaching with primary source materials. The article also examines some of the different historiographical and pedagogical approaches used in archival Web sites geared for K-12 instruction, focusing on differences between the educational sites sponsored by the Library of Congress and the National Archives and Records Administration", "fulltext": "", "keywords": "electronic projects;historiographical approaches;student learning;pedagogical approaches;digital access;archival practice;archival materials;k-12 instruction;nonelectronic projects;direct physical access;primary source materials;archival web;national archives and records administration;library of congress;educational sites"}
+{"name": "test_264", "title": "The archival imagination of David Bearman, revisited", "abstract": "Many archivists regard the archival imagination evidenced in the writings of David Bearman as avant-garde. Archivist L. Henry (1998) has sharply criticized Bearman for being irreverent toward the archival theory and practice outlined by classical American archivist T. R. Schellenberg. Although Bearman is sometimes credited (and sometimes berated) for establishing \"a new paradigm\" centered on the archival management of electronic records, his methods and strategies are intended to encompass all forms of record keeping. The article provides general observations on Bearman's archival imagination, lists some of its components, and addresses elements of Henry's critique. Although the long lasting impact of Bearman's imagination upon the archival profession might be questioned, it nonetheless deserves continued consideration by archivists and inclusion as a component of graduate archival education", "fulltext": "", "keywords": "schellenberg;archival management;archival profession;electronic records;record keeping;graduate archival education;archival imagination;classical american archivist;archival theory;david bearman"}
+{"name": "test_265", "title": "Pattern recognition strategies for molecular surfaces. II. Surface", "abstract": "complementarity For pt.I see ibid., vol.23, p.1176-87 (2002). Fuzzy logic based algorithms for the quantitative treatment of complementarity of molecular surfaces are presented. Therein, the overlapping surface patches defined in part I of this series are used. The identification of complementary surface patches can be considered as a first step for the solution of molecular docking problems. Standard technologies can then be used for further optimization of the resulting complex structures. The algorithms are applied to 33 biomolecular complexes. After the optimization with a downhill simplex method, for all these complexes one structure was found, which is in very good agreement with the experimental results", "fulltext": "", "keywords": "downhill simplex method;pattern recognition strategies;biomolecular complexes;optimization;surface complementarity;molecular surfaces;overlapping surface;fuzzy logic based algorithms;quantitative treatment"}
+{"name": "test_266", "title": "Pattern recognition strategies for molecular surfaces. I. Pattern generation", "abstract": "using fuzzy set theory A new method for the characterization of molecules based on the model approach of molecular surfaces is presented. We use the topographical properties of the surface as well as the electrostatic potential, the local lipophilicity/hydrophilicity, and the hydrogen bond density on the surface for characterization. The definition and the calculation method for these properties are reviewed. The surface is segmented into overlapping patches with similar molecular properties. These patches can be used to represent the characteristic local features of the molecule in a way that is beyond the atomistic resolution but can nevertheless be applied for the analysis of partial similarities of different molecules as well as for the identification of molecular complementarity in a very general sense. The patch representation can be used for different applications, which will be demonstrated in subsequent articles", "fulltext": "", "keywords": "pattern generation;fuzzy set theory;patch representation;pattern recognition strategies;local lipophilicity/hydrophilicity;topographical properties;model approach;partial similarities;molecular surfaces;overlapping patches;hydrophilicity;segmented surface;local features;hydrogen bond density;molecular properties;lipophilicity;atomistic resolution;electrostatic potential;molecular complementarity"}
+{"name": "test_267", "title": "An efficient parallel algorithm for the calculation of canonical MP2 energies", "abstract": "We present the parallel version of a previous serial algorithm for the efficient calculation of canonical MP2 energies. It is based on the Saebo-Almlof direct-integral transformation, coupled with an efficient prescreening of the AO integrals. The parallel algorithm avoids synchronization delays by spawning a second set of slaves during the bin-sort prior to the second half-transformation. Results are presented for systems with up to 2000 basis functions. MP2 energies for molecules with 400-500 basis functions can be routinely calculated to microhartree accuracy on a small number of processors (6-8) in a matter of minutes with modern PC-based parallel computers", "fulltext": "", "keywords": "pc-based parallel computers;saebo-almlof direct-integral transformation;parallel algorithm;ao integrals;canonical mp2 energies;synchronization delays;second half-transformation;mp2 energies;microhartree accuracy;basis functions"}
+{"name": "test_268", "title": "A method for correlations analysis of coordinates: applications for molecular", "abstract": "conformations We describe a new method to analyze multiple correlations between subsets of coordinates that represent a sample. The correlation is established only between specific regions of interest at the coordinates. First, the region(s) of interest are selected at each molecular coordinate. Next, a correlation matrix is constructed for the selected regions. The matrix is subject to further analysis, illuminating the multidimensional structural characteristics that exist in the conformational space. The method's abilities are demonstrated in several examples: it is used to analyze the conformational space of complex molecules, it is successfully applied to compare related conformational spaces, and it is used to analyze a diverse set of protein folding trajectories", "fulltext": "", "keywords": "multidimensional structural characteristics;correlation matrix;molecular coordinate;conformational spaces;complex molecules;regions of interest;molecular conformations;multiple correlation analysis;protein folding trajectories"}
+{"name": "test_269", "title": "Genetic algorithm guided selection: variable selection and subset selection", "abstract": "A novel genetic algorithm guided selection method, GAS, has been described. The method utilizes a simple encoding scheme which can represent both compounds and variables used to construct a QSAR/QSPR model. A genetic algorithm is then utilized to simultaneously optimize the encoded variables that include both descriptors and compound subsets. The GAS method generates multiple models each applying to a subset of the compounds. Typically the subsets represent clusters with different chemotypes. Also a procedure based on molecular similarity is presented to determine which model should be applied to a given test set compound. The variable selection method implemented in GAS has been tested and compared using the Selwood data set (n = 31 compounds; nu = 53 descriptors). The results showed that the method is comparable to other published methods. The subset selection method implemented in GAS has been first tested using an artificial data set (n = 100 points; nu = 1 descriptor) to examine its ability to subset data points and second applied to analyze the XLOGP data set (n = 1831 compounds; nu = 126 descriptors). The method is able to correctly identify artificial data points belonging to various subsets. The analysis of the XLOGP data set shows that the subset selection method can be useful in improving a QSAR/QSPR model when the variable selection method fails", "fulltext": "", "keywords": "chemotypes;xlogp data set;compound subsets;qsar/qspr model;subset selection;genetic algorithm guided selection method;clusters;compounds;optimization;multiple models;variables;selwood data set;artificial data points;descriptors;molecular similarity;encoding scheme;variable selection"}
+{"name": "test_27", "title": "A formal model of computing with words", "abstract": "Classical automata are formal models of computing with values. Fuzzy automata are generalizations of classical automata where the knowledge about the system's next state is vague or uncertain. It is worth noting that like classical automata, fuzzy automata can only process strings of input symbols. Therefore, such fuzzy automata are still (abstract) devices for computing with values, although a certain vagueness or uncertainty are involved in the process of computation. We introduce a new kind of fuzzy automata whose inputs are instead strings of fuzzy subsets of the input alphabet. These new fuzzy automata may serve as formal models of computing with words. We establish an extension principle from computing with values to computing with words. This principle indicates that computing with words can be implemented with computing with values with the price of a big amount of extra computations", "fulltext": "", "keywords": "pushdown automata;fuzzy automata;fuzzy subsets;input alphabet;extension principle;formal model;computing with words"}
+{"name": "test_270", "title": "Using molecular equivalence numbers to visually explore structural features", "abstract": "that distinguish chemical libraries A molecular equivalence number (meqnum) classifies a molecule with respect to a class of structural features or topological shapes such as its cyclic system or its set of functional groups. Meqnums can be used to organize molecular structures into nonoverlapping, yet highly relatable classes. We illustrate the construction of some different types of meqnums and present via examples some methods of comparing diverse chemical libraries based on meqnums. In the examples we compare a library which is a random sample from the MDL Drug Data Report (MDDR) with a library which is a random sample from the Available Chemical Directory (ACD). In our analyses, we discover some interesting features of the topological shape of a molecule and its set of functional groups that are strongly linked with compounds occurring in the MDDR but not in the ACD. We also illustrate the utility of molecular equivalence indices in delineating the structural domain over which an SAR conclusion is valid", "fulltext": "", "keywords": "cyclic system;structural features;molecule classification;mdl drug data report;nonoverlapping relatable classes;molecular equivalence indices;molecular equivalence number;chemical libraries;available chemical directory;functional groups;topological shapes"}
+{"name": "test_271", "title": "On the use of neural network ensembles in QSAR and QSPR", "abstract": "Despite their growing popularity among neural network practitioners, ensemble methods have not been widely adopted in structure-activity and structure-property correlation. Neural networks are inherently unstable, in that small changes in the training set and/or training parameters can lead to large changes in their generalization performance. Recent research has shown that by capitalizing on the diversity of the individual models, ensemble techniques can minimize uncertainty and produce more stable and accurate predictors. In this work, we present a critical assessment of the most common ensemble technique known as bootstrap aggregation, or bagging, as applied to QSAR and QSPR. Although aggregation does offer definitive advantages, we demonstrate that bagging may not be the best possible choice and that simpler techniques such as retraining with the full sample can often produce superior results. These findings are rationalized using Krogh and Vedelsby's (1995) decomposition of the generalization error into a term that measures the average generalization performance of the individual networks and a term that measures the diversity among them. For networks that are designed to resist over-fitting, the benefits of aggregation are clear but not overwhelming", "fulltext": "", "keywords": "uncertainty;structure-activity correlation;neural network ensembles;generalization performance;qsar;generalization error decomposition;bootstrap aggregation;retraining;training parameters;training set;bagging;structure-property correlation;qspr"}
+{"name": "test_272", "title": "Median partitioning: a novel method for the selection of representative subsets", "abstract": "from large compound pools A method termed median partitioning (MP) has been developed to select diverse sets of molecules from large compound pools. Unlike many other methods for subset selection, the MP approach does not depend on pairwise comparison of molecules and can therefore be applied to very large compound collections. The only time limiting step is the calculation of molecular descriptors for database compounds. MP employs arrays of property descriptors with little correlation to divide large compound pools into partitions from which representative molecules can be selected. In each of n subsequent steps, a population of molecules is divided into subpopulations above and below the median value of a property descriptor until a desired number of 2/sup n/ partitions are obtained. For descriptor evaluation and selection, an entropy formulation was embedded in a genetic algorithm. MP has been applied to generate a subset of the Available Chemicals Directory, and the results have been compared with cell-based partitioning", "fulltext": "", "keywords": "median partitioning;entropy formulation;time limiting step;molecules;available chemicals directory;property descriptor array;large compound pools;genetic algorithm;representative subset selection;database compounds;molecular descriptors;cell-based partitioning"}
+{"name": "test_273", "title": "Chemical information based scaling of molecular descriptors: a universal", "abstract": "chemical scale for library design and analysis Scaling is a difficult issue for any analysis of chemical properties or molecular topology when disparate descriptors are involved. To compare properties across different data sets, a common scale must be defined. Using several publicly available databases (ACD, CMC, MDDR, and NCI) as a basis, we propose to define chemically meaningful scales for a number of molecular properties and topology descriptors. These chemically derived scaling functions have several advantages. First, it is possible to define chemically relevant scales, greatly simplifying similarity and diversity analyses across data sets. Second, this approach provides a convenient method for setting descriptor boundaries that define chemically reasonable topology spaces. For example, descriptors can be scaled so that compounds with little potential for biological activity, bioavailability, or other drug-like characteristics are easily identified as outliers. We have compiled scaling values for 314 molecular descriptors. In addition the 10th and 90th percentile values for each descriptor have been calculated for use in outlier filtering", "fulltext": "", "keywords": "data sets;chemical properties;library analysis;molecular descriptors;diversity analyses;drug-like characteristics;universal chemical scale;descriptor boundaries;chemical information based scaling;databases;biological activity;library design;bioavailability;molecular topology;outliers;similarity analyses"}
+{"name": "test_274", "title": "MTD-PLS: a PLS-based variant of the MTD method. II. Mapping ligand-receptor", "abstract": "interactions. Enzymatic acetic acid esters hydrolysis The PLS variant of the MTD method (T.I. Oprea et al., SAR QSAR Environ. Res. 2001, 12, 75-92) was applied to a series of 25 acetylcholinesterase hydrolysis substrates. Statistically significant MTD-PLS models (q/sup 2/ between 0.7 and 0.8) are in agreement with previous MTD models, with the advantage that local contributions are understood beyond the occupancy/nonoccupancy interpretation in MTD. A \"chemically intuitive\" approach further forces MTD-PLS coefficients to assume only negative (or zero) values for fragmental volume descriptors and positive (or zero) values for fragmental hydrophobicity descriptors. This further separates the various kinds of local interactions at each vertex of the MTD hypermolecule, making this method suitable for medicinal chemistry synthesis planning", "fulltext": "", "keywords": "additive approach;polarizabilities;chemically intuitive approach;enzymatic acetic acid esters hydrolysis;regression coefficients;intermolecular force categories;statistical model stability;ligand-receptor interactions mapping;medicinal chemistry synthesis planning;hydrogen bonding;fragmental volume descriptors;mtd-pls models;fragmental hydrophobicity descriptors;minimum topological difference method;pls-based variant;hypermolecule;ligand binding affinity;steric misfit;acetylcholinesterase hydrolysis substrates"}
+{"name": "test_275", "title": "Prediction of ultraviolet spectral absorbance using quantitative", "abstract": "structure-property relationships High performance liquid chromatography (HPLC) with ultraviolet (UV) spectrophotometric detection is a common method for analyzing reaction products in organic chemistry. This procedure would benefit from a computational model for predicting the relative response of organic molecules. Models are now reported for the prediction of the integrated UV absorbance for a diverse set of organic compounds using a quantitative structure-property relationship (QSPR) approach. A seven-descriptor linear correlation with a squared correlation coefficient (R/sup 2/) of 0.815 is reported for a data set of 521.compounds. Using the sum of ZINDO oscillator strengths in the integration range as an additional descriptor allowed reduction in the number of descriptors producing a robust model for 460 compounds with five descriptors and a squared correlation coefficient 0.857. The descriptors used in the models are discussed with respect to the physical nature of the UV absorption process", "fulltext": "", "keywords": "mos-f package;high performance liquid chromatography;ultraviolet spectral absorbance prediction;generic quantitation;seven-descriptor linear correlation;organic chemistry;combinatorial chemistry;configuration interaction calculation;codessa program;computational model;relative response;quantitative structure-property relationship;reaction products;squared correlation coefficient;ultraviolet spectrophotometric detection;zindo oscillator strengths"}
+{"name": "test_276", "title": "Assessment of the macrocyclic effect for the complexation of crown-ethers with", "abstract": "alkali cations using the substructural molecular fragments method The Substructural Molecular Fragments method (Solov'ev, V. P.; Varnek, A. A.; Wipff, G. J. Chem. Inf. Comput. Sci. 2000, 40, 847-858) was applied to assess stability constants (logK) of the complexes of crown-ethers, polyethers, and glymes with Na/sup +/, K/sup +/, and Cs/sup +/ in methanol. One hundred forty-seven computational models including different fragment sets coupled with linear or nonlinear fitting equations were applied for the data sets containing 69 (Na/sup +/), 123 (K/sup +/), and 31 (Cs/sup +/) compounds. To account for the \"macrocyclic effect\" for crown-ethers, an additional \"cyclicity\" descriptor was used. \"Predicted\" stability constants both for macrocyclic compounds and for their open-chain analogues are in good agreement with the experimental data reported earlier and with those studied experimentally in this work. The macrocyclic effect as a function of cation and ligand is quantitatively estimated for all studied crown-ethers", "fulltext": "", "keywords": "crown-ethers;data mining;structure-property tool;quantitative structure-properties relationship;different fragment sets;trail program;alkali cations;computational models;thermodynamic parameters;augmented atom;molecular graph decomposition;complexation;macrocyclic effect;cyclicity descriptor;nonlinear fitting equations;linear fitting equations;open-chain analogues;substructural molecular fragments method;statistical parameters;stability constants"}
+{"name": "test_277", "title": "Improving the predicting power of partial order based QSARs through linear", "abstract": "extensions Partial order theory (POT) is an attractive and operationally simple method that allows ordering of compounds, based on selected structural and/or electronic descriptors (modeled order), or based on their end points, e.g., solubility (experimental order). If the modeled order resembles the experimental order, compounds that are not experimentally investigated can be assigned a position in the model that eventually might lead to a prediction of an end-point value. However, in the application of POT in quantitative structure-activity relationship modeling, only the compounds directly comparable to the noninvestigated compounds are applied. To explore the possibilities of improving the methodology, the theory is extended by application of the so-called linear extensions of the model order. The study show that partial ordering combined with linear extensions appears as a promising tool providing probability distribution curves in the range of possible end-point values for compounds not being experimentally investigated", "fulltext": "", "keywords": "quantitative structure-activity relationships;modeled order;most probable linear order;electronic descriptors;linear extensions;end points;predicting power improvement;graphical representation;structural descriptors;solubilities;hasse diagram;organic compounds;combinatorial rule;partially ordered set;partial order theory"}
+{"name": "test_278", "title": "Novel ZE-isomerism descriptors derived from molecular topology and their", "abstract": "application to QSAR analysis We introduce several series of novel ZE-isomerism descriptors derived directly from two-dimensional molecular topology. These descriptors make use of a quantity named ZE-isomerism correction, which is added to the vertex degrees of atoms connected by double bonds in Z and E configurations. This approach is similar to the one described previously for topological chirality descriptors (Golbraikh, A., et al. J. Chem. Inf. Comput. Sci. 2001, 41, 147-158). The ZE-isomerism descriptors include modified molecular connectivity indices, overall Zagreb indices, extended connectivity, overall connectivity, and topological charge indices. They can be either real or complex numbers. Mathematical properties of different subgroups of ZE-isomerism descriptors are discussed. These descriptors circumvent the inability of conventional topological indices to distinguish between Z and E isomers. The applicability of ZE-isomerism descriptors to QSAR analysis is demonstrated in the studies of a series of 131 anticancer agents inhibiting tubulin polymerization", "fulltext": "", "keywords": "anticancer agents;quantitative structure-activity relationship;tubulin polymerization;chemical databases;double bond connected atoms;toxicities;computer-assisted drug design;topological charge indices;complex numbers;overall zagreb indices;qsar analysis;descriptor pharmacophore;molecular graphs;overall connectivity;two-dimensional molecular topology;vertex degrees;combinatorial chemical libraries;ze-isomerism correction;extended connectivity;modified molecular connectivity indices;ze-isomerism descriptors"}
+{"name": "test_279", "title": "Computer mediated communication and university international students", "abstract": "The design for the preliminary study presented was based on the experiences of the international students and faculty members of a small southwest university being surveyed and interviewed. The data collection procedure blends qualitative and quantitative data. A strong consensus was found that supports the study's premise that there is an association between the use of computer mediated communication (CMC) and teaching and learning performance of international students. Both groups believe CMC to be an effective teaching and learning tool by: increasing the frequency and quality of communication between students and instructors; improving language skills through increased writing and communication opportunities; allowing students and instructors to stay current and to compete effectively; providing alternative teaching and learning methods to increase students' confidence in their ability to communicate effectively with peers and instructors; and improving the instructors' pedagogical focus and questioning techniques", "fulltext": "", "keywords": "learning performance;student confidence;faculty members;questioning techniques;peers;qualitative data;data collection procedure;quantitative data;cmc;teaching;university international students;instructors;language skills;pedagogical focus;computer mediated communication;communication opportunities;small southwest university"}
+{"name": "test_28", "title": "Uncertainty bounds and their use in the design of interval type-2 fuzzy logic", "abstract": "systems We derive inner- and outer-bound sets for the type-reduced set of an interval type-2 fuzzy logic system (FLS), based on a new mathematical interpretation of the Karnik-Mendel iterative procedure for computing the type-reduced set. The bound sets can not only provide estimates about the uncertainty contained in the output of an interval type-2 FLS, but can also be used to design an interval type-2 FLS. We demonstrate, by means of a simulation experiment, that the resulting system can operate without type-reduction and can achieve similar performance to one that uses type-reduction. Therefore, our new design method, based on the bound sets, can relieve the computation burden of an interval type-2 FLS during its operation, which makes an interval type-2 FLS useful for real-time applications", "fulltext": "", "keywords": "type-reduced set;karnik-mendel iterative procedure;inner-bound sets;uncertainty bounds;real-time applications;outer-bound sets;interval type-2 fuzzy logic systems;time-series forecasting"}
+{"name": "test_280", "title": "Entrepreneurs in Action: a Web-case model", "abstract": "Much of the traditional schooling in America is built around systems of compliance and control, characteristics which stifle the creative and entrepreneurial instincts of the children who are subjected to these tactics. The article explores a different approach to education, one that involves capturing the interest of the student through the use of problem and project-based instruction delivered via the Internet. Called Entrepreneurs in Action, this program seeks to involve students in a problem at the outset and to promote the learning of traditional subject areas as a process of the problem-solving activities that are undertaken. The program's details are explained, from elementary school through university level courses, and the authors outline their plans to test the efficacy of the program at each level", "fulltext": "", "keywords": "internet;university level courses;problem-solving activities;entrepreneurs in action;traditional subject areas;entrepreneurial instincts;project-based instruction;elementary school;america;web-case model;traditional schooling"}
+{"name": "test_281", "title": "Factors contributing to preservice teachers' discomfort in a Web-based course", "abstract": "structured as an inquiry A report is given of a qualitative emergent design study of a Science, Technology, Society Interaction (STS) Web-enhanced course. Students' discomfort during the pilot test provided insight into the intellectual scaffolding that preservice secondary science teachers needed to optimize their performance when required to develop understanding through open-ended inquiry in a Web environment. Eight factors identified contributed to student discomfort: computer skills, paradigm shifts, trust, time management, thinking about their own thinking, systematic inquiry, self-assessment, and scientific discourse. These factors suggested developing understanding through inquiry by conducting a self-designed, open-ended, systematic inquiry required autonomous learning involving metacognitive skills and time management skills. To the extent in which students either came into the course with this scaffolding, or developed it during the course, they were successful in learning about STS and its relationship to science teaching. Changes in the Web site made to accommodate learners' needs as they surfaced are described", "fulltext": "", "keywords": "thinking;self-assessment;intellectual scaffolding;scientific discourse;systematic inquiry;preservice secondary science teachers;trust;computer skills;autonomous learning;qualitative emergent design study;web-based course;web-enhanced course;open-ended inquiry;student discomfort;science technology society interaction course;metacognitive skills;paradigm shifts;web environment;preservice teacher discomfort;science teaching;sts;time management;time management skills"}
+{"name": "test_282", "title": "Recommendations for implementing Internet inquiry projects", "abstract": "The purpose of the study presented was to provide recommendations to teachers who are interested in implementing Internet inquiry projects. Four classes of ninth- and tenth-grade honors students (N = 100) participated in an Internet inquiry project in which they were presented with an ecology question that required them to make a decision based on information that they gathered, analyzed, and synthesized from the Internet and their textbook. Students then composed papers with a rationale for their decision. Students in one group had access to pre-selected relevant Web sites, access to the entire Internet, and were provided with less online support. Students in the other group had access to only pre-selected relevant Web sites, but were provided with more online support. Two of the most important recommendations were: 1) to provide students with more online support; and 2) to provide students with pre-selected relevant Web sites and allow them to search the Internet for information", "fulltext": "", "keywords": "honors students;internet inquiry projects;online support;ecology question;teachers;pre-selected relevant web sites"}
+{"name": "test_283", "title": "Alien Rescue: a problem-based hypermedia learning environment for middle school", "abstract": "science The article describes an innovative hypermedia product for sixth graders in space science: Alien Rescue. Using a problem-based learning approach that is highly interactive, Alien Rescue engages students in scientific investigations aimed at finding solutions to complex and meaningful problems. Problem-based learning (PBL) is an instructional strategy proven to be effective in medical and business fields, and it is increasingly popular in education. However, using PBL in K-12 classrooms is challenging and requires access to rich knowledge bases and cognitive tools. Alien Rescue is designed to provide such cognitive support for successful use of PBL in sixth-grade classrooms. The design and development of Alien Rescue is guided by current educational research. Research is an integral part of this project. Results of formative evaluation and research studies are being integrated into the development and improvement of the program. Alien Rescue is designed in accordance with the National Science Standards and the Texas Essential Knowledge and Skills (TEKS) for science. So far Alien Rescue has been field-tested by approximately 1400 sixth graders. More use in middle schools is in progress and more research on its use is planned", "fulltext": "", "keywords": "formative evaluation;sixth graders;middle schools;k-12 classrooms;middle school science;pbl;space science;educational research;alien rescue;rich knowledge bases;problem-based hypermedia learning environment;cognitive tools;medical fields;instructional strategy;cognitive support;scientific investigations;business fields"}
+{"name": "test_284", "title": "Project-based learning: teachers learning and using high-tech to preserve Cajun", "abstract": "culture Using project-based learning pedagogy in EdTc 658 Advances in Educational Technology, the author has trained inservice teachers in Southwestern Louisiana with an advanced computer multimedia program called Director(R) (Macromedia, Inc.). The content of this course focused on modeling the project-based learning pedagogy and researching Acadian's traditions and legacy. With the multi-functions of microcomputers, new technologies were used to preserve and celebrate the local culture with superiority of text, graphics, animation, sound, and video. The article describes how several groups of school teachers in the surrounding areas of a regional state university of Louisiana learned computer multimedia using project-based learning and integrated their learning into local cultural heritage", "fulltext": "", "keywords": "local culture;project-based learning pedagogy;acadian traditions;inservice teachers;computer multimedia;new technologies;advanced computer multimedia program;regional state university;cajun culture;project-based learning;teachers;director;edtc 658 advances in educational technology;local cultural heritage;school teachers;macromedia"}
+{"name": "test_285", "title": "Presentation media, information complexity, and learning outcomes", "abstract": "Multimedia computing provides a variety of information presentation modality combinations. Educators have observed that visuals enhance learning which suggests that multimedia presentations should be superior to text-only and text with static pictures in facilitating optimal human information processing and, therefore, comprehension. The article reports the findings from a 3 (text-only, overhead slides, and multimedia presentation)*2 (high and low information complexity) factorial experiment. Subjects read a text script, viewed an acetate overhead slide presentation, or viewed a multimedia presentation depicting the greenhouse effect (low complexity) or photocopier operation (high complexity). Multimedia was superior to text-only and overhead slides for comprehension. Information complexity diminished comprehension and perceived presentation quality. Multimedia was able to reduce the negative impact of information complexity on comprehension and increase the extent of sustained attention to the presentation. These findings suggest that multimedia presentations invoke the use of both the verbal and visual working memory channels resulting in a reduction of the cognitive load imposed by increased information complexity. Moreover, multimedia superiority in facilitating comprehension goes beyond its ability to increase sustained attention; the quality and effectiveness of information processing attained (i.e., use of verbal and visual working memory) is also significant", "fulltext": "", "keywords": "photocopier operation;static pictures;visual working memory channel;cognitive processing limitations;optimal human information processing;multimedia presentations;greenhouse effect;information presentation modality combinations;multimedia superiority;acetate overhead slide presentation;overhead slides;educators;cognitive load;multimedia computing;information complexity;verbal working memory channel;presentation media;human working memory;text script;learning outcomes;sustained attention;multimedia presentation"}
+{"name": "test_286", "title": "Real-time tissue characterization on the basis of in vivo Raman spectra", "abstract": "The application of in vivo Raman spectroscopy for clinical diagnosis demands dedicated software that can perform the necessary signal processing and subsequent (multivariate) data analysis, enabling clinically relevant parameters to be extracted and made available in real time. Here we describe the design and implementation of a software package that allows for real-time signal processing and data analysis of Raman spectra. The design is based on automatic data exchange between Grams, a spectroscopic data acquisition and analysis program, and Matlab, a program designed for array-based calculations. The data analysis software has a modular design providing great flexibility in developing custom data analysis routines for different applications. The implementation is illustrated by a computationally demanding application for the classification of skin spectra using principal component analysis and linear discriminant analysis", "fulltext": "", "keywords": "modular design;real-time tissue characterization;multivariate data analysis;computationally demanding application;grams;array-based calculations;clinical diagnosis;clinically relevant parameters extraction;linear discriminant analysis;matlab;data analysis software;skin spectra classification;dedicated software;automatic data exchange"}
+{"name": "test_287", "title": "Loudspeaker voice-coil inductance losses: circuit models, parameter estimation,", "abstract": "and effect on frequency response When the series resistance is separated and treated as a separate element, it is shown that losses in an inductor require the ratio of the flux to MMF in the core to be frequency dependent. For small-signal operation, this dependence leads to a circuit model composed of a lossless inductor and a resistor in parallel, both of which are frequency dependent. Mathematical expressions for these elements are derived under the assumption that the ratio of core flux to MMF varies as omega /sup n-1/, where n is a constant. A linear regression technique is described for extracting the model parameters from measured data. Experimental data are presented to justify the model for the lossy inductance of a loudspeaker voice-coil. A SPICE example is presented to illustrate the effects of voice-coil inductor losses on the frequency response of a typical driver", "fulltext": "", "keywords": "frequency response;spice;series resistance;lossy inductance;circuit models;core flux to mmf ratio;lossless inductor;parameter estimation;loudspeaker driver;linear regression;loudspeaker voice-coil inductance losses;small-signal operation"}
+{"name": "test_288", "title": "Complexity transitions in global algorithms for sparse linear systems over", "abstract": "finite fields We study the computational complexity of a very basic problem, namely that of finding solutions to a very large set of random linear equations in a finite Galois field modulo q. Using tools from statistical mechanics we are able to identify phase transitions in the structure of the solution space and to connect them to the changes in the performance of a global algorithm, namely Gaussian elimination. Crossing phase boundaries produces a dramatic increase in memory and CPU requirements necessary for the algorithms. In turn, this causes the saturation of the upper bounds for the running time. We illustrate the results on the specific problem of integer factorization, which is of central interest for deciphering messages encrypted with the RSA cryptosystem", "fulltext": "", "keywords": "finite galois field;phase boundaries;complexity transitions;integer factorization;gaussian elimination;statistical mechanics;message deciphering;rsa cryptosystem;sparse linear systems;finite fields;encryption;disordered systems;random linear equations;global algorithms"}
+{"name": "test_289", "title": "Noise effect on memory recall in dynamical neural network model of hippocampus", "abstract": "We investigate some noise effect on a neural network model proposed by Araki and Aihara (1998) for the memory recall of dynamical patterns in the hippocampus and the entorhinal cortex; the noise effect is important since the release of transmitters at synaptic clefts, the operation of gate of ion channels and so on are known as stochastic phenomena. We consider two kinds of noise effect due to a deterministic noise and a stochastic noise. By numerical simulations, we find that reasonable values of noise give better performance on the memory recall of dynamical patterns. Furthermore we investigate the effect of the strength of external inputs on the memory recall", "fulltext": "", "keywords": "hippocampus;dynamical neural network model;entorhinal cortex;synaptic strength;dynamical patterns;numerical simulations;memory recall;inhibitory connection;gate of ion channels;stochastic phenomena;brain functions;deterministic noise;synaptic clefts;stochastic noise;noise effect"}
+{"name": "test_29", "title": "Fuzzy polynomial neural networks: hybrid architectures of fuzzy modeling", "abstract": "We introduce a concept of fuzzy polynomial neural networks (FPNNs), a hybrid modeling architecture combining polynomial neural networks (PNNs) and fuzzy neural networks (FNNs). The development of the FPNNs dwells on the technologies of computational intelligence (CI), namely fuzzy sets, neural networks, and genetic algorithms. The structure of the FPNN results from a synergistic usage of FNN and PNN. FNNs contribute to the formation of the premise part of the rule-based structure of the FPNN. The consequence part of the FPNN is designed using PNNs. The structure of the PNN is not fixed in advance as it usually takes place in the case of conventional neural networks, but becomes organized dynamically to meet the required approximation error. We exploit a group method of data handling (GMDH) to produce this dynamic topology of the network. The performance of the FPNN is quantified through experimentation that exploits standard data already used in fuzzy modeling. The obtained experimental results reveal that the proposed networks exhibit high accuracy and generalization capabilities in comparison to other similar fuzzy models", "fulltext": "", "keywords": "gmdh;membership functions;hybrid architectures;momentum coefficients;fuzzy polynomial neural networks;learning;fuzzy sets;dynamic topology;standard backpropagation;genetic algorithms;learning rates;group method of data handling;computational intelligence;genetic optimization;highly nonlinear rule-based models;fuzzy modeling;fuzzy inference method"}
+{"name": "test_290", "title": "MEMS applications in computer disk drive dual-stage servo systems", "abstract": "We present a decoupled discrete time pole placement design method, which can be combined with a self-tuning scheme to compensate variations in the microactuator's (MA's) resonance mode. Section I of the paper describes the design and fabrication of a prototype microactuator with an integrated gimbal structure. Section II presents a decoupled track-following controller design and a self-tuning control scheme to compensate for the MA's resonance mode variations", "fulltext": "", "keywords": "mems;microactuator;computer disk drive dual-stage servo systems;hard disk drives;electrostatic design;fabrication process;self-tuning scheme;decoupled discrete time pole placement design method;track-following controller design;servo control"}
+{"name": "test_291", "title": "Nuclear magnetic resonance molecular photography", "abstract": "A procedure is described for storing a two-dimensional (2D) pattern consisting of 32*32=1024 bits in a spin state of a molecular system and then retrieving the stored information as a stack of nuclear magnetic resonance spectra. The system used is a nematic liquid crystal, the protons of which act as spin clusters with strong intramolecular interactions. The technique used is a programmable multifrequency irradiation with low amplitude. When it is applied to the liquid crystal, a large number of coherent long-lived /sup 1/H response signals can be excited, resulting in a spectrum showing many sharp peaks with controllable frequencies and amplitudes. The spectral resolution is enhanced by using a second weak pulse with a 90 degrees phase shift, so that the 1024 bits of information can be retrieved as a set of well-resolved pseudo-2D spectra reproducing the input pattern", "fulltext": "", "keywords": "second weak pulse;spin echoes;molecular system spin state;nmr molecular photography;1024 bit;2d pattern;low amplitude;coupled spins;spin clusters;dipole-dipole interactions;nematic liquid crystal;high-content molecular information processing;spin dynamics;hilbert spaces;spin-locking;pseudo-2d spectra;information storage;proton spin;strong intramolecular interactions;spectral resolution;coherent long-lived /sup 1/h response signals;programmable multifrequency irradiation"}
+{"name": "test_292", "title": "Novel active noise-reducing headset using earshell vibration control", "abstract": "Active noise-reducing (ANR) headsets are available commercially in applications varying from aviation communication to consumer audio. Current ANR systems use passive attenuation at high frequencies and loudspeaker-based active noise control at low frequencies to achieve broadband noise reduction. This paper presents a novel ANR headset in which the external noise transmitted to the user's ear via earshell vibration is reduced by controlling the vibration of the earshell using force actuators acting against an inertial mass or the earshell headband. Model-based theoretical analysis using velocity feedback control showed that current piezoelectric actuators provide sufficient force but require lower stiffness for improved low-frequency performance. Control simulations based on experimental data from a laboratory headset showed that good performance can potentially be achieved in practice by a robust feedback controller, while a single-frequency real-time control experiment verified that noise reduction can be achieved using earshell vibration control", "fulltext": "", "keywords": "piezoelectric actuators;inertial mass;active noise-reducing headset;stiffness;single-frequency real-time control;force actuators;consumer audio;robust feedback controller;passive attenuation;broadband noise reduction;aviation communication;earshell vibration control;external noise transmission;velocity feedback control"}
+{"name": "test_293", "title": "Theoretical and experimental investigations on coherence of traffic noise", "abstract": "transmission through an open window into a rectangular room in high-rise buildings A method for theoretically calculating the coherence between sound pressure inside a rectangular room in a high-rise building and that outside the open window of the room is proposed. The traffic noise transmitted into a room is generally dominated by low-frequency components, to which active noise control (ANC) technology may find an application. However, good coherence between reference and error signals is essential for an effective noise reduction and should be checked first. Based on traffic noise prediction methods, wave theory, and mode coupling theory, the results of this paper enabled one to determine the potentials and limitations of ANC used to reduce such a transmission. Experimental coherence results are shown for two similar, empty rectangular rooms located on the 17th and 30th floors of a 34 floor high-rise building. The calculated results with the proposed method are generally in good agreement with the experimental results and demonstrate the usefulness of the method for predicting the coherence", "fulltext": "", "keywords": "open window;traffic noise transmission;wave theory;active noise control technology;rectangular room;mode coupling theory;high-rise buildings;traffic noise prediction methods;sound pressure;low-frequency components"}
+{"name": "test_294", "title": "High-density remote storage: the Ohio State University Libraries depository", "abstract": "The article describes a high-density off-site book storage facility operated by the Ohio State University Libraries. Opened in 1995, it has the capacity to house nearly 1.5 million items in only 9000 square feet by shelving books by size on 30-foot tall shelving. A sophisticated climate control system extends the life of stored materials up to 12 times. An online catalog record for each item informs patrons that the item is located in a remote location. Regular courier deliveries from the storage facility bring requested materials to patrons with minimal delay", "fulltext": "", "keywords": "climate control system;shelving;high-density off-site book storage facility;remote location;patrons;courier deliveries;high-density remote storage;online catalog record;circulation;stored materials;ohio state university libraries"}
+{"name": "test_295", "title": "Hours of operation and service in academic libraries: toward a national", "abstract": "standard In an effort toward establishing a standard for academic library hours, the article surveys and compares hours of operation and service for ARL libraries and IPEDS survey respondents. The article ranks the ARL (Association for Research Libraries) libraries according to hours of operation and reference hours and then briefly discusses such issues as libraries offering twenty-four access and factors affecting service hour decisions", "fulltext": "", "keywords": "arl libraries;operation/service hours;academic library hours;association for research libraries;integrated post secondary education data system;ipeds survey respondents"}
+{"name": "test_296", "title": "Using the Web to answer legal reference questions", "abstract": "In an effort to help non-law librarians with basic legal reference questions, the author highlights three basic legal Web sites and outlines useful subject-specific Web sites that focus on statutes and regulations, case law and attorney directories", "fulltext": "", "keywords": "legal reference questions;attorney directories;nonlaw librarians;case law;world wide web"}
+{"name": "test_297", "title": "The service side of systems librarianship", "abstract": "Describes the role of a systems librarian at a small academic library. Although online catalogs and the Internet are making library accessibility more convenient, the need for library buildings and professionals has not diminished. Typical duties of a systems librarian and the effects of new technology on librarianship are discussed. Services provided to other constituencies on campus and the blurring relationship between the library and computer services are also presented", "fulltext": "", "keywords": "internet;systems librarianship;online catalogs;small academic library;service side"}
+{"name": "test_298", "title": "Defining electronic librarianship: a content analysis of job advertisements", "abstract": "Advances in technology create dramatic changes within libraries. The complex issues surrounding this new electronic, end-user environment have major ramifications and require expert knowledge. Electronic services librarians and electronic resources librarians are two specialized titles that have recently emerged within the field of librarianship to fill this niche. Job advertisements listed in American Libraries from January 1989 to December 1998 were examined to identify responsibilities, qualifications, organizational and salary information relating to the newly emerging role of electronic librarian", "fulltext": "", "keywords": "salary information;responsibilities;american libraries;electronic resources librarians;electronic librarianship;organizational information;electronic services librarians;job advertisements;electronic end-user environment;content analysis;qualifications"}
+{"name": "test_299", "title": "Customer in-reach and library strategic systems: the case of ILLiad", "abstract": "Libraries have walls. Recognizing this fact, the Interlibrary Loan Department at Virginia Tech is creating systems and services that enable our customers to reach past our walls at anytime from anywhere. Customer in-reach enables Virginia Tech faculty, students, and staff anywhere in the world to obtain information and services heretofore available only to our on-campus customers. ILLiad, Virginia Tech's interlibrary borrowing system, is the library strategic system that attains this goal. The principles that guided development of ILLiad are widely applicable", "fulltext": "", "keywords": "interlibrary loan department;customer in-reach;library strategic systems;virginia tech;interlibrary borrowing system;illiad"}
+{"name": "test_3", "title": "NuVox shows staying power with new cash, new market", "abstract": "Who says you can't raise cash in today's telecom market? NuVox Communications positions itself for the long run with $78.5 million in funding and a new credit facility", "fulltext": "", "keywords": "investors;nuvox communications;telecom;competitive carrier market"}
+{"name": "test_30", "title": "Improvements and critique on Sugeno's and Yasukawa's qualitative modeling", "abstract": "Investigates Sugeno's and Yasukawa's (1993) qualitative fuzzy modeling approach. We propose some easily implementable solutions for the unclear details of the original paper, such as trapezoid approximation of membership functions, rule creation from sample data points, and selection of important variables. We further suggest an improved parameter identification algorithm to be applied instead of the original one. These details are crucial concerning the method's performance as it is shown in a comparative analysis and helps to improve the accuracy of the built-up model. Finally, we propose a possible further rule base reduction which can be applied successfully in certain cases. This improvement reduces the time requirement of the method by up to 16% in our experiments", "fulltext": "", "keywords": "qualitative modeling;membership functions;rule base reduction;sugeno-yasukawa method;parameter identification algorithm;rule creation;trapezoid approximation;fuzzy modeling"}
+{"name": "test_300", "title": "The plot thins: thin-client computer systems and academic libraries", "abstract": "The few libraries that have tried thin client architectures have noted a number of compelling reasons to do so. For starters, thin client devices are far less expensive than most PCs. More importantly, thin client computing devices are believed to be far less expensive to manage and support than traditional PCs", "fulltext": "", "keywords": "academic libraries;thin-client computer systems"}
+{"name": "test_301", "title": "Academic libraries and community: making the connection", "abstract": "I explore the theme of academic libraries serving and reaching out to the broader community. I highlight interesting projects reported on in the literature (such as the Through Our Parents' Eyes project) and report on others. I look at challenges to community partnerships and recommendations for making them succeed. Although I focus on links with the broader community, I also took at methods for increasing cooperation among various units on campus, so that the needs of campus community groups-such as distance education students or disabled students-are effectively addressed. Though academic libraries are my focus, we can learn a lot from the community building efforts of public libraries", "fulltext": "", "keywords": "disabled students;community partnerships;campus community groups;distance education students;public libraries;academic libraries"}
+{"name": "test_302", "title": "Using Internet search engines to estimate word frequency", "abstract": "The present research investigated Internet search engines as a rapid, cost-effective alternative for estimating word frequencies. Frequency estimates for 382 words were obtained and compared across four methods: (1) Internet search engines, (2) the Kucera and Francis (1967) analysis of a traditional linguistic corpus, (3) the CELEX English linguistic database (Baayen et al., 1995), and (4) participant ratings of familiarity. The results showed that Internet search engines produced frequency estimates that were highly consistent with those reported by Kucera and Francis and those calculated from CELEX, highly consistent across search engines, and very reliable over a 6 month period of time. Additional results suggested that Internet search engines are an excellent option when traditional word frequency analyses do not contain the necessary data (e.g., estimates for forenames and slang). In contrast, participants' familiarity judgments did not correspond well with the more objective estimates of word frequency. Researchers are advised to use search engines with large databases (e.g., AltaVista) to ensure the greatest representativeness of the frequency estimates", "fulltext": "", "keywords": "linguistic corpus;word frequency estimation;internet search engines;celex english linguistic database;large databases;participant familiarity ratings"}
+{"name": "test_303", "title": "Visual-word identification thresholds for the 260 fragmented words of the", "abstract": "Snodgrass and Vanderwart pictures in Spanish Word difficulty varies from language to language; therefore, normative data of verbal stimuli cannot be imported directly from another language. We present mean identification thresholds for the 260 screen-fragmented words corresponding to the total set of Snodgrass and Vanderwart (1980) pictures. Individual words were fragmented in eight levels using Turbo Pascal, and the resulting program was implemented on a PC microcomputer. The words were presented individually to a group of 40 Spanish observers, using a controlled time procedure. An unspecific learning effect was found showing that performance improved due to practice with the task. Finally, of the 11 psycholinguistic variables that previous researchers have shown to affect word identification, only imagery accounted for a significant amount of variance in the threshold values", "fulltext": "", "keywords": "mean identification thresholds;unspecific learning effect;word identification;turbo pascal;psycholinguistic variables;controlled time procedure;visual-word identification thresholds;spanish;pc microcomputer;snodgrass and vanderwart pictures;fragmented words;screen-fragmented words;verbal stimuli;word difficulty"}
+{"name": "test_304", "title": "A Web-accessible database of characteristics of the 1,945 basic Japanese kanji", "abstract": "In 1981, the Japanese government published a list of the 1,945 basic Japanese kanji (Jooyoo Kanji-hyo), including specifications of pronunciation. This list was established as the standard for kanji usage in print. The database for 1,945 basic Japanese kanji provides 30 cells that explain in detail the various characteristics of kanji. Means, standard deviations, distributions, and information related to previous research concerning these kanji are provided in this paper. The database is saved as a Microsoft Excel 2000 file for Windows. This kanji database is accessible on the Web site of the Oxford Text Archive, Oxford University (http://ota.ahds.ac.uk). Using this database, researchers and educators will be able to conduct planned experiments and organize classroom instruction on the basis of the known characteristics of selected kanji", "fulltext": "", "keywords": "standard deviations;web-accessible database;microsoft excel 2000 file for windows;kanji usage print;classroom instruction;oxford text archive web site;means;cells;jooyoo kanji-hyo;distributions;basic japanese kanji;pronunciation"}
+{"name": "test_305", "title": "Full-screen ultrafast video modes over-clocked by simple VESA routines and", "abstract": "registers reprogramming under MS-DOS Fast full-screen presentation of stimuli is necessary in psychological research. Although Spitczok von Brisinski (1994) introduced a method that achieved ultrafast display by reprogramming the registers, he could not produce an acceptable full-screen display. In this report, the author introduces a new method combining VESA routine calling with register reprogramming that can yield a display at 640 * 480 resolution, with a refresh rate of about 150 Hz", "fulltext": "", "keywords": "register reprogramming;ms-dos;full-screen ultrafast video modes;fast full-screen stimuli presentation;vesa routine calling;psychological research"}
+{"name": "test_306", "title": "Measuring keyboard response delays by comparing keyboard and joystick inputs", "abstract": "The response characteristics of PC keyboards have to be identified when they are used as response devices in psychological experiments. In the past, the proposed method has been to check the characteristics independently by means of external measurement equipment. However, with the availability of different PC models and the rapid pace of model change, there is an urgent need for the development of convenient and accurate methods of checking. The method proposed consists of raising the precision of the PC's clock to the microsecond level and using a joystick connected to the MIDI terminal of a sound board to give the PC an independent timing function. Statistical processing of the data provided by this method makes it possible to estimate accurately the keyboard scanning interval time and the average keyboard delay time. The results showed that measured keyboard delay times varied from 11 to 73 msec, depending on the keyboard model, with most values being less than 30 msec", "fulltext": "", "keywords": "independent timing function;psychological experiments;checking;statistical data processing;pc keyboards;model change;joystick inputs;midi terminal;keyboard inputs;sound board;average keyboard delay time;pc clock precision;keyboard scanning interval time;keyboard response delay measurement"}
+{"name": "test_307", "title": "Computer program to generate operant schedules", "abstract": "A computer program for programming schedules of reinforcement is described. Students can use the program to experience schedules of reinforcement that are typically used with nonhuman subjects. Accumulative recording of a student's response can be shown on the screen and/or printed with the computer's printer. The program can also be used to program operant schedules for animal subjects. The program was tested with human subjects experiencing fixed ratio, variable ratio, fixed interval, and variable interval schedules. Performance for human subjects on a given schedule was similar to performance for nonhuman subjects on the same schedule", "fulltext": "", "keywords": "fixed interval schedules;variable interval schedules;reinforcement schedule programming;animal subjects;fixed ratio schedules;computer program;operant schedule generation;nonhuman subjects;human subjects;variable ratio schedules;cumulative student response recording"}
+{"name": "test_308", "title": "On-line Homework/Quiz/Exam applet: freely available Java software for", "abstract": "evaluating performance on line The Homework/Quiz/Exam applet is a freely available Java program that can be used to evaluate student performance on line for any content authored by a teacher. It has database connectivity so that student scores are automatically recorded. It allows several different types of questions. Each question can be linked to images and detailed story problems. Three levels of feedback are provided to student responses. It allows teachers to randomize the sequence of questions and to randomize which of several options is the correct answer in multiple-choice questions. The creation and editing of questions involves menu selections, button presses, and the typing of content; no programming knowledge is required. The code is open source in order to encourage modifications that will meet individual pedagogical needs", "fulltext": "", "keywords": "database connectivity;feedback;detailed story problems;images;freely available java software;online homework/quiz/exam applet;teacher authored content;menu selections;multiple-choice questions;button presses;question creation;individual pedagogical needs;randomized question sequence;automatic student score recording;typing content;online student performance evaluation;question editing"}
+{"name": "test_309", "title": "WEXTOR: a Web-based tool for generating and visualizing experimental designs", "abstract": "and procedures WEXTOR is a Javascript-based experiment generator and teaching tool on the World Wide Web that can be used to design laboratory and Web experiments in a guided step-by-step process. It dynamically creates the customized Web pages and Javascripts needed for the experimental procedure and provides experimenters with a print-ready visual display of their experimental design. WEXTOR flexibly supports complete and incomplete factorial designs with between-subjects, within-subjects, and quasi-experimental factors, as well as mixed designs. The software implements client-side response time measurement and contains a content wizard for creating interactive materials, as well as dependent measures (graphical scales, multiple-choice items, etc.), on the experiment pages. However, it does not aim to replace a full-fledged HTML editor. Several methodological features specifically needed in Web experimental design have been implemented in the Web-based tool and are described in this paper. WEXTOR is platform independent. The created Web pages can be uploaded to any type of Web server in which data may be recorded in logfiles or via a database. The current version of WEXTOR is freely available for educational and noncommercial purposes. Its Web address is http://www.genpsylab.unizh.ch/wextor/index.html", "fulltext": "", "keywords": "logfiles;web server;javascript-based experiment generator;content wizard;factorial designs;experimental design visualization;database;html;print-ready visual display;client-side response time measurement;web-based tool;teaching tool;world wide web;customized web pages;wextor;free software"}
+{"name": "test_31", "title": "Adaptive neural/fuzzy control for interpolated nonlinear systems", "abstract": "Adaptive control for nonlinear time-varying systems is of both theoretical and practical importance. We propose an adaptive control methodology for a class of nonlinear systems with a time-varying structure. This class of systems is composed of interpolations of nonlinear subsystems which are input-output feedback linearizable. Both indirect and direct adaptive control methods are developed, where the spatially localized models (in the form of Takagi-Sugeno fuzzy systems or radial basis function neural networks) are used as online approximators to learn the unknown dynamics of the system. Without assumptions on rate of change of system dynamics, the proposed adaptive control methods guarantee that all internal signals of the system are bounded and the tracking error is asymptotically stable. The performance of the adaptive controller is demonstrated using a jet engine control problem", "fulltext": "", "keywords": "interpolated nonlinear systems;online approximators;jet engine control;input-output feedback linearizable systems;takagi-sugeno fuzzy systems;tracking error;adaptive neural/fuzzy control;spatially localized models;radial basis function neural networks;unknown dynamics;time-varying systems;stability analysis;indirect control;direct control"}
+{"name": "test_310", "title": "ePsych: interactive demonstrations and experiments in psychology", "abstract": "ePsych (http://epsych.msstate.edu), a new Web site currently under active development, is intended to teach students about the discipline of psychology. The site presumes little prior knowledge about the field and so may be used in introductory classes, but it incorporates sufficient depth of coverage to be useful in more advanced classes as well. Numerous interactive and dynamic elements are incorporated into various modules, orientations, and guidebooks. These elements include Java-based experiments and demonstrations, video clips, and animated diagrams. Rapid access to all material is provided through a layer-based navigation system that allows users to visit various \"Worlds of the Mind.\" Active learning is encouraged, by challenging students with puzzles and problems and by providing the opportunity to \"dig deeper\" to learn more about the phenomena at hand", "fulltext": "", "keywords": "epsych;animated diagrams;psychology experiments;interactive demonstrations;active learning;layer-based navigation system;worlds of the mind;web site;video clips;teaching;java-based experiments"}
+{"name": "test_311", "title": "Information architecture without internal theory: an inductive design process", "abstract": "This article suggests that Information Architecture (IA) design is primarily an inductive process. Although top-level goals, user attributes and available content are periodically considered, the process involves bottom-up design activities. IA is inductive partly because it lacks internal theory, and partly because it is an activity that supports emergent phenomena (user experiences) from basic design components. The nature of IA design is well described by Constructive Induction (CI), a design process that involves locating the best representational framework for the design problem, identifying a solution within that framework and translating it back to the design problem at hand. The future of IA, if it remains inductive or develops a body of theory (or both), is considered", "fulltext": "", "keywords": "inductive design process;constructive induction;information architecture design;user experiences;bottom-up design activities;internal theory;emergent phenomena"}
+{"name": "test_312", "title": "Information architecture for the Web: The IA matrix approach to designing", "abstract": "children's portals The article presents a matrix that can serve as a tool for designing the information architecture of a Web portal in a logical and systematic manner. The information architect begins by inputting the portal's objective, target user, and target content. The matrix then determines the most appropriate information architecture attributes for the portal by filling in the Applied Information Architecture portion of the matrix. The article discusses how the matrix works using the example of a children's Web portal to provide access to museum information", "fulltext": "", "keywords": "information architecture;target user;target content;children's web portal;museum information"}
+{"name": "test_313", "title": "Information architecture: notes toward a new curriculum", "abstract": "There are signs that information architecture is coalescing into a field of professional practice. However, if it is to become a profession, it must develop a means of educating new information architects. Lessons from other fields suggest that professional education typically evolves along a predictable path, from apprenticeships to trade schools to college- and university-level education. Information architecture education may develop more quickly to meet the growing demands of the information society. Several pedagogical approaches employed in other fields may be adopted for information architecture education, as long as the resulting curricula provide an interdisciplinary approach and balance instruction in technical and design skills with consideration of theoretical concepts. Key content areas are information organization, graphic. design, computer science, user and usability studies, and communication. Certain logistics must be worked out, including where information architecture studies should be housed and what kinds of degrees should be offered and at what levels. The successful information architecture curriculum will be flexible and adaptable in order to meet the changing needs of students and the marketplace", "fulltext": "", "keywords": "graphic design;professional education;professional practice;information organization;pedagogical approaches;usability studies;computer science;information architects;information architecture education"}
+{"name": "test_314", "title": "Information architecture in JASIST: just where did we come from?", "abstract": "The emergence of Information Architecture within the information systems world has been simultaneously drawn out yet rapid. Those with an eye on history are quick to point to Wurman's 1976 use of the term \"architecture of information,\" but it has only been in the last 2 years that IA has become the source of sufficient interest for people to label themselves professionally as Information Architects. The impetus for this recent emergence of IA can be traced to a historical summit, supported by ASIS&T in May 2000 at Boston. It was here that several hundred of us gathered to thrash out the questions of just what IA was and what this new field might become. At the time of the summit, invited to present a short talk on my return journey from the annual ACM SIGCHI conference, I entered the summit expecting little and convinced that IA was nothing new. I left 2 days later refreshed, not just by the enthusiasm of the attendees for this term but by IA's potential to unify the disparate perspectives and orientations of professionals from a range of disciplines. It was at this summit that the idea for the special issue took root. I proposed the idea to Don Kraft, hoping he would find someone else to run with it. AS luck would have it, I ended up taking charge of it myself, with initial support from David Blair. From the suggestion to the finished product-has been the best part of 2 years, and in that time more than 50 volunteers reviewed over 20 submissions", "fulltext": "", "keywords": "controlled vocabularies;information architecture;metadata fields;qualified information architect;cd-rom;web sites;information systems"}
+{"name": "test_315", "title": "The impact of the Internet on public library use: an analysis of the current", "abstract": "consumer market for library and Internet services The potential impact of the Internet on the public's demand for the services and resources of public libraries is an issue of critical importance. The research reported in this article provides baseline data concerning the evolving relationship between the public's use of the library and its use of the Internet. The authors developed a consumer model of the American adult market for information services and resources, segmented by use (or nonuse) of the public library and by access (or lack of access) to, and use (or nonuse) of, the Internet. A national Random Digit Dialing telephone survey collected data to estimate the size of each of six market segments, and to describe their usage choices between the public library and the Internet. The analyses presented in this article provide estimates of the size and demographics of each of the market segments; describe why people are currently using the public library and the Internet; identify the decision criteria people use in their choices of which provider to use; identify areas in which libraries and the Internet appear to be competing and areas in which they appear to be complementary; and identify reasons why people choose not to use the public library and/or the Internet. The data suggest that some differentiation between the library and the Internet is taking place, which may very well have an impact on consumer choices between the two. Longitudinal research is necessary to fully reveal trends in these usage choices, which have implications for all types of libraries in planning and policy development", "fulltext": "", "keywords": "national random digit dialing telephone survey;internet;consumer model;american adult market;baseline data;decision criteria;longitudinal research;public library;public libraries"}
+{"name": "test_316", "title": "Duality revisited: construction of fractional frequency distributions based on", "abstract": "two dual Lotka laws Fractional frequency distributions of, for example, authors with a certain (fractional) number of papers are very irregular, and therefore not easy to model or to explain. The article gives a first attempt to this by as suming two simple Lotka laws (with exponent 2): one for the number of authors with n papers (total count here) and one for the number of papers with n authors, n in N. Based on an earlier made convolution model of Egghe, interpreted and reworked now for discrete scores, we are able to produce theoretical fractional frequency distributions with only one parameter, which are in very close agreement with the practical ones as found in a large dataset produced earlier by Rao (1995). The article also shows that (irregular) fractional frequency distributions are a consequence of Lotka's law, and are not examples of breakdowns of this famous historical law", "fulltext": "", "keywords": "dual lotka laws;convolution model;irregular fractional frequency distributions;discrete scores"}
+{"name": "test_317", "title": "Relevance of Web documents: ghosts consensus method", "abstract": "The dominant method currently used to improve the quality of Internet search systems is often called \"digital democracy.\" Such an approach implies the utilization of the majority opinion of Internet users to determine the most relevant documents: for example, citation index usage for sorting of search results (google.com) or an enrichment of a query with terms that are asked frequently in relation with the query's theme. \"Digital democracy\" is an effective instrument in many cases, but it has an unavoidable shortcoming, which is a matter of principle: the average intellectual and cultural level of Internet users is very low; everyone knows what kind of information is dominant in Internet query statistics. Therefore, when one searches the Internet by means of \"digital democracy\" systems, one gets answers that reflect an underlying assumption that the user's mind potential is very low, and that his cultural interests are not demanding. Thus, it is more correct to use the term \"digital ochlocracy\" to refer to Internet search systems with \"digital democracy.\" Based on the well-known mathematical mechanism of linear programming, we propose a method to solve the indicated problem", "fulltext": "", "keywords": "majority opinion;linear programming;internet query statistics;digital ochlocracy;search results;internet search systems;digital democracy;ghosts consensus method;world wide web;citation index usage"}
+{"name": "test_318", "title": "Note on \"Deterministic inventory lot-size models under inflation with shortages", "abstract": "and deterioration for fluctuating demand\" by Yang et al For original paper see H.-L. Yang et al., ibid., vol.48, p.144-58 (2001). Yang et al. extended the lot-size models to allow for inflation and fluctuating demand. For this model they proved that the optimal replenishment schedule exists and is unique. They also proposed an algorithm to find the optimal policy. The present paper provides examples, which show that the optimal replenishment schedule and consequently the overall optimal policy may not exist", "fulltext": "", "keywords": "inflation;deterministic inventory lot-size models;optimal policy algorithm;optimal scheduling parameters;optimal replenishment schedule;fluctuating demand"}
+{"name": "test_319", "title": "Designing a screening experiment for highly reliable products", "abstract": "Within a reasonable life-testing time, how to improve the reliability of highly reliable products is one of the great challenges. By using a resolution III experiment together with degradation test, Tseng et al. (1995) presented a case study of improving the reliability of fluorescent lamps. However, in conducting such an experiment, they did not address the problem of how to choose the optimal settings of variables, such as sample size, inspection frequency, and termination time for each run, which are influential to the correct identification of significant factors and the experimental cost. Assuming that the product's degradation paths satisfy Wiener processes, this paper proposes a systematic approach to the aforementioned problem. First, an identification rule is proposed. Next, under the constraints of a minimum probability of correct decision and a maximum probability of incorrect decision of the proposed identification rule, the optimum test plan can be obtained by minimizing the total experimental cost. An example is provided to illustrate the proposed method", "fulltext": "", "keywords": "inspection frequency;highly reliable products;degradation tests;fluorescent lamps;termination time;identification rule;screening experiment;wiener process;optimal test plan;maximum probability of incorrect decision;minimum probability of correct decision;resolution iii design"}
+{"name": "test_32", "title": "Analysis and efficient implementation of a linguistic fuzzy c-means", "abstract": "The paper is concerned with a linguistic fuzzy c-means (FCM) algorithm with vectors of fuzzy numbers as inputs. This algorithm is based on the extension principle and the decomposition theorem. It turns out that using the extension principle to extend the capability of the standard membership update equation to deal with a linguistic vector has a huge computational complexity. In order to cope with this problem, an efficient method based on fuzzy arithmetic and optimization has been developed and analyzed. We also carefully examine and prove that the algorithm behaves in a way similar to the FCM in the degenerate linguistic case. Synthetic data sets and the iris data set have been used to illustrate the behavior of this linguistic version of the FCM", "fulltext": "", "keywords": "fuzzy numbers;extension principle;linguistic vectors;decomposition theorem;optimization;computational complexity;linguistic fuzzy c-means algorithm;fuzzy arithmetic"}
+{"name": "test_320", "title": "Warranty reserves for nonstationary sales processes", "abstract": "Estimation of warranty costs, in the event of product failure within the warranty period, is of importance to the manufacturer. Costs associated with replacement or repair of the product are usually drawn from a warranty reserve fund created by the manufacturer. Considering a stochastic sales process, first and second moments (and thereby the variance) are derived for the manufacturer's total discounted warranty cost of a single sale for single-component items under four different warranty policies from a manufacturer's point of view. These servicing strategies represent a renewable free-replacement, nonrenewable free-replacement, renewable pro-rata, and a nonrenewable minimal-repair warranty plans. The results are extended to determine the mean and variance of total discounted warranty costs for the total sales over the life cycle of the product. Furthermore, using a normal approximation, warranty reserves necessary for a certain protection level, so that reserves are not completely depleted, are found. Results and their managerial implications are studied through an extensive example", "fulltext": "", "keywords": "total discounted warranty costs;renewable pro-rata;variance;warranty reserves;normal approximation;total discounted warranty cost;stochastic sales process;nonrenewable free-replacement;product failure;product repair;nonrenewable minimal-repair warranty plans;second moments;managerial implications;product life cycle;single-component items;first moments;renewable free-replacement;warranty costs estimation;nonstationary sales processes;product replacement;servicing strategies"}
+{"name": "test_321", "title": "A multimodal data collection tool using REALbasic and Mac OS X", "abstract": "This project uses REALbasic 3.5 in the Mac OS X environment for development of a configuration tool that builds a data collection procedure for investigating the effectiveness of sonified graphs. The advantage of using REALbasic with the Mac OS X system is that it provides rapid development of stimulus presentation, direct recording of data to files, and control over other procedural issues. The program can be made to run natively on the new Mac OS X system, older Mac OS systems, and Windows (98SE, ME, 2000 PRO). With modification, similar programs could be used to present any number of visual/auditory stimulus combinations, complete with questions for each stimulus", "fulltext": "", "keywords": "direct data recording;auditory stimulus;mac os x environment;visual stimulus;visual data comprehension;psychology;sonified graphs;stimulus presentation;configuration tool;multimodal data collection tool;data collection;realbasic;windows"}
+{"name": "test_322", "title": "Toward an Experimental Timing Standards Lab: benchmarking precision in the real", "abstract": "world Much discussion has taken place over the relative merits of various platforms and operating systems for real-time data collection. Most would agree that, provided great care is taken, many are capable of millisecond timing precision. However, to date, much of this work has focused on the theoretical aspects of raw performance. It is our belief that researchers would be better informed if they could place confidence limits on their own specific paradigms in situ and without modification. To this end, we have developed a millisecond precision test rig that can control and time experiments on a second presentation machine. We report on the specialist hardware and software used. We elucidate the importance of the approach in relation to real-world experimentation", "fulltext": "", "keywords": "real-time data collection;operating systems;performance evaluation;event generation software;experimental timing standards lab;benchmarking precision;millisecond timing precision"}
+{"name": "test_323", "title": "A server-side program for delivering experiments with animations", "abstract": "A server-side program for animation experiments is presented. The program is capable of delivering an experiment composed of discrete animation sequences in various file formats, collecting a discrete or continuous response from the observer, evaluating the appropriateness of the response, and ensuring that the user is not proceeding at an unreasonable rate. Most parameters of the program are controllable by experimenter-edited text files or simple switches in the program code, thereby minimizing the need for programming to create new experiments. A simple demonstration experiment is discussed and is freely available", "fulltext": "", "keywords": "discrete animation sequences;internet;animation experiment delivery;experimenter-edited text files;file formats;web based psychological experiments;server-side program"}
+{"name": "test_324", "title": "Using NetCloak to develop server-side Web-based experiments without writing CGI", "abstract": "programs Server-side experiments use the Web server, rather than the participant's browser, to handle tasks such as random assignment, eliminating inconsistencies with Java and other client-side applications. Heretofore, experimenters wishing to create server-side experiments have had to write programs to create common gateway interface (CGI) scripts in programming languages such as Perl and C++. NetCloak uses simple, HTML-like commands to create CGIs. We used NetCloak to implement an experiment on probability estimation. Measurements of time on task and participants' IP addresses assisted quality control. Without prior training, in less than 1 month, we were able to use NetCloak to design and create a Web-based experiment and to help graduate students create three Web-based experiments of their own", "fulltext": "", "keywords": "common gateway interface scripts;internet;web server;behavioral data;probability estimation;client-side applications;netcloak;psychology;quality control;html;ip addresses;cgi programs;graduate students;c++ language;java;random assignment;server-side web-based experiments;perl"}
+{"name": "test_325", "title": "Open courseware and shared knowledge in higher education", "abstract": "Most college and university campuses in the United States and much of the developed world today maintain one, two, or several learning management systems (LMSs), which are courseware products that provide students and faculty with Web-based tools to manage course-related applications. Since the mid-1990s, two predominant models of Web courseware management systems have emerged: commercial and noncommercial. Some of the commercial products available today were created in academia as noncommercial but have since become commercially encumbered. Other products remain noncommercial but are struggling to survive in a world of fierce commercial competition. This article argues for an ethics of pedagogy in higher education that would be based on the guiding assumptions of the non-proprietary, peer-to-peer, open-source software movement", "fulltext": "", "keywords": "open-source software;internet;ethics;shared knowledge;university;higher education;learning management systems;commercial products;college;open courseware;web courseware management systems"}
+{"name": "test_326", "title": "Web-based experiments controlled by JavaScript: an example from probability", "abstract": "learning JavaScript programs can be used to control Web experiments. This technique is illustrated by an experiment that tested the effects of advice on performance in the classic probability-learning paradigm. Previous research reported that people tested via the Web or in the lab tended to match the probabilities of their responses to the probabilities that those responses would be reinforced. The optimal strategy, however, is to consistently choose the more frequent event; probability matching produces suboptimal performance. We investigated manipulations we reasoned should improve performance. A horse race scenario in which participants predicted the winner in each of a series of races between two horses was compared with an abstract scenario used previously. Ten groups of learners received different amounts of advice, including all combinations of (1) explicit instructions concerning the optimal strategy, (2) explicit instructions concerning a monetary sum to maximize, and (3) accurate information concerning the probabilities of events. The results showed minimal effects of horse race versus abstract scenario. Both advice concerning the optimal strategy and probability information contributed significantly to performance in the task. This paper includes a brief tutorial on JavaScript, explaining with simple examples how to assemble a browser-based experiment", "fulltext": "", "keywords": "web-based experiments;internet-based research;advice;explicit instructions;probability;browser-based experiment;probability learning;javascript"}
+{"name": "test_327", "title": "Using latent semantic analysis to assess reader strategies", "abstract": "We tested a computer-based procedure for assessing reader strategies that was based on verbal protocols that utilized latent semantic analysis (LSA). Students were given self-explanation-reading training (SERT), which teaches strategies that facilitate self-explanation during reading, such as elaboration based on world knowledge and bridging between text sentences. During a computerized version of SERT practice, students read texts and typed self-explanations into a computer after each sentence. The use of SERT strategies during this practice was assessed by determining the extent to which students used the information in the current sentence versus the prior text or world knowledge in their self-explanations. This assessment was made on the basis of human judgments and LSA. Both human judgments and LSA were remarkably similar and indicated that students who were not complying with SERT tended to paraphrase the text sentences, whereas students who were compliant with SERT tended to explain the sentences in terms of what they knew about the world and of information provided in the prior text context. The similarity between human judgments and LSA indicates that LSA will be useful in accounting for reading strategies in a Web-based version of SERT", "fulltext": "", "keywords": "computer-based procedure;reader strategy assessment;world knowledge;human judgments;latent semantic analysis;self-explanation-reading training;elaboration;text sentence bridging;verbal protocols"}
+{"name": "test_328", "title": "Personality research on the Internet: a comparison of Web-based and traditional", "abstract": "instruments in take-home and in-class settings Students, faculty, and researchers have become increasingly comfortable with the Internet, and many of them are interested in using the Web to collect data. Few published studies have investigated the differences between Web-based data and data collected with more traditional methods. In order to investigate these potential differences, two important factors were crossed in this study: whether the data were collected on line or not and whether the data were collected in a group setting at a fixed time or individually at a time of the respondent's choosing. The Visions of Morality scale (Shelton and McAdams, 1990) was used, and the participants were assigned to one of four conditions: in-class Web survey, in-class paper-and-pencil survey; take-home Web survey, and take-home paper-and-pencil survey. No significant differences in scores were found for any condition; however, response rates were affected by the type of survey administered, with the take-home Web-based instrument having the lowest response rate. Therefore, researchers need to be aware that different modes of administration may affect subject attrition and may, therefore, confound investigations of other independent variables", "fulltext": "", "keywords": "web-based instruments;in-class web survey;in-class paper-and-pencil survey;internet;visions of morality scale;administration;take-home paper-and-pencil survey;take-home web survey;personality research;data collection;response rates;subject attrition"}
+{"name": "test_329", "title": "Implications of document-level literacy skills for Web site design", "abstract": "The proliferation of World Wide Web (Web) sites and the low cost of publishing information on the Web have placed a tremendous amount of information at the fingertips of millions of people. Although most of this information is at least intended to be accurate, there is much that is rumor, innuendo, urban legend, and outright falsehood. This raises problems especially for students (of all ages) trying to do research or learn about some topic. Finding accurate, credible information requires document level literacy skills, such as integration, sourcing, corroboration, and search. This paper discusses these skills and offers a list of simple ways that designers of educational Web sites can help their visitors utilize these skills", "fulltext": "", "keywords": "accurate credible information;integration;falsehood;document-level literacy skills;rumor;sourcing;corroboration;innuendo;search;urban legend;educational web site design;students"}
+{"name": "test_33", "title": "Fuzzy control of multivariable process by modified error decoupling", "abstract": "In this paper, a control concept for the squared (equal number of inputs and outputs) multivariable process systems is given. The proposed control system consists of two parts, single loop fuzzy controllers in each loop and a centralized decoupling unit. The fuzzy control system uses feedback control to minimize the error in the loop and the decoupler uses an adaptive technique to mitigate loop interactions. The decoupler predicts the interacting loop changes and modifies the input (error) of the loop controller. The controller was tested on the simulation model of \"single component vaporizer\" process", "fulltext": "", "keywords": "square multivariable process systems;single component vaporizer process;single-loop fuzzy controllers;error minimization;squared multivariable process systems;multivariable process;set point changes;feedback control;centralized decoupling unit;load changes;modified error decoupling;loop interaction mitigation"}
+{"name": "test_330", "title": "Improving computer security for authentication of users: influence of proactive", "abstract": "password restrictions Entering a user name-password combination is a widely used procedure for identification and authentication in computer systems. However, it is a notoriously weak method, in that the passwords adopted by many users are easy to crack. In an attempt to, improve security, proactive password checking may be used, in which passwords must meet several criteria to be more resistant to cracking. In two experiments, we examined the influence of proactive password restrictions on the time that it took to generate an acceptable password and to use it subsequently to log in. The required length was a minimum of five characters in experiment I and eight characters in experiment 2. In both experiments, one condition had only the length restriction, and the other had additional restrictions. The additional restrictions greatly increased the time it took to generate the password but had only a small effect on the time it took to use it subsequently to log in. For the five-character passwords, 75% were cracked when no other restrictions were imposed, and this was reduced to 33% with the additional restrictions. For the eight-character passwords, 17% were cracked with no other restrictions, and 12.5% with restrictions. The results indicate that increasing the minimum character length reduces crackability and increases security, regardless of whether additional restrictions are imposed", "fulltext": "", "keywords": "five-character passwords;proactive password restrictions;proactive password checking;eight-character passwords;length restriction;computer security;user authentication"}
+{"name": "test_331", "title": "Multidimensional data visualization", "abstract": "Historically, data visualization has been limited primarily to two dimensions (e.g., histograms or scatter plots). Available software packages (e.g., Data Desk 6.1, MatLab 6.1, SAS-JMP 4.04, SPSS 10.0) are capable of producing three-dimensional scatter plots with (varying degrees of) user interactivity. We constructed our own data visualization application with the Visualization Toolkit (Schroeder et al., 1998) and Tcl/Tk to display multivariate data through the application of glyphs (Ware, 2000). A glyph is a visual object onto which many data parameters may be mapped, each with a different visual attribute (e.g., size or color). We used our multi-dimensional data viewer to explore data from several psycholinguistic experiments. The graphical interface provides flexibility when users dynamically explore the multidimensional image rendered from raw experimental data. We highlight advantages of multidimensional data visualization and consider some potential limitations", "fulltext": "", "keywords": "tcl/tk;multi-dimensional data viewer;visualization toolkit;visual attribute;visual object;3d scatter plots;graphical interface;multidimensional data visualization;data parameters;psycholinguistic experiments;glyphs;user interactivity;multidimensional image rendering;multivariate data display"}
+{"name": "test_332", "title": "Fitting mixed-effects models for repeated ordinal outcomes with the NLMIXED", "abstract": "procedure This paper presents an analysis of repeated ordinal outcomes arising from two psychological studies. The first case is a repeated measures analysis of variance; the second is a mixed-effects regression. in a longitudinal design. In both, the subject-specific variation is modeled by including random effects in the linear predictor (inside a link function) of a generalized linear model. The NLMIXED procedure in SAS is used to fit the mixed-effects models for the categorical response data. The presentation emphasizes the parallel between the model. specifications and the SAS statements. The purpose of this paper is to facilitate the use of mixed-effects models in the analysis of repeated ordinal outcomes", "fulltext": "", "keywords": "linear predictor;random effects;subject-specific variation modeling;psychological studies;nlmixed procedure;mixed-effects model fitting;repeated measures analysis of variance;categorical response data;mixed-effects regression;longitudinal design;generalized linear model;repeated ordinal outcomes;model specifications"}
+{"name": "test_333", "title": "Teaching psychology as a laboratory science in the age of the Internet", "abstract": "For over 30 years, psychologists have relied on computers to teach experimental psychology. With the advent of experiment generators, students can create well-designed experiments and can test sophisticated hypotheses from the start of their undergraduate training. Characteristics of new Net-based experiment generators are discussed and compared with traditional stand-alone generators. A call is made to formally evaluate the instructional effectiveness of the wide range of experiment generators now available. Specifically, software should be evaluated in terms of known learning outcomes, using appropriate control groups. The many inherent differences between any two software programs should be made clear. The teacher's instructional method should be fully described and held constant between comparisons. Finally, the often complex interaction between the teacher's instructional method and the pedagogical details of the software must be considered", "fulltext": "", "keywords": "internet;net-based experiment generators;undergraduate training;software;control groups;known learning outcomes;laboratory science;teacher instructional method;instructional effectiveness;experimental psychology teaching;pedagogical details;stand-alone generators;computers;hypothesis testing;well-designed experiments"}
+{"name": "test_334", "title": "Capturing niche markets with copper", "abstract": "For \"last-mile access\" in niche applications, twisted copper pair may be the cable of best option to gain access and deliver desired services. The article discusses how operators can use network edge devices to serve new customers. Niche market segments represent a significant opportunity for cable TV delivery of television and high-speed Internet signals. But the existing telecommunications infrastructure in those developments frequently presents unique challenges for the service provider to overcome", "fulltext": "", "keywords": "copper cables;network edge devices;niche markets;last-mile access;twisted copper pair"}
+{"name": "test_335", "title": "Fresh voices, big ideas [IBM internship program]", "abstract": "IBM is matching up computer-science and MBA students with its business managers in an 11-week summer internship program and challenging them to develop innovative technology ideas", "fulltext": "", "keywords": "mba college students;ibm business managers;internship program;computer-science students;patents"}
+{"name": "test_338", "title": "Down up [IT projects]", "abstract": "Despite the second quarter's gloomy GDP report, savvy CIOs are forging ahead with big IT projects that will position their companies to succeed when the economy soars again", "fulltext": "", "keywords": "morgan stanley;walgreen;ford;staples;victoria's secret;strategic technology projects;caterpillar"}
+{"name": "test_339", "title": "An automated parallel image registration technique based on the correlation of", "abstract": "wavelet features With the increasing importance of multiple multiplatform remote sensing missions, fast and automatic integration of digital data from disparate sources has become critical to the success of these endeavors. Our work utilizes maxima of wavelet coefficients to form the basic features of a correlation-based automatic registration algorithm. Our wavelet-based registration algorithm is tested successfully with data from the National Oceanic and Atmospheric Administration (NOAA) Advanced Very High Resolution Radiometer (AVHRR) and the Landsat Thematic Mapper (TM), which differ by translation and/or rotation. By the choice of high-frequency wavelet features, this method is similar to an edge-based correlation method, but by exploiting the multiresolution nature of a wavelet decomposition, our method achieves higher computational speeds for comparable accuracies. This algorithm has been implemented on a single-instruction multiple-data (SIMD) massively parallel computer, the MasPar MP-2, as well as on the CrayT3D, the Cray T3E, and a Beowulf cluster of Pentium workstations", "fulltext": "", "keywords": "remote sensing;automatic registration algorithm;land surface;automated parallel image registration;image processing;landsat thematic mapper;geophysical measurement technique;avhrr;wavelet decomposition;optical imaging;terrain mapping;correlation;microwave radiometry;simd massively parallel computing;wavelet feature"}
+{"name": "test_34", "title": "Design of PID-type controllers using multiobjective genetic algorithms", "abstract": "The design of a PID controller is a multiobjective problem. A plant and a set of specifications to be satisfied are given. The designer has to adjust the parameters of the PID controller such that the feedback interconnection of the plant and the controller satisfies the specifications. These specifications are usually competitive and any acceptable solution requires a tradeoff among them. An approach for adjusting the parameters of a PID controller based on multiobjective optimization and genetic algorithms is presented in this paper. The MRCD (multiobjective robust control design) genetic algorithm has been employed. The approach can be easily generalized to design multivariable coupled and decentralized PID loops and has been successfully validated for a large number of experimental cases", "fulltext": "", "keywords": "multiobjective robust control design;multivariable coupled pid loops;tuning methods;feedback interconnection;multiobjective genetic algorithms;decentralized pid loops;pid-type controllers"}
+{"name": "test_340", "title": "Temelin casts its shadow [nuclear power plant]", "abstract": "Reservations about Temelin nuclear plant in the Czech Republic are political rather than technical. This paper discusses the problems of turbogenerator vibrations and how they were diagnosed. The paper also discusses some of the other problems of commissioning the power plant. The simulator used for training new staff is also mentioned", "fulltext": "", "keywords": "czech republic;turbogenerator vibrations;power plant commissioning;training simulator;temelin nuclear plant"}
+{"name": "test_341", "title": "How should team captains order golfers on the final day of the Ryder Cup", "abstract": "matches? I used game theory to examine how team captains should select their slates for the final day of the Ryder Cup matches. Under the assumption that golfers have different abilities and are not influenced by pressure or momentum, I found that drawing names from a hat will do no worse than any other strategy", "fulltext": "", "keywords": "game theory;slate;golfer ordering;golf;ryder cup final day"}
+{"name": "test_342", "title": "Mount Sinai Hospital uses integer programming to allocate operating room time", "abstract": "An integer-programming model and a post-solution heuristic allocates operating room time to the five surgical divisions at Toronto's Mount Sinai Hospital. The hospital has used this approach for several years and credits it with both administrative savings and the ability to produce quickly an equitable master surgical schedule", "fulltext": "", "keywords": "ontario;integer programming;operating room time allocation;canada;mount sinai hospital;toronto;post-solution heuristic"}
+{"name": "test_343", "title": "Using the Small Business Innovation Research Program to turn your ideas into", "abstract": "products The US Government's Small Business Innovation Research Program helps small businesses transform new ideas into commercial products. The program provides an ideal means for businesses and universities to obtaining funding for cooperative projects. Rules and information for the program are readily available, and I will give a few helpful hints to provide guidance", "fulltext": "", "keywords": "funding;businesses;us government;universities;cooperative projects;commercial product development;small business innovation research program;usa"}
+{"name": "test_344", "title": "Student consulting projects benefit faculty and industry", "abstract": "Student consulting projects require students to apply OR/MS tools to obtain insight into the activities of firms in the community. These projects benefit faculty by providing clear feedback on the real capabilities of students, a broad connection to local industry, and material for case studies and research. They benefit companies by stimulating new thinking regarding their activities and delivering results they can use. Projects provide insights into the end-user modeling mode of OR/MS practice. Projects support continuous improvement as the lessons gained from a crop of projects enable better teaching during the next course offering, which in turn leads to better projects and further insights into teaching", "fulltext": "", "keywords": "student consulting projects;or/ms tools;student capability feedback;case study material;student placements"}
+{"name": "test_345", "title": "In search of strategic operations research/management science", "abstract": "We define strategic OR/MS as \"OR/MS work that leads to a sustainable competitive advantage.\" We found evidence of strategic OR/MS in the literature of strategic information systems (SIS) and OR/MS. We examined 30 early examples of SIS, many of which contained OR/MS work. Many of the most successful had high OR/MS content, while the least successful contained none. The inclusion of OR/MS work may be a key to sustaining an advantage from information technology. We also examined the Edelman Prize finalist articles published between 1990 and 1999. We found that 13 of the 42 private sector applications meet our definition of strategic OR/MS", "fulltext": "", "keywords": "operations research;sis;strategic or/ms;management science;strategic information systems"}
+{"name": "test_346", "title": "Baseball, optimization, and the World Wide Web", "abstract": "The competition for baseball play-off spots-the fabled pennant race-is one of the most closely watched American sports traditions. While play-off race statistics, such as games back and magic number, are informative, they are overly conservative and do not account for the remaining schedule of games. Using optimization techniques, one can model schedule effects explicitly and determine precisely when a team has secured a play-off spot or has been eliminated from contention. The RIOT Baseball Play-off Races Web site developed at the University of California, Berkeley, provides automatic updates of new, optimization-based play-off race statistics each day of the major league baseball season. In developing the site, we found that we could determine the first-place elimination status of all teams in a division using a single linear-programming formulation, since a minimum win threshold for teams finishing in first place applies to all teams in a division. We identified a similar (but weaker) result for the problem of play-off elimination with wildcard teams", "fulltext": "", "keywords": "pennant race;linear programming;games back;minimum win threshold;game schedule;optimization;lp;magic number;play-off race statistics;world wide web;baseball play-off spot competition;riot baseball play-off races web site"}
+{"name": "test_347", "title": "From revenue management concepts to software systems", "abstract": "In 1999, after developing and installing over 170 revenue management (RM) systems for more than 70 airlines, PROS Revenue Management, Inc. had the opportunity to develop RM systems for three companies in nonairline industries. PROS research and design department designed the opportunity analysis study (OAS), a mix of OR/MS, consulting, and software development practices to determine the applicability of RM in new business situations. PROS executed OASs with the three companies. In all three cases, the OAS supported the value of RM and led to contracts for implementation of RM systems", "fulltext": "", "keywords": "oas;rm systems;inc;opportunity analysis study;pros revenue management;software systems;software development practices;or/ms;consulting practices;revenue management concepts"}
+{"name": "test_348", "title": "Lower bounds on the information rate of secret sharing schemes with homogeneous", "abstract": "access structure We present some new lower bounds on the optimal information rate and on the optimal average information rate of secret sharing schemes with homogeneous access structure. These bounds are found by using some covering constructions and a new parameter, the k-degree of a participant, that is introduced in this paper. Our bounds improve the previous ones in almost all cases", "fulltext": "", "keywords": "optimal average information rate;information rate;secret sharing schemes;cryptography;lower bounds;optimal information rate;homogeneous access structure;k-degree"}
+{"name": "test_349", "title": "A self-adjusting quality of service control scheme", "abstract": "We propose and analyze a self-adjusting Quality of Service (QoS) control scheme with the goal of optimizing the system reward as a result of servicing different priority clients with varying workload, QoS and reward/penalty requirements. Our scheme is based on resource partitioning and designated \"degrade QoS areas\" such that system resources are partitioned into priority areas each of which is reserved specifically to serve only clients in a corresponding class with no QoS degradation, plus one \"degraded QoS area\" into which all clients can be admitted with QoS adjustment being applied only to the lowest priority clients. We show that the best partition is dictated by the workload and the reward/penalty characteristics of clients in difference priority classes. The analysis results can be used by a QoS manager to optimize the system total reward dynamically in response to changing workloads at run time. We demonstrate the validity of our scheme by means of simulation and comparing the proposed QoS self-adjusting scheme with those that do not use resource partitioning or designated degraded QoS areas", "fulltext": "", "keywords": "multimedia systems;resource partitioning;resource reservation;performance evaluation;simulation;self-adjusting quality of service control scheme;priority clients"}
+{"name": "test_35", "title": "Fusion of qualitative bond graph and genetic algorithms: A fault diagnosis", "abstract": "application In this paper, the problem of fault diagnosis via integration of genetic algorithms (GA's) and qualitative bond graphs (QBG's) is addressed. We suggest that GA's can be used to search for possible fault components among a system of qualitative equations. The QBG is adopted as the modeling scheme to generate a set of qualitative equations. The qualitative bond graph provides a unified approach for modeling engineering systems, in particular, mechatronic systems. In order to demonstrate the performance of the proposed algorithm, we have tested the proposed algorithm on an in-house designed and built floating disc experimental setup. Results from fault diagnosis in the floating disc system are presented and discussed. Additional measurements will be required to localize the fault when more than one fault candidate is inferred. Fault diagnosis is activated by a fault detection mechanism when a discrepancy between measured abnormal behavior and predicted system behavior is observed. The fault detection mechanism is not presented here", "fulltext": "", "keywords": "fault diagnosis;predicted system behavior;mechatronic systems;fault components;measured abnormal behavior;qualitative equations;genetic algorithms;floating disc;qualitative bond graph;engineering systems"}
+{"name": "test_350", "title": "There is no optimal routing policy for the torus", "abstract": "A routing policy is the method used to select a specific output channel for a message from among a number of acceptable output channels. An optimal routing policy is a policy that maximizes the probability of a message reaching its destination without delays. Optimal routing policies have been proposed for several regular networks, including the mesh and the hypercube. An open problem in interconnection network research has been the identification of an optimal routing policy for the torus. In this paper, we show that there is no optimal routing policy for the torus. Our result is demonstrated by presenting a detailed example in which the best choice of output channel is dependent on the probability of each channel being available. This result settles, in the negative, a conjecture by J. Wu (1996) concerning an optimal routing policy for the torus", "fulltext": "", "keywords": "hypercube;optimal routing policy;torus"}
+{"name": "test_351", "title": "Optimal online algorithm for scheduling on two identical machines with machine", "abstract": "availability constraints This paper considers the online scheduling on two identical machines with machine availability constraints for minimizing makespan. We assume that machine M/sub j/ is unavailable during period from s/sub j/ to t/sub j/ (0