Column schema:
query: string, lengths 8 to 1.13k
pos: string, lengths 13 to 1.09k
neg: string, lengths 11 to 1.07k
query_lang: string, 32 classes
__index_level_0__: int64, values 41 to 1.05M
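The columns form (query, positive, negative) training triples with a language tag and an original row index. Below is a minimal loading sketch, assuming the rows are exported as a JSON-Lines file; the file name records.jsonl and the plain-Python reader are illustrative assumptions, not part of the dataset itself.

import json
from collections import Counter

# Hypothetical export path; one JSON object per line with the fields
# "query", "pos", "neg", "query_lang" and "__index_level_0__".
PATH = "records.jsonl"

def iter_rows(path):
    """Yield one dict per dataset row from a JSON-Lines dump."""
    with open(path, encoding="utf-8") as handle:
        for line in handle:
            yield json.loads(line)

if __name__ == "__main__":
    lang_counts = Counter()
    for row in iter_rows(PATH):
        query, pos, neg = row["query"], row["pos"], row["neg"]
        lang_counts[row["query_lang"]] += 1
        # A retrieval trainer would consume the triple here, e.g. scoring
        # (query, pos) above (query, neg).
    print(lang_counts.most_common(5))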
A Study on Piezoresistive Characteristics of Smart Nano Composites based on Carbon Nanotubes for a Novel Pressure Sensor
This paper presents a preliminary study on the pressure sensing characteristics of smart nano composites made of MWCNT (multi-walled carbon nanotube) to develop a novel pressure sensor. We fabricated the composite pressure sensor using a solution casting process. Made of carbon smart nano composites, the sensor works by means of piezoresistivity under pressure. We built a signal processing system similar to a conventional strain gage system. The sensor voltage outputs and the resistance changes of the MWCNT/epoxy-based smart nano composite under static pressure were fairly stable and showed quite consistent responses in lab-level tests. We confirmed that the response time of the MWCNT/epoxy nano composite was faster than that of the MWCNT/EPDM sensor under static loads.
To address market demand and the production organization characteristics of steelmaking enterprises, we formulate charge design optimization as a cluster analysis problem with quality constraints on chemical compositions. A particle-swarm-optimization-based (PSO-based) solution is proposed, with dimensionality reduction based on principal component analysis (PCA) techniques. Range constraints on chemical compositions for products of similar quality are introduced alongside traditional process constraints, due-time constraints, furnace capacity constraints, etc. The solution adopts an integrated schema for charge planning and quality design. It performs cluster analysis on candidate products with similar chemical compositions and constraints to realize optimal charge design under quality constraints for steel products of multiple varieties in small-batch demands.
eng_Latn
43,544
Process Control in Production Line of Casting Sand Treatment
This paper carefully studies the control systems in various processes, such as mechanization of the sand treatment production line, electrical automation, and the technological process. It integrates every procedure and relates them to one another, and meets the production line's demands for high efficiency, low labor intensity and high quality. After being put into application, the line has proved to be very practical and successful.
Three design schemes of the runner system of an injection mould for an automobile panel were simulated and optimized based on CAE technology. The optimized gating system helps to increase the quality of the product and shorten the development cycle.
eng_Latn
43,572
Workpiece dynamics during stable cutting in a turning operation
In this investigation, workpiece and cutting tool dynamics in a turning operation are investigated. It is shown that dynamic information from the workpiece propagates to the cutting tool more consistently in the tangential cutting direction than in the radial cutting direction since there is a preload that seeks to keep the tool in contact with the workpiece. Finally, trajectories in the state space comprised of workpiece accelerations parallel to the radial and tangential cutting directions were investigated and it is graphically shown that stable cutting results in more organised dynamics and unstable cutting results in less organised dynamics, which agrees with previous work.
A virtual prototype model of a machine tool is constructed using the Pro/E and ANSYS software. Considering the effects of joints, this paper studies the dynamic analysis of the machine tool and the effect of the joints on its dynamic characteristics, with the aim of comprehensively predicting and evaluating the NC machine tool's dynamic performance without a physical sample.
eng_Latn
43,573
Managing the Shape Function of Analog Devices in a Slicing Tree Floorplan
Analog Intellectual Property Cores design is still under study [1, 2]. The precharacterized cell libraries concept cannot be applied because the devices (transistors, capacitors, resistors, ...) are electrically sized for a specific context. Thus, the trend is to develop reusable generators. To guarantee a reusable and fast placement, a slicing tree floorplan can be used. For a particular electrical sizing and a specific process, the analog devices may occupy a wide range of shapes because of folding. Finding the correct aspect ratio for each device that optimizes the placement under a specific height and/or width constraint may require examining a large number of cases. This paper presents a general placement method to manage rectangular objects with varying aspect ratios. This approach allows hierarchical layout-aware electrical sizing of analog circuits.
There is an increasing prevalence of haptic devices in many engineering fields, especially in medicine and specifically in surgery. The stereotactic haptic boundaries used in Computer Aided Orthopa...
eng_Latn
43,576
Comparative study regarding precision of sowing devices distribution.
Sowing precision is essential in order to achieve quality sowing for hoeing plants, ultimately influencing the achievement of high yields. This paper presents a comparative study of the sowing accuracy obtained in laboratory conditions on a special stand, using 3 different row units: one with an individual transmission and two with centralized transmissions for the seed distribution devices. The research was carried out for three working speeds and three different plant densities per hectare, using corn as seed material.
This paper mainly introduces a computer embroidery system for household sewing machines. In this system, an X-Y working surface is controlled by an 8031 single-chip microcontroller. The system's characteristics and the design of its software and hardware are also explained.
eng_Latn
43,579
Genetic Programming for Grinding Surface Roughness Modelling
The grinding process is commonly selected for finishing operations because it provides high accuracy and surface finish with a relatively high material removal rate. One of the most common issues in grinding process planning is determining the grinding conditions for a required surface roughness. This paper presents a feasibility study on grinding surface roughness modelling using the genetic programming (GP) method. It successfully demonstrates that GP can provide reliable predictions and has advantages over other established methods in dealing with missing data during the modelling process.
The purpose of this paper is to present some results of experiments with simulation embedded in optimization (on System Dynamics models). They allow not only so-called direct optimization, but also extended sensitivity analysis of parameters and structures. The authors have used the Cosmic and Cosmos languages to support the learning process in (and about) the modelled dynamical system.
eng_Latn
43,600
Method and apparatus for an automated injection molding configuring and manufacturing system
A method and apparatus according to the invention comprise a computer-assisted injection molding configuration subsystem that allows the customer to interactively specify and design the conditions of a system, using a plurality of variables defined by the customer or determined by the manufacturing process. The configuration subsystem is connected to a computer network such as the Internet. The method and apparatus according to the present invention further comprise a computerized business subsystem and process associated with the configuration subsystem. The computerized business subsystem automatically provides a cost and schedule for the system composed by the configuration subsystem, and further processes the order for the system. The processing subsystem automatically processes the customer's input information and generates a view of the configured system.
A virtual prototype model of a machine tool is constructed using the Pro/E and ANSYS software. Considering the effects of joints, this paper studies the dynamic analysis of the machine tool and the effect of the joints on its dynamic characteristics, with the aim of comprehensively predicting and evaluating the NC machine tool's dynamic performance without a physical sample.
eng_Latn
43,611
Research on Dynamics Characteristics of Machine Tool Based on Joints
A virtual prototype model of a machine tool is constructed using the Pro/E and ANSYS software. Considering the effects of joints, this paper studies the dynamic analysis of the machine tool and the effect of the joints on its dynamic characteristics, with the aim of comprehensively predicting and evaluating the NC machine tool's dynamic performance without a physical sample.
This paper presents a CAD framework that allows exploration of high-level designs within a multi-objective framework. The designs are manipulated with high-level transformations in a sequence determined by an external stochastic process. The effectiveness of the framework is evaluated with DSP systems, illustrating its ability to point the designer to a set of designs with good power-area tradeoffs.
eng_Latn
43,617
Computer simulation for warp knitted towel fabric
In order to improve the simulation of warp knitted towel fabric in computer-aided design, this article puts forward a new simulation method, the filled pile loop model, based on an analysis of the pile loop model data and the factors influencing the configuration of the pile loop. The fundamental pile loop model is established by studying the relationship between the data of a single pile loop model and the weave structure. Having analyzed the relationships among the individual factors that influence the configuration of the pile loop, the article discusses how to obtain simulated pile loop models of different shapes and how to establish the pile loop model database, realizing the simulation of warp knitted towel fabrics with computer technology.
In this paper, we have presented an effective yield improvement methodology that can help manufacturing foundries as well as fabless and fab-lite companies to identify systematic failures. It uses the physical addresses of failing bits from wafer sort results to overlay onto inline wafer defect inspection locations. The inline defect patterns or the design patterns where the overlay results showed matches were extracted and grouped by feature similarity or cell names. The potentially problematic design patterns can thus be obtained and used for design debug and process improvement.
eng_Latn
43,620
Containerization and the PaaS Cloud
A Performance Study on the VM Startup Time in the Cloud
Design study of magnet shapes for axial Halbach arrays using 3D finite element analyses
eng_Latn
43,636
what are cells made up of
The cell (from Latin cella, meaning small room) is the basic structural, functional, and biological unit of all known living organisms. Cells are the smallest unit of life that can replicate independently, and are often called the building blocks of life. The study of cells is called cell biology. Cells consist of cytoplasm enclosed within a membrane, which contains many biomolecules such as proteins and nucleic acids.
All animals and plants are made of cells. Animal cells and plant cells have features in common, such as a nucleus, cytoplasm, cell membrane, mitochondria and ribosomes. Plant cells also have a cell wall, and often have chloroplasts and a permanent vacuole.
eng_Latn
43,672
what is a cell definition
The cell (from Latin cella, meaning small room) is the basic structural, functional, and biological unit of all known living organisms. Cells are the smallest unit of life that can replicate independently, and are often called the building blocks of life. The study of cells is called cell biology. Cells consist of cytoplasm enclosed within a membrane, which contains many biomolecules such as proteins and nucleic acids. Some organelles (such as the nucleus and Golgi apparatus) are typically solitary, while others (such as mitochondria, chloroplasts, peroxisomes and lysosomes) can be numerous (hundreds to thousands). The cytosol is the gelatinous fluid that fills the cell and surrounds the organelles.
The cell (from Latin cella, meaning small room) is the basic structural, functional, and biological unit of all known living organisms. Cells are the smallest unit of life that can replicate independently, and are often called the building blocks of life. The study of cells is called cell biology. While the number of cells in plants and animals varies from species to species, humans contain more than 10 trillion (10^13) cells. Most plant and animal cells are visible only under the microscope, with dimensions between 1 and 100 micrometres.
eng_Latn
43,689
what is study of the physiology of cells
Many basic cell biology classes cover some cell physiology, and students can also take specific courses in this branch of biology to get more detailed information. Each cell within an organism is designed to act as an independently functioning unit which supports the larger organism as a whole. Cell physiology looks at the structure of different types of cells, and how cells function. It also looks at how cells come together to create organs and other structures, and how the cells within an organism work together.
The field of plant physiology includes the study of all the internal activities of plants—those chemical and physical processes associated with life as they occur in plants. This includes study at many levels of scale of size and time. Thirdly, plant physiology deals with interactions between cells, tissues, and organs within a plant. Different cells and tissues are physically and chemically specialized to perform different functions.
eng_Latn
43,693
It's the branch of biology that studies various animals
Branches of Zoology | ZoologyDegreeOnline.com By definition, zoology is the branch of biology dealing with animals and the animal kingdom. ... Cytology, The study of cell structure, its organelles, and their functions ... Physiology, The study of the functions and various organs in animals.
jeopardy/2176_Qs.txt at master jedoublen/jeopardy GitHub BEASTLY BOOKS | This Richard Bach story is an allegory about a seabird who seeks to attain perfect flight | Jonathan Livingston Seagull. right: Sam. Wrong:.
eng_Latn
43,931
what structure of the plant is responsible for producing food for the plant
First, the study of phytochemistry (plant chemistry) is included within the domain of plant physiology. To function and survive, plants produce a wide array of chemical compounds not found in other organisms. Photosynthesis requires a large array of pigments, enzymes, and other compounds to function.
Both xylem and phloem are vascular tissues found in a plant. Xylem is a tubular structure which is responsible for water transport from the roots towards all of the parts of the plant. Phloem is also a tubular structure but is responsible for the transportation of food and other nutrients needed by the plant.
eng_Latn
43,950
what is a ahi
Ahi is a form of tuna derived from either the yellowfin or bigeye tuna. It is frequently caught in warm Pacific waters and sold fresh. It is found in great numbers in both Hawaii and off the Pacific Coast of Mexico and Southern California. This type of tuna is tremendously popular in sashimi, which is sushi containing raw fish.
Apnea Hypopnea Index (AHI) The AHI is the number of apneas or hypopneas recorded during the study per hour of sleep. It is generally expressed as the number of events per hour. Based on the AHI, the severity of OSA is classified as follows: None/Minimal: AHI < 5 per hour. Mild: AHI ≥ 5, but < 15 per hour.
eng_Latn
44,016
is sleep apnea over diagnosed
Testing and treatment. Sleep apnea is diagnosed using an overnight polysomnogram (PSG), or sleep study. (1,2,3) A PSG is conducted overnight at a sleep laboratory. Several monitoring devices continuously measure breathing, blood oxygen levels, heart rate, limb movements and more.
Since OSA is a serious medical condition, it must be diagnosed by a physician. Once a diagnosis of sleep apnea is made, the severity of the dysfunction can be classified and treatment options will be given.
eng_Latn
44,019
We present an adaptive power and rate TDMA scheduling algorithm for unicasting messages from base stations to their associated mobile clients in a cellular wireless network. To enhance network robustness, we consider a relay-aided system under which relay stations are placed in fixed locations in each cell area. Upon a base station failure, a relay station is capable of receiving unicast transmissions issued by a neighboring base station, or by a relay station placed in a neighboring cell. Our adaptive power and rate scheduling algorithm reacts to base station failures by reconfiguring the unicast schedule and packet distribution routes. The objective of this paper is to study the performance behavior attained for such a scheme under pre-failure and post-failure scenarios. We show that our failover scheduling algorithm is effective in adapting to base station failures, limiting the performance degradation incurred.
In recent years the research effort devoted to multi-hop communications (or relaying) and multi-antenna transmissions has been constantly growing due to the potential increase in spectral efficiency that these technologies could provide. In particular, they play a key role for the air interface being designed within the European IST-4-027756 Project WINNER II, due to the tight efficiency requirements it poses. In this paper a strategy is proposed for dynamic resource partitioning in relay enhanced cells (REC). It allows the exploitation of spatial resource sharing through a simple approach based on a logic beam concept. Two different resource partitioning algorithms are proposed and assessed, one aiming to maximize the cell throughput and the other also ensuring fairness between the users. A quite significant performance gain is shown with respect to conventional solutions based on fixed sectorization.
Berzelius failed to make use of Faraday's electrochemical laws in his laborious determination of equivalent weights.
eng_Latn
44,115
The research aim of this paper is to investigate the effectiveness of a new Kriging model which uses Taylor expansion to predict wireless network connectivity. Wireless network connectivity is measured by the strength of emitted signal power from the tower to the point in question. The prediction results are compared with those from the literature where an Ordinary Kriging model and a neural network are used to conduct the same prediction. Root mean squared error (RMSE) and maximum absolute relative error (MARE) show that the prediction results of the new Kriging model are much better than those obtained before with average differences from 51.56% to 85%. This study shows the promise of the new Kriging model to accurately estimate wireless signal strength.
The simulation community has used metamodels to study the behavior of computer simulations for over twenty-five years. The most popular techniques have been based on parametric polynomial response surface approximations. In this state of the art review, we present recent developments in this area, with a particular focus on new developments in the experimental designs employed.
Berzelius failed to make use of Faraday's electrochemical laws in his laborious determination of equivalent weights.
eng_Latn
44,118
Device-to-device (D2D) communication is being considered as a traffic offloading solution as well as a public safety network solution in cellular networks specified by the Third Generation Partnership Project (3GPP). Discovering proximity devices before direct communication is one of the challenges in realizing D2D communication. This paper proposes an efficient network-assisted device discovery method for D2D systems operating under Long Term Evolution (LTE) cellular networks. Moreover, it provides a probabilistic model for predicting the worst case performance of the proposed method. Preliminary evaluations show that the proposed discovery method has a high probability of device discoverability within a given discovery interval.
Device-to-Device communication underlaying a cellular network enables local services with limited interference to the cellular network. In this paper we study the optimal selection of possible resource sharing modes with the cellular network in a single cell. Based on the learning from the single cell studies we propose a mode selection procedure for a multi-cell environment. Our evaluation results of the proposed procedure show that it enables a much more reliable device-to-device communication with limited interference to the cellular network compared to simpler mode selection procedures. A well performing and practical mode selection is critical to enable the adoption of underlay device-to-device communication in cellular networks.
Berzelius failed to make use of Faraday's electrochemical laws in his laborious determination of equivalent weights.
eng_Latn
44,126
Priority-Based fair Scheduling for Multimedia WiMAX Uplink Traffic
Worldwide interoperability for microwave access (WiMAX) is based on the IEEE 802.16 Standard with mobility support from the 802.16e amendment and it enables convergence of mobile and fixed broadband wireless networks covering metropolitan and rural areas. WiMAX traffic management aims at providing efficient delivery of multimedia applications with a range of QoS requirements. Focussing on the point-to-multipoint mode, we propose a priority-based fair scheduling algorithm for subscriber stations to serve a mixture of uplink traffic from different scheduling services and provide an analytical model for evaluating user-perceived delay performance under this scheduling scheme. The model is supported and validated by a simulation study. We present numerical results to illustrate the effect of traffic load and other design parameters on WiMAX message delay.
A dynamic interconnection of PV modules based on an irradiance equalization algorithm in Grid-Connected Systems based on a Plant-Oriented configuration is presented. The maximum output power at the PV generator level obtained with this approach is similar to that obtained in the Module-Integrated configuration.
eng_Latn
44,132
The influence of interference networks in QoS parameters in a WLAN 802.11g environment
This paper proposes a strategy to determine how much a given network can affect the QoS parameters of another, by interference. In order to achieve this, a measurement campaign was carried out in two stages: firstly with a single AP and later with two APs separated by a distance less than three meters, using the same channel. After the measurement, an analysis of the results and a set of inferences were made by using Bayesian Networks, whose inputs were the experimental data, i.e. QoS metrics such as: throughput, jitter, packet loss, PMOS and physical metrics like power and distance.
Abstract A modification of the quasi-normal theory is proposed for the study of inhomogeneous turbulent flows. In this approximation realizability conditions for third-order correlations are enforced. These conditions are based on generalized Schwarz' inequalities which limit the growth of triple correlations and the approximation consists in “clipping” these last quantities when they violate their respective inequalities. By requiring that the inequalities be satisfied, we take into account the damping effect of fourth-order correlations. The equations corresponding to this approximation are derived for the case of inhomogeneous turbulence in a Boussinesq fluid with the aid of a recently proposed hypothesis for pressure correlation terms.
eng_Latn
44,136
Successful Treatment of a Patient with Severe Carbon Monoxide Intoxication Complicated with ARDS Using ECMO and HFO Ventilation
Carbon monoxide (CO) intoxication is a common and underestimated problem. We report a 39-year-old woman who was exposed to CO for more than 12 hours and presented in an unconscious state with acute pulmonary edema. Extracorporeal membrane oxygenation (ECMO) and high frequency oscillatory ventilation (HFOV) were used because of acute respiratory distress syndrome (ARDS) and complicated bilateral pneumothorax during hospitalization. ECMO and HFOV were instituted for 30 days and 10 days, respectively. Full recovery of consciousness and cognition was observed, and her activities were not limited at the 6-month follow-up. (Thorac Med 2011; 26: 325-331)
In this paper we study the performance of IEEE 802.11e enhanced distributed channel access (EDCA) priority schemes under finite load and error-prone channel. We introduce a multi-dimensional Markov Chain model that includes all the mandatory differentiation mechanisms of the standard: QoS parameters, CWMIN, CWMAX arbitration inter-frame space (AIFS), and the virtual collision handler. The model faithfully represents the functionality of the EDCA access mechanisms, including lesser known details of the standard such as the management of the backoff counter which is technically different from the one used in the legacy DCF. We study the priority schemes under both finite load and saturation conditions. Our analysis also takes into consideration channel conditions.
eng_Latn
44,138
Power Allocation of Group Cell System
Power Control (PC) refers to the adjustment of the transmission power level based on certain criteria, and it plays an important role in exploiting the capacity potential [39]. Power control at the radio base stations has been proposed to reduce the output power level and thus to reduce interference and increase the lifetime of the equipment. The reduction of interference implies some improvement of QoS or allows the capacity of a wireless cellular system to be increased [40].
In this article, we propose relay node placement for providing k-connectivity to randomly deployed sensor nodes with energy efficiency using a Genetic Algorithm (GA). We also explain the basic steps of GA with suitable examples, and we carry out extensive simulations to compare the proposed algorithm's performance with an existing one in terms of the number of deployed nodes and the lifetime of the network.
eng_Latn
44,146
BILEVEL OPTIMIZATION PROBLEMS IN TOPOLOGICAL SPACES
In this paper, we introduce and study bilevel optimization problems in topological spaces without linear structure. For these problems, we establish two models which differ in the feasible region setting of the lower-level problems. Some new existence results are obtained under rather weak conditions. These theorems improve and generalize the corresponding results on bilevel optimization in the literature.
The special section features six articles that focus on radio resource management issues in IEEE 802.16/WiMAX-based broadband wireless systems. This guest editorial discusses the major issues pertaining to the topic and summarizes the articles included in this section.
yue_Hant
44,150
A Spectrum Auction under Physical Interference Model
Spectrum auctions provide a platform for licensed spectrum users to share their underutilized spectrum with unlicensed users. Existing spectrum auctions either use the protocol interference model to characterize interference relationship as binary relationship, or do not allow the primary and secondary users to share channels simultaneously. To fill this void, we design SPA, a spectrum single-sided auction under the physical interference model, which considers the interference to be accumulative. We prove that SPA is truthful, individually rational, and computationally efficient. Results from extensive simulation studies demonstrate that, SPA achieves higher spectrum utilization and buyer satisfaction ratio, compared with an existing auction adapted for the physical interference model.
Two-pion Bose-Einstein correlations have been studied using the BNL-E866 Forward Spectrometer in 11.6 A·GeV/c Au + Au collisions. The data were analyzed using three-dimensional correlation parameterizations to study transverse momentum-dependent source parameters. The freeze-out time and the duration of emission were derived from the source radii parameters.
eng_Latn
44,151
Cell planning and channel throughput of mobile WiMAX AT 2.5 GHz
Performance study of mobile WiMAX network with changing scenarios under different modulation and coding
Receiving Power Level Prediction for WiMAX Systems on 3.5 GHz
eng_Latn
44,157
Modeling of highly loaded 0-3 piezoelectric composites using a matrix method
A model previously developed for pure 0-3 connectivity piezocomposites has been extended to 3-3 connectivity. This matrix method allows the prediction of the effective electroelastic moduli of a piezocomposite according to its connectivity. It is used to optimize composite performance by choosing the optimal constituents for each phase. A simple combination of the results for 0-3 and 3-3 connectivities allows the effective proportion of 3-3 connectivity to be defined in highly loaded 0-3 piezocomposites. This theoretical analysis has been used to evaluate effective proportions of 3-3 connectivity in five composite samples. The values obtained are shown to be a function of the ceramic volume fraction and fabrication process. The results of this study were used to optimize the fabrication process.
For a three user Gaussian multiple access channel (MAC), we propose a new superposition block Markov encoding based cooperation scheme. Our scheme allows the three users to simultaneously cooperate both in pairs, and collectively, by dividing the transmitted messages into sub-messages intended for each cooperating partner. The proposed encoding and decoding at the transmitters take into account the relative qualities of the cooperation links between the transmitters. We obtain and evaluate the achievable rate region based on our encoding strategy, and compare it with the achievable rates for the two user cooperative MAC. We demonstrate that the added diversity by the presence of the third user improves the region of achievable rates, and this improvement is especially significant as far as the sum rate of the system is concerned.
eng_Latn
44,164
Introduction and study of fourth order theta schemes for linear wave equations
A new class of high order, implicit, three time step schemes for semi-discretized wave equations is introduced and studied. These schemes are constructed using the modified equation approach, generalizing the θ-scheme. Their stability properties are investigated via an energy analysis, which enables us to design super-convergent schemes and also optimal stable schemes in terms of consistency errors. Specific numerical algorithms for the fully discrete problem are tested and discussed, showing the efficiency of our approach compared to second order θ-schemes.
Deterministic network calculus (DNC) is not suitable for deriving performance guarantees for wireless networks due to their inherently random behaviors. In this paper, we develop a method for Quality of Service (QoS) analysis of wireless channels subject to Rayleigh fading based on stochastic network calculus. We provide closed-form stochastic service curve for the Rayleigh fading channel. With this service curve, we derive stochastic delay and backlog bounds. Simulation results verify that the bounds are reasonably tight. Moreover, through numerical experiments, we show the method is not only capable of deriving stochastic performance bounds, but also can provide guidelines for designing transmission strategies in wireless networks.
eng_Latn
44,173
Spectrum Sensing and the Utilization of Spectrum Opportunity Tradeoff in Cognitive Radio Network
Cognitive radio (CR) is proposed to use the spectrum opportunistically. In this letter, we study the tradeoff between the spectrum sensing performance and the utilization of spectrum opportunity in the CR network. We formulate the tradeoff problem mathematically. The optimum tradeoff mainly depends on the activity of primary user. Cooperative spectrum sensing is also studied based on the methodology of the proposed tradeoff problem. We perform the numerical analysis to quantify the effects of different parameters on the optimal spectrum sensing time.
The similarities and differences in the regional distributions of management science publications and NSFC's management science funding are compared. Two scientometric models, rank-frequency distribution and concentration measurement, are used to show whether or not the amount of grants awarded to different provinces by the National Natural Science Foundation of China (NSFC) matches the research capacity of the given provinces as indicated by their management science publications.
eng_Latn
44,178
Performance study of LTE and mmWave in vehicle-to-network communications
Recent development and applications of SUMO – Simulation of Urban Mobility
Providing Effective Real-time Feedback in Simulation-based Surgical Training
eng_Latn
44,257
Optimization of cooperative spectrum sensing with energy detection in cognitive radio networks
Optimal Linear Cooperation for Spectrum Sensing in Cognitive Radio Networks
Bioaugmentation and biostimulation as strategies for the bioremediation of a burned woodland soil contaminated by toxic hydrocarbons: a comparative study.
eng_Latn
44,274
A Survey on Radio Resource Allocation in Cognitive Radio Sensor Networks
Cognitive radio for medical wireless body area networks
Residents' reluctance to challenge negative hierarchy in the operating room: a qualitative study
eng_Latn
44,276
Adaptive Distributed Laser Charging for Efficient Wireless Power Transfer
Comparison of laser beam propagation at 785 nm and 1550 nm in fog and haze for optical wireless communications
A field study of API learning obstacles
eng_Latn
44,404
what is cognitive radio paradigm
CRN is well-known as a promising paradigm to improve the utilization of radio electromagnetic spectrum, by allowing unlicensed radios to opportunistically access the idle spectrum licensed to the primary radios [18]–[25]. CRN is in essence a radio system with the objective to improve wireless network throughput.
Cognitive linguistics is a cluster of overlapping approaches to the study of language as a mental phenomenon. Cognitive linguistics emerged as a school of linguistic thought in the 1970s.
eng_Latn
44,415
Accreditation Commission for Education in Nursing: Your Supportive Partner in Successful Nursing Accreditation
Abstract In many nursing programs, there are a lot of new faculty members and new nurse administrators. These nurse educators are potentially preparing for their first accreditation visit since attaining their positions. Nurse educators may have questions about how to begin preparing for an initial accreditation visit or a reaccreditation visit. This article will provide information about the various resources available from the Accreditation Commission for Education in Nursing to assist in the accreditation process and to be a supportive partner in nursing accreditation.
In this paper we study the performance of IEEE 802.11e enhanced distributed channel access (EDCA) priority schemes under finite load and error-prone channel. We introduce a multi-dimensional Markov Chain model that includes all the mandatory differentiation mechanisms of the standard: QoS parameters, CWMIN, CWMAX arbitration inter-frame space (AIFS), and the virtual collision handler. The model faithfully represents the functionality of the EDCA access mechanisms, including lesser known details of the standard such as the management of the backoff counter which is technically different from the one used in the legacy DCF. We study the priority schemes under both finite load and saturation conditions. Our analysis also takes into consideration channel conditions.
eng_Latn
44,417
Novel adaptive time delay estimation method based on the fractional lower order cyclic correlation in impulsive noise environment
The α-stable distribution was taken as the noise model. Considering the effect of α-stable distributed noise on classical second-order cyclic statistics, a novel fractional lower-order cyclic statistic, which generalizes the second-order one, is proposed. Based on this concept, a novel adaptive time delay estimation method was developed. Simulations show that the proposed algorithm obtains accurate estimates for both time-varying and time-invariant time delays under Gaussian and impulsive noise conditions. Simulations also demonstrate that the performance of the proposed algorithm is superior to both LMP (least mean p-norm) and adaptive time delay estimation methods in α-stable distributed noise.
In this paper, we consider a network of N identical IEEE 802.11 DCF (Distributed Coordination Function) terminals with RTS/CTS mechanism, each of which is assumed to be saturated. For performance analysis, we propose a simple and efficient mathematical model to derive the statistical characteristics of the network such as the inter-transmission time of packets in the network and the service time (the inter-transmission time of successful packet transmissions) of the network. Numerical results and simulations are provided to validate the accuracy of our model and to study the performance of the IEEE 802.11 DCF network.
eng_Latn
44,422
Infinitesimal differential diffusion quantum Monte Carlo study of diatomic vibrational frequencies
We show how to extend the formalism of infinitesimal differential diffusion quantum Monte Carlo to the case of higher derivatives of the ground‐state energy of a molecule with respect to the molecular geometry. We use LiH as an example, but the technique can be extended to more complicated, nonlinear molecules as well. We obtain good agreement with experimental values for the energy derivatives and for the harmonic and anharmonic frequencies of LiH and LiD, despite using a compact single‐determinant wave function.
In this paper, we consider a network of N identical IEEE 802.11 DCF (Distributed Coordination Function) terminals with RTS/CTS mechanism, each of which is assumed to be saturated. For performance analysis, we propose a simple and efficient mathematical model to derive the statistical characteristics of the network such as the inter-transmission time of packets in the network and the service time (the inter-transmission time of successful packet transmissions) of the network. Numerical results and simulations are provided to validate the accuracy of our model and to study the performance of the IEEE 802.11 DCF network.
eng_Latn
44,424
Spectrum sharing scenarios and resulting technical requirements for 5G systems
Auction-Based Spectrum Sharing
Why do we trust new technology? A study of initial trust formation with organizational information systems
eng_Latn
44,447
A measurement study on the application-level performance of LTE
A close examination of performance and power characteristics of 4G LTE networks
Analog-to-Digital Converters
eng_Latn
44,465
In the September 1996 issue of the Journal of the American Podiatric Medical Association, the authors published a retrospective review of their experiences with and results of plantar fasciotomy from 1992 through 1994. Since then, patients who underwent endoscopic plantar fasciotomy from 1994 through 1997 have been reviewed by utilizing materials and methods identical to those used in the original study. This article provides an update of the results of endoscopic plantar fasciotomy and compares them with the results described in the 1996 study.
Heel pain, whether plantar or posterior, is predominantly a mechanical pathology although an array of diverse pathologies including neurologic, arthritic, traumatic, neoplastic, infectious, or vascular etiologies must be considered. This clinical practice guideline (CPG) is a revision of the original 2001 document developed by the American College of Foot and Ankle Surgeons (ACFAS) heel pain committee.
Berzelius failed to make use of Faraday's electrochemical laws in his laborious determination of equivalent weights.
eng_Latn
44,466
Patritumab (P) or placebo (PBO) plus cetuximab (C) and platinum-based therapy in squamous cell carcinoma of the head and neck (SCCHN): a phase 2 study.
TPS6104Background: P is a fully human monoclonal antibody against human epidermal growth factor receptor 3. P blocks activation by the ligand, heregulin (HRG), inducing receptor internalization. Ev...
We demonstrate a methodology to simultaneously find the optimal impedances in the design space of continuum mode as well as the corresponding ideal matching network of RFPAs. The methodology relies on the Belevitch representation of S parameters to avoid the a priori selection of matching network topology. The proposed methodology in conjunction with particle swarm optimization (PSO) is applied to design a broadband power amplifier in the class BJF-1 continuum to target constant efficiency and output power between 2.5 - 4.6 GHz using a GaN HEMT CGH40010. The resulting ideal network is further optimized in EM simulations, and its measured values show good agreement with EM simulations. Even though the PSO algorithm is used in this work, any evolutionary algorithm or machine learning method can be used to implement the proposed methodology.
eng_Latn
44,788
Time-Domain Analysis of RF and Microwave Autonomous Circuits by Vector Fitting-Based Approach
This work presents a new method for the analysis of RF and microwave autonomous circuits directly in the time domain, which is the most effective approach at the simulation level to evaluate nonlinear phenomena. For RF and microwave autonomous circuits, time-domain simulations usually experience convergence problems or numerical inaccuracies due to the presence of distributed elements, de facto preventing their use. The proposed solution is based on the Vector Fitting algorithm applied directly at circuit level. A case study of an RF hybrid oscillator is presented for practical demonstration and evaluation of the performance reliability of the proposed method.
It is shown in this paper that: 1) The Minimum Variance Self Tuning Regulators and a configuration of Adaptive Controllers with an Implicit (or Explicit) Reference Model feature an asymptotic duality which extends the duality existing in the linear case with known parameters between minimum variance control and modal control. 2) Adaptive Control with an Explicit Reference Model is equivalent to Adaptive Control with an Implicit Reference Model using an intermediate adaptive predictor, if the adaptive predictor plus the control behave like the explicit reference model. 3) The result of 1) is generalized by considering the asymptotic duality between Model Reference Adaptive Controllers and the class of Stochastic Self Tuning Regulators which contains the self-tuning minimum variance regulator as a particular case.
eng_Latn
44,804
Quad-switch push-pull (QSPP) RF amplifier with direct, simultaneous modulation of phase and pulse position for spread-spectrum power applications
Conducted EMI Reduction in Power Converters by Means of Periodic Switching Frequency Modulation
ATLANTIS: a Phase III study of lurbinectedin/doxorubicin versus topotecan or cyclophosphamide/doxorubicin/vincristine in patients with small-cell lung cancer who have failed one prior platinum-containing line.
eng_Latn
44,951
A 1.5–3.0GHz wideband VCO with low gain variation
Oscillator phase noise: a tutorial
A study of phase noise in CMOS oscillators
eng_Latn
44,998
A 5.9-GHz Fully Integrated GaN Frontend Design With Physics-Based RF Compact Model
A Wideband and Compact GaN MMIC Doherty Amplifier for Microwave Link Applications
Tweet, but verify: epistemic study of information verification on Twitter
eng_Latn
45,007
A 21–48 GHz Subharmonic Injection-Locked Fractional-N Frequency Synthesizer for Multiband Point-to-Point Backhaul Communications
A study of injection locking and pulling in oscillators
Intelligent Anti-Money Laundering System
eng_Latn
45,011
A scoping study of electric and magnetic field energy harvesting for wireless sensor networks in power system applications
Energy scavenging for mobile and wireless electronics
Recycling ambient microwave energy with broad-band rectenna arrays
eng_Latn
45,125
A 21–48 GHz Subharmonic Injection-Locked Fractional-N Frequency Synthesizer for Multiband Point-to-Point Backhaul Communications
A study of injection locking and pulling in oscillators
Community Deception or: How to Stop Fearing Community Detection Algorithms
eng_Latn
45,137
Towards an Open Government Data Success Model : A case study from Indonesia
An Open Government Maturity Model for social media-based public engagement
Design of a D-Band CMOS Amplifier Utilizing Coupled Slow-Wave Coplanar Waveguides
eng_Latn
45,200
Suokas, J., Heino, P. and Karvonen, I., 1990. Expert systems in safety management. Journal of Occupational Accidents, 12: 63–78. The development of computer support for safety and reliability analysis is reviewed, and the possibilities of knowledge based expert systems in safety management are discussed. Two areas are described in more detail: knowledge based safety analysis in process design, and knowledge based diagnosis and management of process disturbances in operation. Knowledge based safety management is illustrated by two examples. The first one is the HAZOPEX expert system, which aids hazard and operability studies of process systems. The second example is the KRM information system, which aids process operators and engineers in the early identification and management of potentially hazardous disturbances. International trends in using knowledge engineering in safety analysis and management are also discussed.
This paper presents a system approach for the action plan management process. Based on the systematic use of risk evaluation, it helps define action priorities. Several organisations generate actions belonging to separate, sometimes contradictory, business goals. Moreover, maintenance, quality, process, and research and development organisations use different approaches to define action priorities. In this case study, more than 400 engineers participate in this process, generating between 20 and 50 action sets monthly (each set can contain around 3 actions). This paper demonstrates that it is possible to unify action plan management. This has been achieved by developing a management system model integrating several quality methods. The proposed tool enables local managers to have a clear picture of ongoing and planned actions in their areas. For plant managers, it smooths the coordination among organizations.
We prove that groups acting geometrically on delta-quasiconvex spaces contain no essential Baumslag-Solitar quotients as subgroups. This implies that they are translation discrete, meaning that the translation numbers of their nontorsion elements are bounded away from zero.
eng_Latn
45,218
Re-Engineering Design of a Purchasing System by BPS and PCA
Simulation for business process re-engineering: case study of a database management system
No Relationships Between the Within-Subjects’ Variability of Pain Intensity Reports and Variability of Other Bodily Sensations Reports
eng_Latn
45,226
Service enterprise workflow simulation is an important approach to analyze service enterprise business processes dynamically. Traditionally, service workflow simulation is based on the discrete event queuing theory, which lacks flexibility and scalability. To address this problem, this paper proposes a service workflow simulation framework based on multi-agent cooperation. Social rationality of software agents is introduced into the proposed framework. Adopting rationality as a decision making strategy facilitates the implementation of flexible scheduling for activity instances. A simulation system prototype is developed and a business case study is conducted to validate the proposed framework.
The analysis of modern business processes implemented as orchestration of software services demands for new approaches that explicitly take into account the inherent complexity and distribution characteristics of such processes. In this respect, Distributed Simulation (DS) offers a viable tool to cope with such a demand, due to the aggregation, scalability, representativeness and load balancing properties that it allows to achieve. However, the use of DS is mostly limited by the specialized technical know-how and the extra-development that DS requires with respect to approaches based on conventional local simulation. This paper proposes a model-driven method that enables the DS-based analysis of business processes by introducing the automated transformation of business process models into analysis models that are specified as Extended Queueing Network (EQN) models and executed as distributed simulations. The paper also presents an example application to a business process for an e-commerce scenario.
Berzelius failed to make use of Faraday's electrochemical laws in his laborious determination of equivalent weights.
eng_Latn
45,231
Dispersed data sources, incompatible data formats and a lack of non-ambiguous and machine readable meta-data are major obstacles in data analytics and data mining projects in the process industries. Usually, meta-information is only available in unstructured formats optimized for human consumption. This contribution captures common problems when handling data from process plants in analytics, develops the vision of a data collection process supported by a search tool, and describes a search tool for heterogeneous plant data as a proof of concept.
This contribution presents a novel approach for the automatic generation of simulation models for existing plants. The aim is to focus on brown-field projects for which the automation system has to be renewed. Whereas most research approaches for simulation-based control code testing focus on green-field projects, the authors developed a method that considers the type of planning and engineering data that is available in brown-field projects. In addition, a prototype has been implemented. The paper presents the validation results of a first field study on a North Sea oil production plant - one of the largest oil discoveries in the UK North Sea.
We report enhancement of the mechanical stability of graphene through a one-step method to disperse gold nanoparticles on the pristine graphene without any added agent.
eng_Latn
45,240
Format selection in machine-scored classroom achievement tests.
Performance on classroom achievement tests should reflect student accomplishment on a targeted set of instructional objectives. One prerequisite for meaningful edumetric assessment is the structuring of each component item in an appropriate format. In a comparative study measuring the effect of item format on student performance, incorrect choices were selected 2.4-fold more frequently on multiple choice items than on their content-identical, multiple true-false derivatives. On similar trials, scores on complex multiple-choice items ranged upward of 20 percent higher than on their content-identical, multiple true-false counterparts. The strengths and weaknesses of various test item formats are discussed and specific recommendations for format selection are made.
Flexibility of Cyber Physical Production Systems (CPPS) has already been widely discussed, but apart from some definitions, only few measures exist. These, however, are required to compare different CPPS regarding their flexibility and the effort necessary to increase flexibility during evolution. This paper presents the application and evaluation of previously proposed flexibility and adaptivity metrics for automated production systems. The interdependencies among product, process and resource are shown. An AML description is provided for a lab-size application example as a basis to demonstrate changes that incorporate flexibility. Such changes are to produce heavier work pieces (WPs), realize different processes and use different resources. Using the example of the model in Automation Markup Language (AML), we highlight the interdependencies between product, process and resource, thus showing the strengths and weaknesses of the proposed metrics.
eng_Latn
45,243
A Proposal of a Simulation-Based Approach for Service Level Agreement in Cloud
Clouds offer resources as services, which are charged on the basis of their actual usage. An approach that is currently spreading, to address possible problems linked to a low quality of the service offered, is the adoption of Service Level Agreements (SLAs). In this paper we propose the adoption of a simulation-based approach for supporting a framework in charge of SLA management in cloud applications. The idea stems from the consideration that, to cope with the inevitable impact of cloud elasticity on QoS, it is necessary to evaluate in a predictive way the performance properties of the many different configurations among which the automatic cloud management can choose to allocate resources to customers. This paper proposes the architecture of the framework for SLA management in clouds, identifying the requirements implied by the simulation to be performed and suggesting the adoption of a simulation engine that fits the proposed requirements.
In the overall ownership cost of an enterprise system, maintenance accounts for a major percentage. During the lifetime of an enterprise system, process customization is the most frequent maintenance effort. However, the current processing method has limited scalability and efficiency. In this case study, we explain how a scalable and efficient customization processing method was implemented. This method used the carbon emission trading mechanism to facilitate the cost-benefit analysis of customization requests. It also used the distributed processing principle to improve the overall processing efficiency. Feedback from a pilot implementation at a large manufacturer is included.
eng_Latn
45,248
Process Platform Formation for Product Families
In accordance with the product families, process platforms have been recognized as a promising tool for companies to configure optimal, yet similar, production processes for producing different products. This paper tackles process platform formation from large volumes of production data available in companies’ production systems. A data mining methodology based on text mining and tree matching is developed for the formation of process platforms. A case study of high variety production of vibration motors for mobile phones is reported to prove the feasibility and potential of forming process platforms using text mining and tree matching.
During the 11th five-year plan period, Wuhan should adopt five policies and measures based on the development situation of the high-tech industry: constructing a sound macroscopic decision system, molding the whole industry chain, enhancing the intellectual property rights competition strategy, setting up key and general platforms, and pushing forward system innovation. In this way the overall competitiveness of the high-tech industry can be promoted and its healthy development guaranteed.
eng_Latn
45,249
Atmospheric composition changes: Causes and processes involved - Chapter 1 in Belgian global change research 1990-2002: Assessment and Integration Report
Results of global change research activities realised between 1990 and 2002, supported by the Belgian Federal Science Policy Office (BELSPO), were subject to an assessment and integration in 2003 and 2004. This process focussed on a selection of scientific information, based on data provided by Belgian researchers, and resulted in two reports: "Belgian global change research 1990–2002: Assessment and integration report" and "Belgian global change research 1990–2002: Synthesis of the assessment and integration report". This report is the extended assessment and integration report. The synthesis of this report is available in Dutch, English and French. Both reports emphasise the following four topics: atmospheric composition changes; climate change; the role of the ocean in global change; and global change impacts on ecosystems.
In the overall ownership cost of an enterprise system, maintenance accounts for a major percentage. During the lifetime of an enterprise system, process customization is the most frequent maintenance effort. However, the current processing method has limited scalability and efficiency. In this case study, we explain how a scalable and efficient customization processing method was implemented. This method used the carbon emission trading mechanism to facilitate the cost-benefit analysis of customization requests. It also used distributed processing principles to improve overall processing efficiency. Feedback from a pilot implementation in a large manufacturer is included.
eng_Latn
45,250
Complex preprocessing for pattern recognition
The construction of pattern recognition machines may eventually depend upon the development of highly complex preprocessors. This claim is supported by a discussion of the importance of perceptual grouping. Since complex preprocessing will assess more of the basic structure of a visual scene, internal representations will have to be more descriptive in nature. Two approaches to descriptive internal representation are mentioned. Two of the author's programs are reviewed. One plays the Oriental game of GO at a human level and the other can recognize digitized hand-printed characters. Both programs use a geometry-preserving representation of features, so that calculations involving the features can assess the original geometry of the input. In addition, the GO program calculates groups of stones and performs other types of "complex" processing. Practical and philosophical arguments are given for the use of internal representation by pattern recognition programs.
In the overall ownership cost of an enterprise system, maintenance accounts for a major percentage. During the lifetime of an enterprise system, process customization is the most frequent maintenance effort. However, the current processing method has limited scalability and efficiency. In this case study, we explain how a scalable and efficient customization processing method was implemented. This method used the carbon emission trading mechanism to facilitate the cost-benefit analysis of customization requests. It also used distributed processing principles to improve overall processing efficiency. Feedback from a pilot implementation in a large manufacturer is included.
eng_Latn
45,253
ECO-FOOTPRINT: An Innovation in Enterprise System Customization Processing
In the overall ownership cost of an enterprise system, maintenance accounts for a major percentage. During the lifetime of an enterprise system, process customization is the most frequent maintenance effort. However, the current processing method has limited scalability and efficiency. In this case study, we explain how a scalable and efficient customization processing method was implemented. This method used the carbon emission trading mechanism to facilitate the cost-benefit analysis of customization requests. It also used distributed processing principles to improve overall processing efficiency. Feedback from a pilot implementation in a large manufacturer is included.
Project report based on the development and deployment of an equipment automation framework at a 200mm semiconductor fab. The emphasis of this paper will be on the integrated Interface-A Port and the capabilities of the EES System - including screenshots of the implemented system.
kor_Hang
45,255
Operations performance for HANDI 2000 business management system
Performance management consists of performance monitoring and tuning as well as longer-term capacity planning. The objective of the Operations Performance Management Plan is to assure that performance and response time commitments are satisfied. This is achieved through daily monitoring and capacity planning using workload forecasting and historical trends. The OPMP becomes effective as of this document's acceptance. It will provide guidance through implementation efforts.
To adapt to social and economic development and the need for basic medical services, this article expounds the serious defects of the hospital administration mode and ways to strengthen support for hospital development. We explored the "541" hospital management innovation mode through theoretical study and reform practice. Its main content and operating effects are illustrated.
eng_Latn
45,295
The Life Cycle Challenge of ERP System Integration
Abstract To serve its purpose as a backbone for business integration, Enterprise Resource Planning (ERP) systems need to be integrated with other information systems inside and outside the boundaries of an enterprise. An inductive case study was made to examine a long-term ERP system of a large manufacturing enterprise to better understand the nature and importance of ERP system integration. Our results can be summarized as four findings about the current life cycle models: 1) integration should be a major consideration when choosing ERPs, 2) deployments are continuous, 3) external integration is not just an extension phase after the project and 4) integration remains as a continuous challenge which is never fully achieved due to the constantly changing business requirements and organizational landscape. The results can help managers when making decisions on integration issues, yet effective approaches for integration governance are needed in order to avoid the increased costs and complexity.
First volume in a three-part series. Book written by Sandra Lach Arlinghaus. Material underwent extensive classroom testing (pre and post publication in handbook form) in course created and taught by W. D. Drake and S. L. Arlinghaus: Population-Environment Dynamics--Transition Theory, NRE 545, School of Natural Resources and Environment, The University of Michigan (1991-1997). Links to published documents containing student work from this course appear elsewhere in Deep Blue.
eng_Latn
45,304
TRAINING AND TURNOVER (Version 1)
The purpose of the model presented by Glance et al is to study the ‘contribute vs. free-ride’ dilemma present in organizations. This social dilemma is present on two intertwined levels, the managerial front and the base level of workers. This model also seeks to define a correlation (if one does exist) between worker turnover and productivity rates.
Based on the actual situation of China's enterprises, and by analyzing the workflow, logistics and cash flow in equipment management information systems with related ERP principles, this paper presents a new model for enterprise informationization, taking the design and development of an ERP-based equipment management information system as an example.
yue_Hant
45,306
[Development of a growth model for fetal head and body measurements].
Head parameters BPD, FOD and HC and abdomen parameters ATD, APD and AC have been ascertained in a prospective cross-sectional study in 515 healthy single fetuses between 13 and 40 weeks of gestation by ultrasound. In all cases ultrasound velocity was 1540 m/sec. From these data, a growth model was achieved for each parameter and corresponding growth curves with 5%, 10%, 50%, 90% and 95% percentiles were established.
This case study of a precision device manufacturer discusses knowledge integration in a product development organization after M&A. The goal is to contribute to establishment of a methodology that helps to accomplish the purpose of M&A. The special feature is to analyze establishment of a new product development organization, and its entry into a new market from the standpoint of knowledge management.
eng_Latn
45,309
Implementation of Failure Enterprise Systems in Organizational Perspective Framework
Failure percentages of Enterprise Resource Planning (ERP) implementation projects stay high, even after years of endeavours to diminish them. In this paper, the author proposes empirical research that aims to reduce the failure percentage of ERP projects. Nonetheless, most endeavours to improve project success have concentrated on variations within the conventional project management pattern. The author contends that a root cause of high ERP implementation failure percentages is the conventional pattern itself. The adoption of another pattern, Value-Driven Change Leadership (VDCL), is proposed for reducing the ERP implementation failure percentage. This paper proposes an empirical examination to clarify the role of the new pattern (VDCL) in diminishing the ERP implementation failure percentage, and portrays the exploratory procedure for an empirical study of the use of VDCL in decreasing the ERP implementation failure percentage.
In this paper, a Java-based software integration model is presented. The model has a three-layer structure and adopts a web-based distributed system architecture. It can integrate software systems on heterogeneous platforms in a distributed environment.
eng_Latn
45,312
Computer Integrated Manufacturing. Volume 1: Revolution in Progress
This summary volume is the first in a four-volume set reporting a four-year IIASA study on computer integrated manufacturing (CIM). The IIASA study has attempted to define the existing world situation with regard to the underlying technologies of CIM, and the degree to which such technologies as NC/CNC machine tools, robotics, and CAD/CAM are currently being used in metal products manufacturing. The methodology adopted in the study is eclectic. It is multiperspective and multidisciplinary as well as multinational. It incorporates both "bottom-up" and "top-down" approaches. Historical analysis and model forecasts of the future, together with scenario analyses, also feature. The study provides perhaps the first truly international comparison of manufacturing technology and manufacturing management.
Based on the actual situation of China's enterprises, and by analyzing the workflow, logistics and cash flow in equipment management information systems with related ERP principles, this paper presents a new model for enterprise informationization, taking the design and development of an ERP-based equipment management information system as an example.
eng_Latn
45,313
The interface between mental health, criminal justice and forensic mental health services for children and adolescents
The psychosocial and biological factors placing young people at risk of offending and of developing mental health problems are well established in the international literature. A recent series of epidemiological, demand and needs surveys reveals a lack of appropriate services to meet the needs of young people.
The goal of this research is to analyze the evaluation criteria used by integrated circuit (IC) designers when selecting foundry service providers. With an MCDM model considering the aspects of technology, production, customer service and support, and manufacturing location, we interviewed managers and experts of Taiwan's IC design firms, using the AHP survey with 16 attributes, to determine the areas of top concern with respect to foundry evaluation criteria. In this study process technology has been found to be the most significant evaluation criterion in view of competitiveness in the customer market.
eng_Latn
45,327
Application of grey model for machine degradation prognostics
Predicting machine degradation before final failure occurs is very important. This paper presents a method to predict the future state of machine degradation based on a grey model and a one-step-ahead forecasting technique. Specifically, the feasibility of the grey model as a predictor for a machine degradation prognostics system has been investigated. The grey model GM(1,1) was employed to forecast the future state of machine degradation, but the result was not satisfactory. Finally, a modification of GM(1,1) was made to improve the accuracy of prediction. Although the model was built using only four input data points, it is able to closely track sudden changes in the machine degradation condition. Real trending data of a low methane compressor acquired from a condition monitoring routine are employed for evaluating the proposed method.
By Tony Martini, Economics. Advisor: Michael Jones. Presentation ID: AM_D39. Abstract: This study suggests new methods of measuring the effectiveness of universities in protecting students from the impact of automation on the labor market. Using US government data on occupational activities and traditional measures of university outcomes, we produce a ranking of college majors by their susceptibility to automation and determine correlations between predicted traditional and automation-based outcomes. These are combined into a complete metric for the "automation resistance" of a university. Using an approach centered on distinctly human skills, occupational activities can be ranked by their automation resistance. Institutions which develop student skills with a focus on automation resistance should likewise demonstrate greater long-term viability. We demonstrate this methodology through a case study using data from the University of Cincinnati.
eng_Latn
45,342
Empirical analysis of the evolution of a taxonomy for best practices
Taxonomies play an increasingly important role in knowledge management, providing the basis on which to find and communicate knowledge, information and metrics. However, knowledge continues to evolve over time. As a result, taxonomies also need to continue to evolve. Two different evolved versions of a taxonomy for best practices, each based on the same original taxonomy, were analyzed. This research investigated empirical approaches to trace the changes in the original taxonomy. In so doing, an approach using empirical findings to monitor and anticipate taxonomy change is initiated. There were a number of findings, including a tendency to evolve toward greater complexity.
In the overall ownership cost of an enterprise system, maintenance accounts for a major percentage. During the lifetime of an enterprise system, process customization is the most frequent maintenance effort. However, the current processing method has limited scalability and efficiency. In this case study, we explain how a scalable and efficient customization processing method was implemented. This method used the carbon emission trading mechanism to facilitate the cost-benefit analysis of customization requests. It also used distributed processing principles to improve overall processing efficiency. Feedback from a pilot implementation in a large manufacturer is included.
eng_Latn
45,347
A MCDM Approach for Prioritizing Production Lines: A Case Study
Several methods have been proposed for solving multi-criteria decision making problems (MCDM). A major criticism of MCDM is that different techniques may yield different results when applied to the same problem. A decision maker looks for a solution that is closest to the ideal, in which alternatives are evaluated according to all established criteria. The multiple criteria decision making (MCDM) methods including TOPSIS, ELECTRE and VIKOR are based on an aggregating function representing ‘‘closeness to the ideal’’, which are originated in the compromise programming method. This study provides a comparison analysis of the above-three methods: eight parallel production lines from a factory will be analyzed using these three methods and also aggregate methods will be exploited in order to compare these methods.
Based on a brief account of PDM, this paper expounds the concept, implementation and purpose of the workflow management system with emphasis, in line with the management objectives of the corporation, and then further discusses the three functions of the system. Workflow management is the most important function in the PDM process. The development and application of the system provide a strong foundation for implementing PDM in the company.
kor_Hang
45,350
An integrated path analysis approach for the study of determinants of family planning acceptance in Orissa.
The primary purpose of the study is to find out the socio-economic demographic and health variables influencing the family planning acceptance in the thirty newly formed districts of Orissa. The study seeks to answer the following questions. What is the dominant set of variables influencing the family planning performance during a given period of time in the districts of Orissa? What are the direct and indirect effects of these variables on the Couple Protection Rate (CPR) in the districts of Orissa? What are the joint effects of these variables on the CPR in the districts of Orissa? (excerpt)
In this paper, we have presented an effective yield improvement methodology that can help both manufacturing foundries, fabless and fab-lite companies to identify systematic failures. It uses the physical addresses of failing bits from wafer sort results to overlay to inline wafer defect inspection locations. The inline defect patterns or the design patterns where overlay results showed matches were extracted and grouped by feature similarity or cell names. The potentially problematic design patterns can be obtained and used for design debug and process improvement.
eng_Latn
45,351
Synthetic method of data resource for concurrent relation patterns
Using statistical knowledge of the exponential, Poisson and normal distributions, a method for synthesizing data resources for concurrent relation patterns is presented, aimed at the characteristics of mining algorithms and the needs of concurrent relation patterns.
In the overall ownership cost of an enterprise system, maintenance accounts for a major percentage. During the lifetime of an enterprise system, process customization is the most frequent maintenance effort. However, the current processing method has limited scalability and efficiency. In this case study, we explain how a scalable and efficient customization processing method was implemented. This method used the carbon emission trading mechanism to facilitate the cost-benefit analysis of customization requests. It also used distributed processing principles to improve overall processing efficiency. Feedback from a pilot implementation in a large manufacturer is included.
eng_Latn
45,352
Using Machine Learning to Identify Suicide Risk: A Classification Tree Approach to Prospectively Identify Adolescent Suicide Attempters
This study applies classification tree analysis to prospectively identify suicide attempters among a large adolescent community sample, to demonstrate the strengths and limitations of this approach...
Flexibility of Cyber Physical Production Systems (CPPS) has already been widely discussed, but apart from some definitions, only a few measures exist. These, however, are required to compare different CPPS regarding their flexibility and the effort necessary to increase flexibility during evolution. This paper presents the application and evaluation of previously proposed flexibility and adaptivity metrics for automated production systems. The interdependencies among product, process and resource are shown. An AML description is provided for a lab-size application example as a basis to demonstrate changes that incorporate flexibility. Such changes are to produce heavier work pieces (WPs), realize different processes and use different resources. Using the example of the model in Automation Markup Language (AML), we highlight the interdependencies between product, process and resource, thus showing the strengths and weaknesses of the proposed metric.
eng_Latn
45,355
Evaluation of postponement in manufacturing systems with non-negligible changeover times
This article aims to examine the cost or benefit implications of employing postponement in the manufacturing environments characterised by non-negligible changeover times incurred when switching production from one product to another. Four manufacturing configurations are distinguished based on the choice of manufacturing technology and on whether or not postponement is employed. Analytical evaluation methods based on queuing models are used to assess operational measures for each configuration and solution algorithms are developed to determine the optimal decisions that may include stocking level, batch size and differentiation point. A numerical experiment is carried out to identify how the system performance is affected by different parameters.
Purpose – The purpose of this paper is to describe maintenance in a generic process model, in order to support an alignment of maintenance with other company internal processes aimed at fulfilling external stakeholder requirements.Design/methodology/approach – The proposed maintenance process model is based on existing theories and is illustrated by examples from a paper‐mill case study related to the maintenance of DC‐motors.Findings – The proposed model supports a holistic view of maintenance and the alignment of the maintenance process with other company internal processes, in order to fulfil external stakeholder requirements.Research limitations/implications – Further research could include an application of the proposed maintenance model to test its usefulness to identify stakeholders and also hazard diagnosis.Practical implications – The importance of vertical and horizontal alignment between the maintenance process and other processes in order to achieve effectiveness and efficiency is illustrated....
eng_Latn
45,357
Microgenetic Analysis and Creativity: Analyzing Psychological Change Processes
Microgenetic analysis designates a type of idiographic and qualitative research that allows investigating developmental processes by conducting micro-analyses of social interactions among subjects in structured and natural activities and in specific settings circumscribed by chronological – thus, irreversible – time. The cultural psychology of creativity considers that creativity as a phenomenon originates from human interactions semiotically mediated by their socio-cultural contexts, which evolve in the course of ontogenesis by means of dialectical and dialogical processes. The merger of microgenesis and research in creativity has significantly contributed to research in this field, by enabling in-depth analyses of the genesis and development of creativity, including developmental changes and transformations experienced by interacting subjects in the contexts under study.
In the overall ownership cost of an enterprise system, maintenance accounts for a major percentage. During the lifetime of an enterprise system, process customization is the most frequent maintenance effort. However, the current processing method has limited scalability and efficiency. In this case study, we explain how a scalable and efficient customization processing method was implemented. This method used the carbon emission trading mechanism to facilitate the cost-benefit analysis of customization requests. It also used distributed processing principles to improve overall processing efficiency. Feedback from a pilot implementation in a large manufacturer is included.
kor_Hang
45,358
Generation of evidence should be tailored to individuals
The process of generating “evidence” has an element of chance built into it [1]. The problem of reproducibility of results is well known across the scientific disciplines, including medicine [2]. At the heart of the problem lie the “chaotic” ways in which nature operates [3]. The deeper you look into …
In the overall ownership cost of an enterprise system, maintenance accounts for a major percentage. During the lifetime of an enterprise system, process customization is the most frequent maintenance effort. However, the current processing method has limited scalability and efficiency. In this case study, we explain how a scalable and efficient customization processing method was implemented. This method used the carbon emission trading mechanism to facilitate the cost-benefit analysis of customization requests. It also used distributed processing principles to improve overall processing efficiency. Feedback from a pilot implementation in a large manufacturer is included.
eng_Latn
45,359
Augmenting behavior-modeling training: Testing the effects of pre- and post-training interventions
A number of recent training authors have suggested that pre- and post-training interventions may enhance training outcomes. In the present study, pre- and post-training interventions were added to an established behavior-modeling program on assertiveness, creating four conditions: (1) no intervention, (2) pretraining interventions, (3) post-training interventions, and (4) both. One hundred fifty trainees completed the module, and measures of trainee reaction, learning retention, and behavioral change were obtained. Results indicated that the post-training intervention strongly affected learning retention, as well as reactions immediately following training, with moderate effects on behavior. No significant effects were observed between the pretraining intervention and any of the trainee outcome measures. Implications of the findings for training research and practice are discussed.
Flexibility of Cyber Physical Production Systems (CPPS) has already been widely discussed, but apart from some definitions, only a few measures exist. These, however, are required to compare different CPPS regarding their flexibility and the effort necessary to increase flexibility during evolution. This paper presents the application and evaluation of previously proposed flexibility and adaptivity metrics for automated production systems. The interdependencies among product, process and resource are shown. An AML description is provided for a lab-size application example as a basis to demonstrate changes that incorporate flexibility. Such changes are to produce heavier work pieces (WPs), realize different processes and use different resources. Using the example of the model in Automation Markup Language (AML), we highlight the interdependencies between product, process and resource, thus showing the strengths and weaknesses of the proposed metric.
eng_Latn
45,360
Finite capacity flow control in a multi-stage/multi-product environment
Abstract Hierarchical Production Management uses different decision levels and handles a different model at each of these levels. The model of the top level leads usually to a continuous flow control problem under finite capacities. Here we study the problem of minimizing in-process inventory cost under finite capacity constraints and for a continuous time varying demand. The multi-product problem is examined. We finally propose a numerical application to illustrate the analytical results.
Developing concepts to support cognitive work in a future complex sociotechnical system requires an approach, and supporting analytical tools, with the power to provide design insights for first-of-a-kind systems. This work explored using a work-centred, constraint-based design framework to identify design requirements and propose design concepts for a future, optimized Joint Fires Coordination (JFC) capability for the Canadian Forces. The approach builds on adapting techniques from Cognitive Systems Engineering to model the work constraints of the JFC system and on using the information to systematically identify several hundred design concepts related to its process, technology and organizational structure.
eng_Latn
45,363
Examining the Role of Customer Self-Efficacy in Service Encounters
Abstract This study developed a conceptual model depicting the relationships between perceived encounter quality, patient participation, self-efficacy, satisfaction and loyalty intentions. The hypot...
In the overall ownership cost of an enterprise system, maintenance accounts for a major percentage. During the lifetime of an enterprise system, process customization is the most frequent maintenance effort. However, the current processing method has limited scalability and efficiency. In this case study, we explain how a scalable and efficient customization processing method was implemented. This method used the carbon emission trading mechanism to facilitate the cost-benefit analysis of customization requests. It also used distributed processing principles to improve overall processing efficiency. Feedback from a pilot implementation in a large manufacturer is included.
eng_Latn
45,368
Linking knowledge management practices to organizational performance using the balanced scorecard approach
The purpose of this study is to develop a decomposed model to inspect the effect of knowledge management practices (knowledge sharing culture [KSC], knowledge-based human resource management [KHRM], strategy and leadership [S&L], and ICT practices) on the organizational performance measures proposed by the balanced scorecard (BSC). S&L positively and significantly affects only L&G and IP but does not have any significant effect on the other two, i.e. CS and FP, while ICT practices did not affect any of the measures significantly. The data are limited to 277 middle and senior level managers of Indian firms, which may be a limiting factor for generalizability. The proposed model uncovers the dynamics of individual relationships between KM practices and measures of performance (proposed by BSC) in comparison to existing models which have mainly focused on the overall effect.
Flexibility of Cyber Physical Production Systems (CPPS) has already been widely discussed, but apart from some definitions, only a few measures exist. These, however, are required to compare different CPPS regarding their flexibility and the effort necessary to increase flexibility during evolution. This paper presents the application and evaluation of previously proposed flexibility and adaptivity metrics for automated production systems. The interdependencies among product, process and resource are shown. An AML description is provided for a lab-size application example as a basis to demonstrate changes that incorporate flexibility. Such changes are to produce heavier work pieces (WPs), realize different processes and use different resources. Using the example of the model in Automation Markup Language (AML), we highlight the interdependencies between product, process and resource, thus showing the strengths and weaknesses of the proposed metric.
eng_Latn
45,372
Understanding and Designing for Cultural Differences on Crowdsourcing Marketplaces
Crowdsourcing marketplaces, such as Mechanical Turk, enable tasks to be distributed to workers from all over the world for completion. While this global quality of crowdsourcing provides numerous benefits to requesters of work (e.g., more diverse labor force that is available at all hours of the day), it could, however, pose challenges in terms of labor management. Much prior work has demonstrated the effects of cultural backgrounds on individual’s thoughts, value, and even behavior. Therefore, the diverse cultural backgrounds on crowdsourcing marketplaces may interact with incentive types, amounts, and task types to impact workers’ task performance, engagement, and selection. My research program explores the effects of cultural differences on these online marketplaces in order to build more efficient global crowdsourcing marketplaces.
In the overall ownership cost of an enterprise system, maintenance accounts for a major percentage. During the lifetime of an enterprise system, process customization is the most frequent maintenance effort. However, the current processing method has limited scalability and efficiency. In this case study, we explain how a scalable and efficient customization processing method was implemented. This method used the carbon emission trading mechanism to facilitate the cost-benefit analysis of customization requests. It also used distributed processing principles to improve overall processing efficiency. Feedback from a pilot implementation in a large manufacturer is included.
eng_Latn
45,373
A Bibliometric Multicriteria Model on Smart Manufacturing from 2011 to 2018
Abstract This study reviews the academic literature on Smart Manufacturing from the industry perspective. The methodological approach combines bibliometric analysis and multi-criteria decision-making models. Firstly, the paper aims to identify the most relevant topics on Smart Manufacturing. Secondly, it aims to weigh them and to define their interdependences. The initial sample consists of 1,498 articles published between 2011 and 2018. The semi-qualitative analysis returns an objective numerical result useful to define the predominant areas of smart manufacturing applications.
Four simultaneous economic revolutions and the global turbulent fields. The externalities of economic decisions. Corporate social responsibility. The duty of business to prevent the degradation of the work force. Each society creates its own model of strategic management: their common factor is power, the differences are in values, in the source, the content and in the style of power behaviors. Society's culture, the values of individual managers and the resulting style of management.
eng_Latn
45,376
Dynamic assessment and adaptive optimization of the psychotherapeutic process.
Presents a general feedback model for optimal control of psychotherapeutic processes. The model can be applied online by means of a recursive estimation scheme. Continuously updated estimates are obtained of the between-sessions variation in the degree of control that is exercised by therapeutic manipulations on client behavior. A case study of a 26-yr-old male client shows that the pattern of control by therapeutic manipulations may vary substantially across sessions.
In the overall ownership cost of an enterprise system, maintenance accounts for a major percentage. During the lifetime of an enterprise system, process customization is the most frequent maintenance effort. However, the current processing method has limited scalability and efficiency. In this case study, we explain how a scalable and efficient customization processing method was implemented. This method used the carbon emission trading mechanism to facilitate the cost-benefit analysis of customization requests. It also used distributed processing principles to improve overall processing efficiency. Feedback from a pilot implementation in a large manufacturer is included.
eng_Latn
45,377
Business process change: a study of methodologies, techniques, and tools.
Experiences in strategic information systems planning
Modeling Sparse Deviations for Compressed Sensing using Generative Models
eng_Latn
45,386
Influence of a Multipurpose Retention Reservoir on Extreme River Flows, a Case Study of the Nielisz Reservoir on the Wieprz River (Eastern Poland)
The objective of this paper is the assessment of the effect of a multifunctional mid-size retention reservoir on the occurrence of floods and low flows. The study object was moderate size reservoir located in Nielisz in eastern Poland on Wieprz river. The analysis conducted for the hydrological period 1976–2014 showed a positive effect of the reservoir on compensating low flows. In gauging sections located below the reservoir, streamflow droughts almost completely disappeared. The anti-flood function is particularly evident in the case of small floods, and involves a delay in the occurrence of cumulative flow, without reduction in flood volumes. Mean flow fluctuations decreased, while environmental flows were continuously ensured.
Workflow management techniques were developed in the early 1990s, and their critical part was conceptual modeling. As a kind of graphical and mathematical modeling tool, Petri nets are applicable to the modeling requirements of workflow. A method of modeling workflow based on Generalized Stochastic Petri Nets (GSPN) is proposed, and its validity and reliability are checked using the reachability graph method. By utilizing the equivalence relation between GSPN and Markov chains, a method combining GSPN and Markov chains is used to analyze the performance of workflow. The effectiveness of this method is verified by an application case, which will help to implement a workflow management system.
eng_Latn
45,397
Increasing Flexibility in Process Deployment with the Process Beans Composer
Organizations are increasingly interested in improving their business processes. One strategy for achieving this improvement has been to model their business processes and to automate them using workflow technology. One of the difficulties in this approach is that organizations must adapt their defined processes to each different situation and that workflow systems are yet not flexible enough to allow changes for each new process instance to be enacted. This work proposes a strategy for defining process components or building blocks already implemented in workflow systems which will assist process definition, adaptation and use in organizations. A tool called Beans Composer was implemented to achieve this objective.
In this article we study monotone stochastic orders for real-valued processes with independent increments and derive sufficient deterministic conditions which are easy to verify and suitable for applications. Our proof is based on an explicit construction of a coupling. We also derive comparison results for Lévy processes and semimartingales with conditionally independent increments.
eng_Latn
45,398
Dynamics of the spout of gas plumes discharging from a melt: Experimental investigation with a large-scale water model
In the present study, the spout region of a gas plume discharging from a melt has been investigated using a water model of 180 cm in height and 160 cm in diameter. The lateral movement of the spout, as measured optically, increases with the gas flow rate and has been found to be ± 20-cm wide or wider, and very fast. The spout height, as measured with video-optical and electrical methods, strongly fluctuates with time. Clear definitions have to be made of the quantities to be determined in the highly dynamic process. Long-time averages of the radial height profiles and momentary maximum height values are reported. It is confirmed that the nondimensional spout height, defined and measured in a certain manner, is independent of the Froude number and of the nondimensional nozzle diameter.
Workflow management techniques were developed in the early 1990s, and their critical part was conceptual modeling. As a kind of graphical and mathematical modeling tool, Petri nets are applicable to the modeling requirements of workflow. A method of modeling workflow based on Generalized Stochastic Petri Nets (GSPN) is proposed, and its validity and reliability are checked using the reachability graph method. By utilizing the equivalence relation between GSPN and Markov chains, a method combining GSPN and Markov chains is used to analyze the performance of workflow. The effectiveness of this method is verified by an application case, which will help to implement a workflow management system.
eng_Latn
45,399
Influence Factors of Understanding Business Process Models
The increasing utilization of business process models both in business analysis and information systems development raises several issues regarding quality measures. In this context, this paper discusses understandability as a particular quality aspect and its connection with personal, model, and content related factors. We use an online survey to explore the ability of the model reader to draw correct conclusions from a set of process models. For the first group of the participants we used models with abstract activity labels (e.g. A, B, C) while the second group received the same models with illustrative labels such as “check credit limit”. The results suggest that all three categories indeed have an impact on the understandability.
Based on a study of SIP, this paper presents an overlay method to implement the mapping between the SIP protocol state machine and the IN call state model (BCSM), and analyses the implementation concretely. The method resolves a key technology in the interworking of the Internet and the IN.
eng_Latn
45,402
An empirical study on the use of i* by non-technical stakeholders: the case of strategic dependency diagrams
Modelling strategic relationships for process reengineering.
An actor dependency model of organizational work: with application to business process reengineering
eng_Latn
45,466
Process Mining for the multi-faceted analysis of business processes - A case study in a financial services organization
Workflow mining: discovering process models from event logs
Mining Process Models from Workflow Logs
eng_Latn
45,470
Flexible Heuristics Miner (FHM)
Application of process mining in healthcare - a case study in a Dutch hospital.
Entity Coherence for Descriptive Text Structuring
eng_Latn
45,479
A New Look at the Relationship between User Involvement in Systems Development and System Success
A case study of user participation in the information systems development process
Grammar-driven development of JSON processing applications
eng_Latn
45,541
A study of enterprise resource planning (ERP) system performance measurement using the quantitative balanced scorecard approach
Investment in Enterprise Resource Planning: Business Impact and Productivity Measures
Degree centrality in scientific collaboration supernetwork
eng_Latn
45,545
Business process change: a study of methodologies, techniques, and tools.
Experiences in strategic information systems planning
Recommendations with Optimal Combination of Feature-Based and Item-Based Preferences
eng_Latn
45,557
what kind of process is research
Types of research. Quantitative research. Quantitative research is generally associated with the positivist/postpositivist paradigm. It usually involves collecting and converting data into numerical form so that statistical calculations can be made and conclusions drawn.
In this process, the study is documented in such a way that another individual can conduct the same study again. This is referred to as replicating the study. Any research done without documenting the study so that others can review the process and results is not an investigation using the scientific research process. The scientific research process is a multiple-step process where the steps are interlinked with the other steps in the process.
eng_Latn
45,635
can a mission oriented organization use a business model canvas
Like a mission statement, a business model statement acts as a touchstone: a reminder and a guide for the organization's focus and strategies. Mission statement for both Organizations A and B: Our mission is to develop and implement evaluation tools that help nonprofits identify, understand, and increase their impact. Organization A's business model statement: Foundations contract with us to conduct evaluations with their grantees.
Most people intuitively understand a business process to be a procedure or event with the purpose of reaching a goal. When looking at our Airport we can find many different business processes and goals: the goal of our passenger is to go on vacation. The business system that is to be modeled can span an entire organization. In this case, we talk about an organization model. It is also possible to consider and model only a selected part of an organization. In our case study, an IT system is to be integrated into the Passenger Services operation.
eng_Latn
45,639
Service quality and ERP implementation: A conceptual and empirical study of semiconductor-related industries in Taiwan
Identifying critical issues in enterprise resource planning (ERP) implementation
Multi-scale Volumes for Deep Object Detection and Localization
eng_Latn
45,647
Deep Gate Recurrent Neural Network
Gated Feedback Recurrent Neural Networks
A study of enterprise resource planning (ERP) system performance measurement using the quantitative balanced scorecard approach
eng_Latn
45,657
Surface-Based Li+ Complex Enables Uniform Lithium Deposition for Stable Lithium Metal Anodes
Uncontrollable Li dendrite growth and low Coulombic efficiency (CE) have severely hindered the practical application of Li metal anodes. Here, we propose to use poly(ethylene glycol) (PEG) as an electrolyte additive for suppressing dendritic growth and improving the cycling stability of the Li metal anode. The PEG molecule can be adsorbed on the surface of Li metal and complex with Li+ ions effectively, which ensures a homogeneous diffusion of Li+ near the electrode surface and fundamentally avoids dendrite formation. Thus, full cells with the LiFePO4|Li configuration can even maintain ∼100% capacity with CE > 99.5% after 250 cycles at 5.0 C.
Single crystals of tin iodide (SnI2) have been grown in silica gels. A detailed microtopographical study of {100} faces is described. Horizontal striations are predominant on these faces for most of the crystals, while a few of them show vertical striations. The horizontal striations are associated with the two-dimensional nucleation theory, whereas the vertical striations relate to the growth fronts. Growth layers modified by the presence of misaligned microcrystals are illustrated. The natural etch pits on the {100} faces of the crystal are attributed to the dissolution of crystals in the acid-set gel. In the light of these observations, the mechanism of the development and growth of these faces has been assessed and the implications are discussed.
eng_Latn
45,706
The present study mainly focuses on the effect of elevated temperature on concrete properties and microstructure, in order to avoid, as far as possible, the various undesired effects of severe environmental impacts that affect the service life of concrete. Concrete cast with basalt aggregate was subjected to elevated temperatures (300, 600 and 800 °C) at a rate of 5 °C/min and an exposure time of two hours. The results showed that the changes in the mechanical properties of the heated concrete, together with the internally induced chemical changes, are attributable to the elevated temperature. Laboratory investigations and a mineralogical study using reflected-light microscopy on polished slabs together with SEM investigation confirmed the effect of elevated temperatures on the cement paste and aggregates.
Concrete is exposed to elevated temperatures when subjected to accidental fires in buildings, or when it is close to furnaces and reactors, as encountered in some industrial applications. Research studies indicate that a serious reduction in concrete strength may result in such cases, and that the type of aggregate used is an important parameter. In this study, the influence of high temperatures (100–600°C) on the residual compressive and bond strengths of concrete made from limestone aggregates is experimentally investigated. The main test parameters involved were the maximum temperature, the time of exposure at the maximum temperature, the method of cooling, the age of concrete at the testing date, and the cement content. The results showed that exposure of limestone aggregate concrete to such temperatures may result in a noticeable reduction in its strength, especially in the range 400–600°C.
The resources of refractory gold ores are abundant, and their effective treatment can bring good economic benefits. This paper investigated the kinetics of leaching gold from refractory gold ores by ultrasonic-assisted electro-chlorination. The effects of ultrasound time ratio, initial hydrochloric acid concentration and leaching temperature on the kinetic parameters were discussed. It is found that the leaching ratio goes up with all the factors increasing. The reaction kinetics is controlled by diffusion. When ultrasound improves the diffusion by reducing the diffusion resistance, the activation energy increases to 37.1 kJ/mol.
eng_Latn
45,744