In the age of Information and Communication Technology (ICT), the Web and the Internet have significantly changed the way applications are developed, deployed and used. One recent trend is the design of modern web applications based on SOA. This process is based on composing existing web services into a single scenario from the point of view of a particular user or client, which allows IT companies to shorten time to market. On the other hand, it raises questions about the quality of the application, the trade-offs between quality factors and attributes, and ways of measuring them. Services are usually hosted and executed in an environment managed by their provider, which assures quality attributes such as availability or throughput. Therefore, in this paper an attempt is made to perform quality measurements towards the creation of efficient, dependable and user-oriented Web applications. First, the process of designing service-based applications is described. Next, metrics for subsequent measurements of the efficiency, dependability and usability of distributed applications are presented. These metrics assess the efforts and trade-offs in Web-based application development. As examples, we describe two multimedia applications which we have developed in our department and executed in a cluster-based environment: one runs in the BeesyCluster middleware, the other in the Kaskada platform. For these applications we present measurement results and draw conclusions about the relations between quality attributes in the presented application development model. This knowledge can be used to reason about such relations for new, similar applications and to support their rapid, high-quality development.
The article presents measurement results of prototype integrated circuits for the acquisition and processing of images in real time. To verify a new concept of circuit solutions for analogue image processors, experimental integrated circuits were fabricated. The integrated circuits, designed in a standard 0.35 μm CMOS technology, contain an image sensor and analogue processors that perform low-level convolution-based image processing algorithms. The prototype, with a resolution of 32 × 32 pixels, allows the acquisition and processing of images at high speed, up to 2000 frames/s. Operation of the prototypes was verified in practice using the developed software and a measurement system based on an FPGA platform.
To diagnose incipient faults in nonlinear analog circuits, this paper proposes a novel approach based on fractional correlation applied to the subband Volterra series. Firstly, the subband Volterra series is calculated from the input and output sequences of the circuit under test (CUT). Then the fractional correlation functions between the fault-free case and the incipient faulty cases of the CUT are derived. Using the feature vectors extracted from the fractional correlation functions, a hidden Markov model (HMM) is trained. Finally, the well-trained HMM is used to accomplish the incipient fault diagnosis. Simulations illustrate the proposed method and demonstrate its effectiveness in recognizing incipient faults.
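As an illustration of the pipeline described above, the sketch below trains one hidden Markov model per fault class on correlation-derived feature vectors and labels a test sequence by maximum likelihood. It is a minimal sketch only: ordinary cross-correlation stands in for the paper's fractional correlation, the hmmlearn package is an assumed implementation choice, and all names are illustrative.

```python
# Minimal sketch: correlation features + one HMM per fault class.
# Ordinary cross-correlation stands in for fractional correlation,
# and hmmlearn is an assumed implementation choice.
import numpy as np
from hmmlearn import hmm  # assumed dependency

def correlation_features(ref, test, n_lags=8):
    """Feature vector from the correlation between the fault-free
    response (ref) and the measured response (test)."""
    c = np.correlate(test - test.mean(), ref - ref.mean(), mode="full")
    c /= (np.std(ref) * np.std(test) * len(ref))
    mid = len(c) // 2
    return c[mid:mid + n_lags]            # first few lags as features

def train_models(feature_seqs_by_class, n_states=3):
    """Train one Gaussian HMM per fault class on 2-D feature sequences."""
    models = {}
    for label, seqs in feature_seqs_by_class.items():
        X = np.vstack(seqs)               # stack (T_i, n_lags) arrays
        lengths = [len(s) for s in seqs]
        m = hmm.GaussianHMM(n_components=n_states, n_iter=50)
        m.fit(X, lengths)
        models[label] = m
    return models

def diagnose(models, feature_seq):
    """Label = class whose HMM gives the highest log-likelihood."""
    return max(models, key=lambda k: models[k].score(feature_seq))
```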
This paper presents an innovative system designed for the continuous monitoring of the acoustic climate of urban areas. The assessment of environmental threats is performed using online data acquired through a grid of engineered monitoring stations that collect comprehensive information about the acoustic climate of urban areas. The grid of proposed devices provides valuable data for long- and short-term acoustic climate analysis. Dynamic estimation of noise source parameters and real measurement results of emission data are used to create dynamic noise maps accessible to the general public. This is done through noise source prediction employing a propagation model optimized for the requirements of a computer cluster implementation, which enables the system to generate noise maps in a reasonable time and to publish map updates regularly on the Internet. Moreover, the functionality of the system was extended with new techniques for assessing noise-induced harmful effects on the human hearing system. The principle of operation of the dosimeter is based on a modified psychoacoustic model of hearing and on the results of research, performed with the participation of volunteers, concerning the impact of noise on hearing. The primary function of the dosimeter is to estimate, in real time, the auditory effects caused by exposure to noise. The results of measurements and simulations performed by the system prototype are presented and analyzed. Several cases of long-term and short-term measurements of noise originating from various sources are considered in detail. The presented predictions of the hearing threshold shift induced by noise exposure can increase awareness of the harmfulness of excessive sound levels.
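As a point of reference for the dosimetry described above, the sketch below computes the textbook equivalent continuous sound level from calibrated pressure samples. This is not the system's modified psychoacoustic model; it only illustrates the basic quantity a noise dosimeter starts from, and all names are illustrative.

```python
import numpy as np

P0 = 20e-6  # reference sound pressure, 20 uPa

def leq(pressure_pa):
    """Equivalent continuous sound level (dB) of a block of calibrated
    sound-pressure samples.  A real dosimeter would add A-weighting
    and the psychoacoustic model described in the paper."""
    p = np.asarray(pressure_pa, dtype=float)
    return 10.0 * np.log10(np.mean(p ** 2) / P0 ** 2)

# Example: a 1 kHz tone at 1 Pa RMS gives roughly 94 dB.
fs = 48_000
t = np.arange(fs) / fs
tone = np.sqrt(2) * np.sin(2 * np.pi * 1000 * t)
print(f"L_eq = {leq(tone):.1f} dB")
```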
Disorders of the heart and blood vessels are the leading cause of health problems and death. Their early detection is extremely valuable, as it can prevent serious incidents (e.g. heart attack, stroke) and the associated complications. This requires extending typical mobile monitoring methods (e.g. Holter ECG, tele-ECG) with integrated, multiparametric solutions for continuous monitoring of the cardiovascular system.
In this paper we propose a wearable system that integrates measurements of cardiac data with on-line estimation of the current cardiovascular risk level. It consists of two wirelessly connected devices, one designed in the form of a necklace, the other in the form of a bracelet (wrist watch). These devices enable continuous measurement of electrocardiographic, plethysmographic (impedance-based and optical-based) and accelerometric signals. The collected signals and calculated parameters indicate the electrical and mechanical state of the heart and are processed to estimate a risk level. Depending on the risk level, an appropriate alert is triggered and transmitted to predefined users (e.g. emergency departments, the family doctor, etc.).
This paper describes the prototype version of a mobile application supporting independent movement of the blind. Its objective is to improve the quality of life of visually impaired people by providing them with navigational assistance in urban areas. The authors present the most important modules of the application. The module for precise positioning using DGPS data from the ASG-EUPOS network, as well as enhancements of positioning in urban areas based on fusion with other types of data sources, are presented. The devices, tools and software for the acquisition and maintenance of dedicated spatial data are also described. The module responsible for navigation, with a focus on the algorithms' quality and complexity, as well as the user interface dedicated to the blind, are discussed. The system's main advantages over existing solutions are emphasized, current results are described, and plans for future work are briefly outlined.
Biometric identification systems, i.e. systems that are able to recognize humans by analyzing their physiological or behavioral characteristics, have gained a lot of interest in recent years. They can be used to raise the security level in certain institutions or treated as a convenient replacement for PINs and passwords by regular users. Automatic face recognition is one of the most popular biometric technologies, widely used even by many low-end consumer devices such as netbooks. However, even the most accurate face identification algorithm would be useless if it could be cheated by presenting a photograph of a person instead of the real face. Therefore, proper liveness detection is extremely important. In this paper we present a method that differentiates between video sequences showing real persons and their photographs. First we calculate the optical flow of the face region using the Farnebäck algorithm. Then we convert the motion information into images and perform the initial data selection. Finally, we apply a Support Vector Machine to distinguish between real faces and photographs. The experimental results confirm that the proposed approach could be successfully applied in practice.
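A minimal sketch of the described pipeline, using OpenCV's Farnebäck optical flow and a scikit-learn SVM. Summarizing the flow field with simple magnitude statistics is a simplification of the paper's conversion of motion information into images, and all names are illustrative.

```python
import cv2
import numpy as np
from sklearn.svm import SVC

def flow_features(frames):
    """Summarize Farnebäck optical flow over a face-region sequence.
    The paper converts motion into images; aggregating magnitude
    statistics per frame pair is a simplification."""
    feats = []
    prev = cv2.cvtColor(frames[0], cv2.COLOR_BGR2GRAY)
    for f in frames[1:]:
        nxt = cv2.cvtColor(f, cv2.COLOR_BGR2GRAY)
        flow = cv2.calcOpticalFlowFarneback(prev, nxt, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        mag, _ = cv2.cartToPolar(flow[..., 0], flow[..., 1])
        feats.extend([mag.mean(), mag.std(), mag.max()])
        prev = nxt
    return np.array(feats)

# Usage: X holds feature vectors from many sequences,
# y labels them (1 = live face, 0 = photograph):
#   clf = SVC(kernel="rbf").fit(X, y)
#   clf.predict(flow_features(new_frames).reshape(1, -1))
```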
The paper presents the results of an experimental validation of a set of innovative software services supporting the processes of achieving, assessing and maintaining conformance with standards and regulations. The study involved several hospitals implementing the Accreditation Standard promoted by the Polish Ministry of Health. First we introduce the NOR-STA services, which implement the TRUST-IT methodology of argument management. Then we describe and justify a set of metrics aimed at assessing the effectiveness and efficiency of the services. Next we present the values of the metrics computed from the collected data. The paper concludes with an interpretation and discussion of the measurement results with respect to the objectives of the validation experiment.
The growing number of mobile devices and the increasing popularity of multimedia services result in a new challenge: providing mobility in access networks. The paper describes experimental research on media (audio and video) streaming in a mobile IEEE 802.11b/g/n environment realizing network-based mobility, an approach to mobility that requires little or no modification of the mobile terminal. Assessment of the relevant parameters has been conducted in an IPv6 testbed. During the tests, both Quality of Service (QoS) and Quality of Experience (QoE) parameters were considered. Against the background of standard L3 and L2 handovers, an emerging mobility solution named Proxy Mobile IPv6 (PMIPv6) has been examined. Its advantages (L3 connectivity maintenance) and disadvantages (packet loss during handover) are highlighted based on the obtained results. Moreover, a new solution for handover optimization has been proposed. The influence of handoffs on audio/video generation and transfer imperfections has been studied and identified as an interesting direction for future work.
The quality of the power supplied by electricity utilities is regulated and of concern to end users. Power quality disturbances include interruptions, sags, swells, transients and harmonic distortion. The instruments used to measure these disturbances have to satisfy minimum requirements set by international standards. In this paper, an analysis of multi-harmonic least-squares fitting algorithms applied to total harmonic distortion (THD) estimation is presented. The results from the different least-squares algorithms are compared with the results from the discrete Fourier transform (DFT) algorithm. The algorithms are assessed in the different testing states required by the standards.
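A minimal numpy sketch of one multi-harmonic least-squares fit and the resulting THD estimate, assuming the fundamental frequency is known in advance; the algorithms analysed in the paper may also estimate or refine the frequency, and all names are illustrative.

```python
import numpy as np

def thd_lsf(x, fs, f1, n_harm=10):
    """Fit a sum of harmonics by linear least squares, then compute
    THD = sqrt(sum of squared harmonic amplitudes) / fundamental."""
    t = np.arange(len(x)) / fs
    cols = [np.ones_like(t)]                 # DC term
    for k in range(1, n_harm + 1):
        cols += [np.cos(2 * np.pi * k * f1 * t),
                 np.sin(2 * np.pi * k * f1 * t)]
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, x, rcond=None)
    amps = np.hypot(coef[1::2], coef[2::2])  # per-harmonic amplitudes
    return np.sqrt(np.sum(amps[1:] ** 2)) / amps[0]

# Example: 50 Hz fundamental with 5% third harmonic -> THD ~ 0.05.
fs, f1 = 10_000, 50.0
t = np.arange(2048) / fs
x = np.sin(2 * np.pi * f1 * t) + 0.05 * np.sin(2 * np.pi * 3 * f1 * t)
print(f"THD = {thd_lsf(x, fs, f1):.4f}")
```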
In this paper, a discrete wavelet transform (DWT) based approach is proposed for power system frequency estimation. Unlike existing frequency estimators, which are mainly used for power system monitoring and control, the proposed approach is developed for fundamental frequency estimation in the field of energy metering of nonlinear loads. The characteristic of a nonlinear load is that the power signal is heavily distorted: it is composed of harmonics and inter-harmonics and corrupted by noise. The main idea is to predetermine a series of frequency points; the mean value of the two frequency points nearest to the power system frequency is accepted as the approximate solution. First the input signal is modulated with a series of modulating signals whose frequencies are those frequency points. Then the modulated signals are decomposed into individual frequency bands using the DWT, and the differences between the maximum and minimum wavelet coefficients in the lowest frequency band are calculated. The similarity between the power system frequency and each frequency point is judged from these differences. Simulation results demonstrate high immunity to noise, harmonic and inter-harmonic interference. The proposed method is applicable to real-time power system frequency estimation for electric energy measurement of nonlinear loads.
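A sketch of the described steps using PyWavelets. Averaging the two candidate frequencies with the smallest coefficient spread is our reading of the similarity criterion, not a verified detail of the method, and all names and numbers are illustrative.

```python
import numpy as np
import pywt  # PyWavelets, assumed dependency

def dwt_frequency(x, fs, candidates, wavelet="db4", level=6):
    """Follow the abstract's steps: modulate with each candidate
    frequency, DWT-decompose, and measure the max-min spread of the
    lowest-band coefficients.  A candidate near the true frequency
    mixes down to a near-constant term, giving a small spread."""
    t = np.arange(len(x)) / fs
    spreads = []
    for fk in candidates:
        m = x * np.cos(2 * np.pi * fk * t)        # mix down to |f0 - fk|
        approx = pywt.wavedec(m, wavelet, level=level)[0]
        spreads.append(approx.max() - approx.min())
    best = np.argsort(spreads)[:2]                # two nearest points
    return float(np.mean([candidates[i] for i in best]))

# Example: distorted 50.23 Hz signal, candidate grid 49.8-50.6 Hz.
fs = 3200
t = np.arange(4 * fs) / fs
x = np.sin(2 * np.pi * 50.23 * t) + 0.2 * np.sin(2 * np.pi * 150.69 * t)
print(dwt_frequency(x, fs, [49.8 + 0.1 * i for i in range(9)]))
```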
Covert sonar operation can be achieved by using continuous frequency-modulated sounding signals with reduced power and a significantly prolonged repetition time. The application of matched filtering in the sonar receiver provides optimal conditions for detection against a background of white noise and reverberation, and very good resolution of distance measurements for motionless targets. The article shows that target movement causes large range measurement errors when linear and hyperbolic frequency modulations are used, and gives formulas for calculating these errors. It is shown that for signals with linear frequency modulation the range resolution and detection conditions deteriorate, whereas the use of hyperbolic frequency modulation largely eliminates these adverse effects.
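For orientation, the classical range-Doppler coupling result for a linear FM pulse shows how target motion translates into a range error; this standard textbook formula is given only as an illustration and is not the paper's own error formula.

```latex
% Classical range-Doppler coupling for a linear FM pulse
% (an illustration, not the paper's own formulas).
% A radial velocity v Doppler-shifts the echo by f_d, which the
% matched filter converts into a time offset \Delta t and hence a
% range error \Delta R:
\[
  f_d \approx \frac{2 v f_0}{c}, \qquad
  \Delta t = \frac{f_d}{\mu} = \frac{f_d T}{B}, \qquad
  \Delta R = \frac{c\,\Delta t}{2} = \frac{c\, f_d T}{2B},
\]
% with carrier frequency f_0, swept bandwidth B, pulse duration T
% and chirp rate \mu = B/T.
```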
This paper presents a methodology for measurements performed in a radio link for the realization of radiolocation services in radiocommunication networks, particularly in cellular networks. The main results of measurements obtained in the physical layer of the universal mobile telecommunications system (UMTS) are presented. A new method that exploits the multipath propagation phenomenon to improve the estimation of the distance between the mobile station (MS) and the base station (BS) is outlined. This method significantly increases the quality of location services in systems which use a radio interface with direct-sequence code division multiple access (DS-CDMA).
Recently, the topic of ontologies has gained growing attention from the IT community. Various processes of ontology creation, integration, and deployment have been proposed. As a consequence, there is an urgent need for evaluating the resulting ontologies in a quantitative way. A number of metrics have been defined, along with different approaches to measuring the properties of ontologies. In the first part of this paper we review the state of the art in this domain. Special attention is devoted to discussing the differences between syntactic measures (referring to various properties of the graphs that represent ontologies) and semantic measures (reflecting the properties of the space of ontology models). In the second part we propose an alternative approach to the quantification of the semantics of an ontology. The original proposal presented here exploits specific methods of representing the space of semantic models used for the optimization of reasoning. We argue that this approach enables us to capture different kinds of relations among ontology terms and offers possibilities of devising new useful measures.
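As an example of the syntactic (graph-based) measures reviewed in the first part, the sketch below computes two common ontology metrics with networkx; these particular metrics are widely used illustrations, not the paper's own proposal, and all names are illustrative.

```python
import networkx as nx

def syntactic_metrics(classes, subclass_edges, other_edges):
    """Two common graph-based ontology measures (illustrative).

    subclass_edges: (child, parent) pairs forming the is-a hierarchy.
    other_edges: non-hierarchical relations between classes."""
    g = nx.DiGraph()
    g.add_nodes_from(classes)
    g.add_edges_from(subclass_edges)
    depth = (nx.dag_longest_path_length(g)
             if nx.is_directed_acyclic_graph(g) else None)
    n_sub, n_other = len(subclass_edges), len(other_edges)
    richness = n_other / (n_other + n_sub) if (n_other + n_sub) else 0.0
    return {"classes": len(classes),
            "max_is_a_depth": depth,
            "relationship_richness": richness}

print(syntactic_metrics(
    ["Thing", "Animal", "Dog", "Owner"],
    [("Animal", "Thing"), ("Dog", "Animal"), ("Owner", "Thing")],
    [("Dog", "Owner")]))
```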
Optical Coherence Tomography (OCT) is one of the most rapidly advancing measurement techniques. This method is capable of non-contact and non-destructive investigation of the inner structure of a broad range of materials. Compared with other methods in the NDE/NDT group (Non-Destructive Evaluation/Non-Destructive Testing), OCT can visualize the structure of a broad range of scattering materials. Such a non-invasive and versatile method is in high demand in industry. The authors applied the OCT method to examine the corrosion process in metal samples coated with polymer films. The main aim of the research was the evaluation of anti-corrosion protective coatings using the OCT method. The tested samples were exposed to a harsh environment, and the OCT measurements were taken at different stages of the samples' degradation. The research and test results are presented, along with a brief discussion.
In this work, the influence of both the characteristics of the lens and the misalignment of the incident beams on roughness measurement is presented. To investigate how the focal length and diameter affect the degree of correlation between the speckle patterns, a set of experiments with different lenses is performed. In addition, the roughness is measured when the two beams, separated by a given amount, are not coincident at the same point on the sample. To conclude the study, the uncertainty of the method is calculated.
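A minimal numpy sketch of the degree of correlation between two speckle patterns (zero-shift normalized cross-correlation), which is the quantity the experiments vary with lens choice and beam alignment. The mapping from this correlation value to roughness is method-specific and omitted; all names are illustrative.

```python
import numpy as np

def speckle_correlation(img_a, img_b):
    """Degree of correlation between two speckle patterns:
    zero-shift normalized cross-correlation of intensity images."""
    a = img_a.astype(float).ravel()
    b = img_b.astype(float).ravel()
    a -= a.mean()
    b -= b.mean()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Example: a pattern correlated against a noisier copy of itself.
rng = np.random.default_rng(0)
base = rng.random((256, 256))
noisy = base + 0.3 * rng.random((256, 256))
print(f"rho = {speckle_correlation(base, noisy):.3f}")
```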
This paper presents a voltammetric segmented voltage sweep mode that can be used to identify heavy metals and measure their concentrations. The proposed sweep mode covers a set of voltage ranges centered around the redox potentials of the metals under analysis. The heavy metal measurement system can take advantage of a historical database of measurements to identify the metals with the highest concentrations in a given geographical area and perform a segmented sweep around predefined voltage ranges; alternatively, the system can perform a fast linear voltage sweep to identify the voltammetric current peaks and then perform a segmented voltage sweep around the set of voltages associated with those peaks. The paper also presents two auto-calibration modes that can be used to improve the system's reliability, and proposes the use of Gaussian curve fitting of voltammetric data to identify heavy metals and evaluate their concentrations. Several simulation and experimental results that validate the theoretical expectations are also presented.
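A minimal scipy sketch of the proposed Gaussian curve fitting of a voltammetric current peak: the fitted peak position points to a redox potential (identifying the metal) and the amplitude tracks its concentration. The single-peak model and all numbers are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(v, amp, v0, sigma):
    """Single Gaussian peak with height amp at potential v0."""
    return amp * np.exp(-((v - v0) ** 2) / (2 * sigma ** 2))

def fit_peak(voltage, current, guess):
    """Fit one voltammetric current peak; v0 points to a redox
    potential (metal identity), amp tracks its concentration."""
    popt, _ = curve_fit(gaussian, voltage, current, p0=guess)
    return dict(zip(("amp", "v0", "sigma"), popt))

# Synthetic peak near -0.56 V (illustrative values only).
rng = np.random.default_rng(0)
v = np.linspace(-1.0, 0.0, 400)
i = gaussian(v, 2.0e-6, -0.56, 0.04) + 1e-8 * rng.standard_normal(v.size)
print(fit_peak(v, i, guess=(1e-6, -0.5, 0.05)))
```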
Varistors are commonly used elements that protect power supply networks against high-voltage surges and lightning. The quality and endurance of these elements are therefore important, to avoid the losses incurred when expensive laboratory equipment is not protected from random overvoltages. Additionally, excessive leakage currents generate serious costs due to high energy consumption. The paper briefly presents the properties of varistors that comprise different ZnO grain types and can have varying quality, which changes continuously during exploitation (due to exposure to overheating and overvoltage pulses). It is therefore important to monitor varistors during their ageing (which causes changes within their microstructures). A few methods of varistor property diagnosis were considered and compared with the methods currently applied in laboratory or industrial settings. A new measurement (diagnostic) system that can monitor varistors during ageing and can be widely applied in power networks is presented. The proposed system fulfills the requirements of industrial customers who demand various methods for power line protection, and can easily be developed into a more advanced wireless diagnostic system for long power supply lines.
The superior calorific value (SCV) and the compressibility factor (z) of 77 natural gas (NG) samples are calculated using two different gas chromatography calibration approaches, based on ISO 6974-2. Method A uses a seven-point analytical curve whose goodness of fit is confirmed by Analysis of Variance (ANOVA); it is required when the composition of the natural gas varies. Method B uses a single-point calibration, with an allowed tolerance between the calibration gas mixture and the sample mole fractions, and is therefore used to analyze constant natural gas streams. From the natural gas compositions analyzed by both methods, including cases exceeding the tolerance allowed for method B, SCV, z and their uncertainties are calculated and compared. The results show that all samples that comply with Brazilian legislation can be analyzed by method B, because there are no metrological differences in terms of SCV and z, even when the allowed tolerance has been exceeded. This simplified methodology minimizes operator exposure, besides saving about US$ 50,000.00 per chromatograph.
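A sketch of the single-point (method B) calibration arithmetic: response factors from the certified calibration mixture are applied to the sample peak areas and the result is normalized. The tolerance checks required by ISO 6974-2 are omitted, and all numbers are illustrative, not real certified values.

```python
def method_b_mole_fractions(cal_fractions, cal_areas, sample_areas):
    """Single-point calibration sketch (simplified ISO 6974-2 method B).

    cal_fractions: certified mole fractions of the calibration mixture.
    cal_areas / sample_areas: chromatographic peak areas per component.
    Tolerance checks between calibration gas and sample are omitted."""
    raw = {c: cal_fractions[c] * sample_areas[c] / cal_areas[c]
           for c in cal_fractions}
    total = sum(raw.values())
    return {c: x / total for c, x in raw.items()}  # normalized to 1

# Illustrative numbers only.
cal = {"CH4": 0.90, "C2H6": 0.06, "C3H8": 0.04}
print(method_b_mole_fractions(
    cal,
    cal_areas={"CH4": 9.0e5, "C2H6": 7.2e4, "C3H8": 5.1e4},
    sample_areas={"CH4": 8.8e5, "C2H6": 8.0e4, "C3H8": 4.5e4}))
```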
Noise diagnostics were performed on a cold field-emission cathode in high vacuum. The tested cold field-emission cathode, based on a tungsten wire with an ultra-sharp tip coated with epoxy, was designed to meet the requirements of transmission electron microscopy, which requires a small and stable source of electrons. Current fluctuations were reduced by improving the structure and fabrication technology. Noise was measured both in the time and frequency domains, which gives information about the current fluctuations as well as the charge transport. The mutual correlation between the noise spectral density, extractor voltage and beam brightness was analyzed.
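As an illustration of the frequency-domain part of such diagnostics, the sketch below estimates the current-noise power spectral density from a sampled emission-current record with scipy's Welch method; the signal chain, units and numbers are assumptions.

```python
import numpy as np
from scipy.signal import welch

def noise_psd(current, fs):
    """Current-noise power spectral density (A^2/Hz) via Welch's
    method; constant detrending removes the mean emission current
    so only the fluctuations remain."""
    f, s = welch(current, fs=fs, nperseg=4096, detrend="constant")
    return f, s

# Example: 1 uA mean emission with white fluctuations (illustrative).
fs = 100_000
rng = np.random.default_rng(1)
i_t = 1e-6 + 1e-9 * rng.standard_normal(fs)
f, s = noise_psd(i_t, fs)
print(f"PSD at ~1 kHz: {s[np.searchsorted(f, 1e3)]:.2e} A^2/Hz")
```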
Journal | Publisher | ISSN
Metrologia | IOP Publishing | 0026-1394
IEEE Transactions on Instrumentation and Measurement | IEEE | 0018-9456
Measurement | Elsevier | 0263-2241
Measurement Science and Technology | IOP Publishing | 0957-0233
Metrology and Measurement Systems | PAS | 0860-8229
Review of Scientific Instruments | AIP Publishing | 0034-6748
IEEE Transactions on Industrial Electronics | IEEE | 1557-9948
IET Science, Measurement & Technology | IET | 1751-8822
Journal of Instrumentation | SISSA, IOP Publishing | 1748-0221
Measurement Science Review | Walter de Gruyter | 1335-8871
IEEE Instrumentation & Measurement Magazine | IEEE | 1094-6969
Bulletin of the Polish Academy of Sciences: Technical Sciences | PAS | 2300-1917
Opto-Electronics Review | PAS | 1896-3757
IEEE Sensors Journal | IEEE | 1558-1748
Sensors | MDPI | 1424-8220