We present a novel approach for detecting malicious user activity in databases. Specifically, we propose a new machine learning algorithm for detecting attacks such as a stolen user account or illegal use by a user. Our algorithm relies on two main components that examine the consistency of a user's activity and compare it with activity patterns learned from past access. The first component tests...
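The abstract's idea of comparing a session against activity patterns learned from past access can be illustrated with a minimal profile-based check. This sketch is an assumption, not the paper's algorithm: the table names, the learned-profile representation (a set of previously accessed tables), and the `max_new_fraction` threshold are all invented for illustration.

```python
def build_profile(past_sessions):
    """Collect the set of tables a user has touched in past sessions."""
    profile = set()
    for session in past_sessions:
        profile.update(session)
    return profile

def is_anomalous(session, profile, max_new_fraction=0.3):
    """Flag a session whose accesses fall mostly outside the learned profile."""
    if not session:
        return False
    new = [t for t in session if t not in profile]
    return len(new) / len(session) > max_new_fraction

history = [["orders", "customers"], ["orders", "invoices"]]
profile = build_profile(history)
print(is_anomalous(["orders", "customers"], profile))      # consistent with history
print(is_anomalous(["hr_salaries", "audit_log"], profile)) # outside the profile
```

A real detector would model far richer features (query shapes, time of day, volumes), but the consistency test has this same shape: score the new activity against a profile learned from the past.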
In recent years, crowdsourcing has become essential in a wide range of Web applications. Human factors play a key role in achieving high-quality answers in crowdsourcing-based task solving. The most significant factor pertains to the uncertainty of workers about the responses they provide to resolve the task at hand. On the other hand, workers may have diverse levels of expertise and skill. It...
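One common way to combine the two human factors the abstract names, worker uncertainty and worker expertise, is an expertise-weighted confidence vote. The following is a generic sketch of that idea, not the abstract's method; the labels, skill values, and confidence values are made up.

```python
from collections import defaultdict

def weighted_vote(answers):
    """answers: list of (label, worker_skill, worker_confidence) tuples.
    Each vote counts in proportion to both the worker's skill and the
    confidence the worker attached to that particular response."""
    scores = defaultdict(float)
    for label, skill, confidence in answers:
        scores[label] += skill * confidence
    return max(scores, key=scores.get)

answers = [("cat", 0.9, 0.8), ("dog", 0.5, 1.0), ("cat", 0.6, 0.5)]
print(weighted_vote(answers))  # "cat": 0.72 + 0.30 beats "dog": 0.50
```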
We improve data access efficiency in mobile cloud storage by exploring users' context information. Specifically, we optimize data reading and writing on mobile devices from/to distributed cloud storage according to network condition, user mobility pattern, and data access preference. We propose a Reed-Solomon erasure code based context-aware distributed storage system. Then, a mixed integer linear...
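The abstract's storage system is based on Reed-Solomon erasure codes, which tolerate multiple block losses via finite-field arithmetic. As a simplified stand-in (an assumption for illustration, not the paper's code), a single XOR parity block shows the core erasure-coding idea: store k data blocks plus parity, and rebuild a lost block from the survivors instead of keeping full replicas.

```python
from functools import reduce

def xor_blocks(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def encode(data_blocks):
    """data_blocks: equal-length byte blocks; returns them plus one XOR
    parity block (a one-failure analogue of Reed-Solomon coding)."""
    parity = reduce(xor_blocks, data_blocks)
    return list(data_blocks) + [parity]

def recover(blocks):
    """Rebuild the single missing block (marked None) by XOR-ing the rest."""
    present = [b for b in blocks if b is not None]
    return reduce(xor_blocks, present)

stored = encode([b"abcd", b"efgh", b"ijkl"])
stored[1] = None              # simulate one storage node becoming unreachable
print(recover(stored))        # b'efgh'
```

The storage-overhead benefit is what motivates erasure coding on mobile/cloud paths: here 3 data blocks cost 4 blocks of storage, versus 6 blocks for 2-way replication with the same single-failure tolerance.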
As a standardized communication protocol, OPC UA is the main focal point with regard to information exchange in the ongoing Industrie 4.0 initiative. There are also considerations to use it within the Internet of Things. The fact that no open reference implementation is currently freely available for research represents a major problem in this context. The authors are of the opinion that open source...
The number of different usages of sensor networks grows with the deployment of ever more diverse sensors within the same network. More varied data implies a more heterogeneous set of users. Such diversity in usage leads to a major data-modelling problem: being able to address, for each user, precisely his needs, and his needs only. In other words, providing a model which avoids overwhelm...
In this paper we describe how a video game designed to deliver a rehabilitation therapy can produce data of a standard that is clinically useful. Our approach is based entirely on commodity video game hardware, making our solution one that may be delivered in a cost efficient manner. The step of ensuring data fidelity was crucial in allowing clinical assessment to be derived from standard video game...
To achieve high levels of scalability and availability, NoSQL databases opt not to offer important abstractions traditionally found in relational databases: transactional guarantees and strong data consistency. In this work we propose pH1, a generic middleware layer over NoSQL databases that offers transactional guarantees with Snapshot Isolation. This is achieved in a non-intrusive manner,...
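Snapshot Isolation, the guarantee pH1 layers on top of a NoSQL store, can be sketched with a toy multiversion store: each transaction reads from the snapshot at its start timestamp, and a commit aborts if another transaction already committed a write to the same key (first-committer-wins). This is a generic illustration of SI semantics under those assumptions, not pH1's implementation.

```python
class SIStore:
    """Toy multiversion key-value store with snapshot-isolation semantics."""
    def __init__(self):
        self.versions = {}   # key -> list of (commit_ts, value), ts ascending
        self.clock = 0

    def begin(self):
        return {"start": self.clock, "writes": {}}

    def read(self, txn, key):
        # Newest version committed at or before the transaction's snapshot.
        for ts, value in reversed(self.versions.get(key, [])):
            if ts <= txn["start"]:
                return value
        return None

    def write(self, txn, key, value):
        txn["writes"][key] = value   # buffered until commit

    def commit(self, txn):
        # First-committer-wins: abort on any concurrent committed write.
        for key in txn["writes"]:
            hist = self.versions.get(key, [])
            if hist and hist[-1][0] > txn["start"]:
                return False
        self.clock += 1
        for key, value in txn["writes"].items():
            self.versions.setdefault(key, []).append((self.clock, value))
        return True

db = SIStore()
t1, t2 = db.begin(), db.begin()      # two concurrent transactions
db.write(t1, "x", 1)
db.write(t2, "x", 2)
print(db.commit(t1))  # True  (first committer wins)
print(db.commit(t2))  # False (write-write conflict on "x")
```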
Provenance awareness adds a new dimension to the engineering of service-based systems, enabling them to increase their accountability through answering questions about the provenance of any data produced. Provenance awareness can be achieved by recording provenance data during system execution. In our previous work we have proposed an overall research agenda towards a design and analysis framework...
Cloud storage federation improves service availability and reduces vendor lock-in risks of single-provider cloud storage solutions. Federation therefore distributes and replicates data among different cloud storage providers. Missing controls on data location and distribution however introduce security and compliance issues. This paper proposes a novel approach of using data-driven usage control to...
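The core of data-driven usage control for federation, evaluating a location policy attached to the data before the federation layer replicates it to a provider, can be sketched as below. The policy schema, dataset names, and provider regions are illustrative assumptions, not the paper's model.

```python
# Location policies travel with the data; the federation layer consults
# them before every replication decision.
policies = {"medical-records": {"allowed_regions": {"EU"}}}
providers = {"provider-a": "EU", "provider-b": "US"}

def may_replicate(dataset, provider):
    """Return True iff the provider's region satisfies the data's policy."""
    policy = policies.get(dataset)
    if policy is None:
        return True   # no policy attached: replication is unrestricted
    return providers[provider] in policy["allowed_regions"]

print(may_replicate("medical-records", "provider-a"))  # EU provider: allowed
print(may_replicate("medical-records", "provider-b"))  # US provider: denied
```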
The problem of managing multidimensional stream cubes (i.e., data cubes originated from data streams) over Computational Grids still plays a critical role in Database and Data Warehousing research, since it covers a wide family of real-life application scenarios. Despite recent technological advancements, high dimensionality and massive size are still the most significant challenges to be addressed...
We describe Quality of Information (QoI) in terms of Type Theory in order to: clarify the meaning of QoI and desired capabilities of “quality aware” systems; to show how any quality metric can be expressed as a function of data and data provenance which negates the need for specific quality annotations; and to indicate how data processing may automatically be applied to deliver high quality information.
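The claim that any quality metric can be expressed as a function of data and its provenance, rather than a stored annotation, can be made concrete with a small typed sketch. The provenance fields, trust table, and decay formula here are invented for illustration; the point is only the function signature: quality = f(data, provenance).

```python
from dataclasses import dataclass

@dataclass
class Provenance:
    source: str
    age_seconds: float

# Illustrative trust weights per source type (an assumption, not from the paper).
TRUSTED = {"calibrated_sensor": 1.0, "crowd_report": 0.4}

def freshness_quality(value: float, prov: Provenance) -> float:
    """Compute a quality score from the data and its provenance on demand,
    so no per-item quality annotation needs to be stored."""
    trust = TRUSTED.get(prov.source, 0.1)
    decay = 1.0 / (1.0 + prov.age_seconds / 3600.0)  # halves after one hour
    return trust * decay

q = freshness_quality(21.5, Provenance("calibrated_sensor", 3600.0))
print(round(q, 2))  # 0.5
```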
This paper presents an algorithm that generalizes big sets of contextual situations. Apart from giving details on its mechanisms and implementation, we discuss its employment in a context experience sharing system, KRAMER, and simulate its performance as a function of several parameters modelling the expected real-data experiment.
Empowering the enterprise to control their own session policy for mobile broadband is not only necessary for consumerization, but is beneficial in controlling budgets and protecting corporate network resources. We propose a practical method of dynamically establishing enterprise-Business-Context (eBC) status to determine whether or not the enterprise should fund employees' service requests and what...
Analytical solutions are considered increasingly important for modern enterprises. Currently, systematic adoption of analytical solutions is limited to only a small set of large enterprises, as the deployment cost is high due to high-performance hardware requirements and expensive analytics software. Moreover, such on-premises solutions are not suitable for occasional analytics consumers....
Large-scale online service providers have been increasingly relying on geographically distributed cloud infrastructures for service hosting and delivery. In this context, a key challenge faced by service providers is to determine the locations where service applications should be placed such that the hosting cost is minimized while key performance requirements (e.g. response time) are assured. Furthermore,...
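The placement problem the abstract describes, minimizing hosting cost subject to response-time requirements, is typically formulated as an integer program; as a simplified illustration under assumed costs and latencies (all numbers and site names here are made up), a greedy per-region version looks like this:

```python
sites = {"us-east": 10, "eu-west": 12, "ap-south": 8}        # monthly cost units
latency = {                                                  # region -> site -> ms
    "americas": {"us-east": 40, "eu-west": 110, "ap-south": 220},
    "asia":     {"us-east": 210, "eu-west": 160, "ap-south": 35},
}

def place(regions, max_ms=100):
    """For each user region, pick the cheapest site meeting the latency bound."""
    placement = {}
    for region in regions:
        feasible = [s for s in sites if latency[region][s] <= max_ms]
        placement[region] = min(feasible, key=sites.get) if feasible else None
    return placement

print(place(["americas", "asia"]))
```

A real formulation would also couple regions (shared replicas, capacity limits), which is what makes the global optimization nontrivial; the greedy sketch only shows the cost/latency trade-off per region.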
JPEG 2000 Part 9, or JPIP for short, is an interactive image browsing protocol that allows the selective delivery of image regions, components or scales from a JPEG 2000 image. Typical applications are browsing tools for medical databases, where transmitting huge images from server to client in their entirety would be uneconomical. Instead, JPIP allows extracting only the desired image parts for analysis by an http...
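A region request in JPIP is carried as query fields on an HTTP URL. The sketch below uses the JPIP field names `fsiz` (requested frame size), `roff` (region offset) and `rsiz` (region size) from the Part 9 request syntax; the server address, image name, and coordinate values are made-up examples.

```python
from urllib.parse import urlencode

# Build a JPIP-style request for a 512x512 window of a large scan,
# served at a 4096x4096 resolution level.
params = {
    "target": "scan042.jp2",
    "fsiz": "4096,4096",   # frame (resolution) size to serve the region at
    "roff": "1024,2048",   # upper-left corner of the region of interest
    "rsiz": "512,512",     # width,height of the requested region
}
url = "http://example.org/jpip?" + urlencode(params)
print(url)
```

The economy the abstract mentions comes from the server answering such a request with only the codestream packets covering that window at that scale.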
This paper discusses some of the currently most challenging problems of SCADA system design, and presents a concept of a general-purpose SCADA intended for high-end applications. Such applications involve millions of data points and a demanding real-time decision-support system on top of the SCADA and its communication infrastructure. The presented SCADA concept is actually an evolution and enhancement...
TCP timeouts are the primary impairment that hurts throughput in data centers. The Binary Exponential Backoff (BEB) algorithm is invoked to control the interval between consecutive timeouts. In this paper we explore the impact of removing the BEB algorithm from TCP on throughput. Our analysis and simulation results show that removing the BEB algorithm, even in the case of a lower RTT_min, cannot advance the onset...
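The mechanism under study, what happens to retransmission intervals with and without BEB, can be sketched directly. This is a minimal model of the retransmission-timeout schedule (the RTO_min value and retry count are illustrative), not the paper's simulation:

```python
def timeout_schedule(rto_min, retries, backoff=True, rto_max=60.0):
    """Interval waited before each successive retransmission.
    With BEB the interval doubles after every timeout (capped at rto_max);
    with BEB removed, every retry fires after the same RTO_min."""
    intervals = []
    rto = rto_min
    for _ in range(retries):
        intervals.append(rto)
        if backoff:
            rto = min(rto * 2, rto_max)
    return intervals

print(timeout_schedule(0.2, 4))                 # doubling: 0.2, 0.4, 0.8, 1.6
print(timeout_schedule(0.2, 4, backoff=False))  # flat:     0.2, 0.2, 0.2, 0.2
```

Removing BEB keeps the sender retrying aggressively at RTO_min; the abstract's question is whether that aggressiveness actually helps throughput.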
The XML DOM (Document Object Model) provides a logical view of the in-memory structure. It represents the metadata as a hierarchical, tree-like structure consisting of nodes. There are several benefits to implementing the XML DOM in a home management server. First of all, processing time is on average three times lower than with a general database. Secondly, it would make to meet with...
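The hierarchical node tree the DOM exposes can be shown with Python's standard-library `xml.dom.minidom`; the home-device document here is a made-up example, not from the paper.

```python
from xml.dom import minidom

# Parse a small home-management document into an in-memory DOM tree.
doc = minidom.parseString(
    "<home><device id='lamp'>on</device><device id='tv'>off</device></home>"
)

# Walk the element nodes: each <device> is a child node of <home>.
for dev in doc.getElementsByTagName("device"):
    print(dev.getAttribute("id"), dev.firstChild.data)
```

Because the whole tree lives in memory, lookups like this avoid the query round-trip of a general-purpose database, which is the speedup the abstract refers to.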
Web log files store data related to the use of a website. Analyzing these data in detail is therefore crucial for improving the user browsing experience. However, Web log data are usually stored in flat files in different formats, which hinders their analysis and obliges analysts to use specific Web log analysis tools. In this context, approaches for structuring Web log data to better analyze them are highly...
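The structuring step such approaches automate can be illustrated by parsing one flat-file line in the NCSA Common Log Format into a structured record (the IP address and path below are example data):

```python
import re

# Fields of the Common Log Format: host, identity, user, timestamp,
# request line, status code, response size.
CLF = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) (?P<size>\d+|-)'
)

line = '192.0.2.1 - - [10/Oct/2023:13:55:36 +0000] "GET /index.html HTTP/1.1" 200 2326'
record = CLF.match(line).groupdict()
print(record["method"], record["path"], record["status"])  # GET /index.html 200
```

Once every line is a record like this, the data can be loaded into a queryable structure instead of being locked to format-specific analysis tools.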