Continuous Integration (CI) and Continuous Delivery (CD) are widespread in both industrial and open-source software (OSS) projects. Recent research characterized build failures in CI and identified factors potentially correlated to them. However, most observations and findings of previous work are exclusively based on OSS projects or data from a single industrial organization. This paper provides...
Since computing infrastructure is intrinsically redundant, budget-saving practices or understaffing of IT departments lead to an “emergency room” service paradigm. However, a treasure of information that could facilitate preemptive maintenance is buried in the logs and messages that are automatically generated by regular processes. Such information is typically ignored because of the sheer...
Program dependency artifacts such as call graphs help support a number of software engineering tasks such as software mining, program understanding, debugging, feature location, software maintenance and evolution. Java Enterprise Edition (JEE) applications represent a significant part of the recent legacy applications, and we are interested in modernizing them. This modernization involves, among other...
In this paper, we present a collection of Modern Code Review data for five open source projects. The dataset combines mined data from both an integrated peer review system and source code repositories. We present an easy-to-use and richer data structure to retrieve the 1.) People, 2.) Process, and 3.) Product aspects of the peer review. This paper presents the extraction methodology, the dataset structure,...
Exception handling is a powerful tool provided by many programming languages to help developers deal with unforeseen conditions. Java is one of the few programming languages to enforce an additional compilation check on certain subclasses of the Exception class through checked exceptions. As part of this study, empirical data was extracted from software projects developed in Java. The intent...
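The compilation check mentioned above can be illustrated with a minimal sketch (the class and method names here are hypothetical, not taken from the studied projects):

```java
import java.io.IOException;

public class CheckedDemo {
    // IOException is a checked exception: every caller must either
    // catch it or declare it in its own throws clause.
    static String readConfig(boolean missing) throws IOException {
        if (missing) {
            throw new IOException("config file not found");
        }
        return "ok";
    }

    public static void main(String[] args) {
        try {
            System.out.println(readConfig(false)); // prints "ok"
            readConfig(true);                      // throws
        } catch (IOException e) {
            // Without this catch (or a throws clause on main), the code
            // would not compile -- the check happens at compile time.
            System.out.println("handled: " + e.getMessage());
        }
        // Subclasses of RuntimeException are unchecked: the compiler
        // imposes no such obligation on their callers.
    }
}
```

Removing the catch block (and the throws clause) makes the program fail to compile, which is precisely the enforcement that distinguishes checked from unchecked exceptions.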
Exception handling is a technique that addresses exceptional conditions in applications, allowing the normal flow of execution to continue in the event of an exception and/or to report on such events. Although exception handling techniques, features and bad coding practices have been discussed both in developer communities and in the literature, there is a marked lack of empirical evidence on how...
Most security-related programs communicate with servers through encrypted channels, and sensitive data in transit should be encrypted using a suitable protocol. Transport Layer Security (TLS) is a protocol that encrypts data after exchanging key materials in a secure way. Protocol analyzers in turn decrypt packets to see the raw protocols in plain text. Decryption is not a minor job, and...
Software vulnerabilities are a major concern for enterprises evaluating the security of their software products. For assessing vulnerabilities, there are various sites to consult when searching for exposures related to software products. When a vulnerability is detected and exposed on a vulnerability database, discovery, analysis, and redressal through manual intervention are too slow,...
It is common knowledge in the IT service domain that changes to the system configuration are responsible for a major portion of incidents that result in client outages. However, it is typically very difficult to establish a relationship between changes and incidents as proper documentation takes lower priority at change creation time, as well as during incident management, in order to deal with the...
DIBBs Brown Dog is a recent cyberinfrastructure effort which aims to create two new services to aid users in searching, accessing, and using digital data, and to provide these services in a manner that is as broadly and easily accessible as possible. At its lowest level, the Data Access Proxy (DAP) provides file format conversion capabilities and the Data Tilling Service (DTS) provides content...
Preventing system failure in the cloud has become more important as a result of the prevalent use of clouds for mission-critical applications. One of the major causes of system failure in clouds is misconfiguration, as shown in recent studies. Hence, it is essential to detect misconfiguration before it causes outage or degradation of service. Although the cloud provides us flexible and auto-configurable...
Application-based access control technologies are used to protect systems from malicious or compromised software. Existing rule-based access control systems rely on a comprehensive policy, which defines the resources an application is allowed to access. The generation of these policies is a hard and error-prone task for system engineers. In this work, we provide a framework to automate this task and...
Tool-based code review is growing in popularity and has become a standard part of the development process at Microsoft. Adoption of these tools makes it possible to mine data from code reviews and provide access to it. In this paper, we present an experience report for CodeFlow Analytics, a system that collects code review data, generates metrics from this data, and provides a number of ways for...
The present article is concerned with the different Business Intelligence tools used to generate the foundations of Knowledge Management Systems that enable better decision making with less risk at any level of the organization. Tests will be made using a relational database management system, and the results will be demonstrated in a common business application.
Indexing plays an indispensable role in search engines. Indexing enables easy mining of data and lessens the latency of searching for a term across huge document collections. In this paper, we propose a methodology to index documents in a parallel, distributed manner. We define a metadata structure for each document; from the metadata, the occurrence of a word can be ascertained by document, page number...
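One plausible shape for such per-document, per-page occurrence metadata is an inverted index keyed by term; this is a minimal sketch under that assumption (the class and field names are illustrative, not the authors' structure):

```java
import java.util.Collections;
import java.util.HashMap;
import java.util.Map;
import java.util.Set;
import java.util.TreeSet;

public class PageIndex {
    // term -> docId -> set of page numbers on which the term occurs
    private final Map<String, Map<String, Set<Integer>>> index = new HashMap<>();

    // Tokenize one page of a document and record each term's occurrence.
    public void addPage(String docId, int page, String text) {
        for (String token : text.toLowerCase().split("\\W+")) {
            if (token.isEmpty()) continue;
            index.computeIfAbsent(token, t -> new HashMap<>())
                 .computeIfAbsent(docId, d -> new TreeSet<>())
                 .add(page);
        }
    }

    // Look up a term: returns docId -> pages, empty if the term is absent.
    public Map<String, Set<Integer>> lookup(String term) {
        return index.getOrDefault(term.toLowerCase(), Collections.emptyMap());
    }

    public static void main(String[] args) {
        PageIndex idx = new PageIndex();
        idx.addPage("doc1", 1, "Indexing plays a key role in search");
        idx.addPage("doc1", 3, "search latency drops with an index");
        idx.addPage("doc2", 2, "parallel distributed search");
        System.out.println(idx.lookup("search"));
    }
}
```

Because each document's pages can be tokenized independently, this structure also lends itself to the parallel, distributed construction the abstract describes: per-document partial indexes can be built in parallel and merged by term.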
The current state of affairs regarding the way events are logged by IT systems is the source of many problems for the developers of Intrusion Detection Systems (IDS) and Security Information and Event Management (SIEM) systems. These problems stand in the way of the development of more accurate security solutions that draw their results from the data included within the logs they process. This is...
In this paper we propose the quantitative analysis of the complexity of a simple reference game implemented on a particular gaming platform as a means of characterizing how well the platform eases the development of networked multiplayer games. We first present our own open source tool, based on Sneed's Object-Point (OP) method, for the automatic quantitative assessment of the complexity of a...
Over the past decade, a number of tools and systems have been developed to manage various aspects of the software development lifecycle. Until now, tool supported code review, an important aspect of software development, has been largely ignored. With the advent of open source code review tools such as Gerrit along with projects that use them, code review data is now available for collection, analysis,...
Decision makers must know if their cyber assets are ready to execute critical missions and business processes. Network operators need to know who relies on a failed network asset (e.g. IP address, network service, application) and what critical operations are impacted. This requires a mapping between network assets and the critical operations that depend on them, currently a manual and tedious...
Traditional Chinese medicine (TCM) is developed from clinical practice, and rich information can be obtained from TCM medical records. The national clinical research data center of TCM was constructed to collect clinical TCM data from hospitals all over the country. The construction framework of the national clinical data center of TCM includes infrastructure construction, software construction,...