Unit testing is based on the idea that units under test behave in a reproducible and deterministic way. If the unit's code is dependent on external context factors like time or location, these factors have to be controlled in order to produce meaningful results. Spaced repetition mobile learning games in which users are reminded to play at time intervals are based on previous user interaction with...
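The point about controlling external context factors can be sketched with a small, hypothetical example (the `greeting` function and the patched clock are illustrative, not from the paper): the wall-clock dependency is stubbed out so the test is reproducible and deterministic.

```python
import time
from unittest.mock import patch

def greeting(hour):
    """Pure function of its input: deterministic and easy to test."""
    return "Good morning" if hour < 12 else "Good afternoon"

def greet():
    """Depends on an external context factor (wall-clock time), so a
    direct test would pass or fail depending on when it runs."""
    return greeting(time.localtime().tm_hour)

# Controlling the time factor makes the test meaningful and repeatable:
with patch("time.localtime") as fake_time:
    # struct_time fields: (year, mon, mday, hour, min, sec, wday, yday, isdst)
    fake_time.return_value = time.struct_time((2024, 1, 1, 9, 0, 0, 0, 1, 0))
    assert greet() == "Good morning"
    fake_time.return_value = time.struct_time((2024, 1, 1, 15, 0, 0, 0, 1, 0))
    assert greet() == "Good afternoon"
```

The same pattern generalises to location, locale, or any other ambient context the unit reads at call time.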
Currently, many tools modify the program code of compiled protected modules. The authors of the tools considered here do not publish their technique for embedding their protection systems, nor any proof of the correctness of their tools' algorithms. This article describes a technique for modifying program modules while preserving their algorithms, for the subsequent embedding of a protection system. The article...
We present a new automated method for efficient detection of security vulnerabilities in binary programs. This method starts with a bounded symbolic execution of the target program so as to explore as many paths as possible. Constraints of the explored paths are collected and solved for inputs. The inputs will then be fed to the following interleaved coverage-based fuzzing and concolic execution....
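The "collect path constraints and solve them for inputs" step can be illustrated with a deliberately tiny sketch (the `target` function and its guard are hypothetical; real systems hand the constraints to an SMT solver rather than brute-forcing a domain):

```python
def target(x):
    """Hypothetical target program: one branch is guarded by a condition
    that random fuzzing is unlikely to satisfy by chance."""
    if x * 7 % 256 == 42:
        return "bug"      # the hard-to-reach path
    return "ok"

# Path constraint collected for the 'bug' branch: x * 7 % 256 == 42.
# A toy stand-in for constraint solving: enumerate the (tiny) input domain.
solutions = [x for x in range(256) if x * 7 % 256 == 42]

# Each solution is an input that drives execution down the guarded path,
# and would be fed back as a seed to the coverage-based fuzzer.
assert all(target(x) == "bug" for x in solutions)
```

Interleaving the two engines lets the fuzzer cheaply explore shallow paths while the concolic step supplies inputs for branches guarded by narrow conditions like the one above.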
When microscopy image data are stored and processed, the corresponding workflow is typically carried out by treating storing and processing separately. In many laboratories, it is common to store data on one computing system and process data on another system. This separation of storing and processing data has a negative impact on the traceability of results, and thus on reproducibility, as information...
Mining software repositories has frequently been investigated in recent research. Software modifications in repositories are often recurring changes: similar but distinct changes across multiple locations. It is not easy for developers to find all the relevant locations to maintain such changes, including bug fixes, new feature additions, and refactorings. Performing recurring changes is tedious and...
We introduce a novel algorithm for mining temporal intervals from real-time system traces with linear complexity using passive, black-box learning. Our interest is in mining nfer specifications from spacecraft telemetry to improve human and machine comprehension. Nfer is a recently proposed formalism for inferring event stream abstractions with a rule notation based on Allen Logic. The problem of...
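Since nfer's rule notation builds on Allen Logic, a minimal sketch of two Allen-style interval relations may help (the relation functions and the telemetry intervals below are illustrative assumptions, not nfer's actual implementation):

```python
def before(a, b):
    """Allen relation 'before': interval a ends strictly before b starts."""
    return a[1] < b[0]

def during(a, b):
    """Allen relation 'during': interval a lies strictly inside b."""
    return b[0] < a[0] and a[1] < b[1]

# Hypothetical spacecraft-telemetry intervals as (start, end) timestamps:
boot = (0.0, 5.0)
downlink = (6.0, 9.0)
health_check = (6.5, 8.0)

assert before(boot, downlink)        # boot completes before the downlink
assert during(health_check, downlink)  # the check runs within the downlink
```

An nfer rule would combine such relations to name a new, higher-level interval whenever the pattern holds, abstracting the raw event stream for human and machine comprehension.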
Over several years, we observed that our students were sceptical of Software Engineering practices, because we did not convey the experience and demands of production-quality software development. Assessment focused on features delivered, rather than imposing responsibility for longer-term 'technical debt'. Academics acting as 'uncertain' customers were rejected as malevolent and implausible. Student...
As continuous delivery and continuous integration practices become more prevalent in industry, the need for education in these areas grows. Teaching these topics is complicated by the learning curve of the tools involved and the limited time available for covering them. Furthermore, there has been limited research into effective teaching practices for incorporating continuous...
Electrocardiogram (ECG), Electrodermal Activity (EDA), Electromyogram (EMG) and Impedance Cardiography (ICG) are among physiological signals widely used in various biomedical applications including health tracking, sleep quality assessment, early disease detection/diagnosis and human affective state recognition. This paper presents the development of a biosignal-specific processing and feature extraction...
Network attack graphs are a type of analysis tool that can be used to determine the impact that security vulnerabilities have on the network. It is important, then, for attack graphs to be able to represent enough information to aid this analysis. Moreover, they must be able to handle and integrate new vulnerabilities that are being discovered by the security community. We developed a prototype tool...
Software composition using high-granularity entities is nowadays common practice. The process of software composition is supported by various CASE tools. The first tools were built on very simple formalisms (e.g. intuitionistic propositional logic). Over the years, the tools have evolved into more capable ones, able to deal with concurrency, multiparty sessions and other advanced...
We present RDCL 3D, a “model agnostic” web framework for the design and composition of NFV services and components. The framework allows editing and validating the descriptors of services and components both textually and graphically and supports the interaction with external orchestrators or with deployment and execution environments. RDCL 3D is open source and designed with a modular approach, allowing...
Software engineering (SE) educators are challenged to balance the scope and depth of their courses to train students in skills that will fulfill ever-evolving industry needs. Capstone courses are a tool for educators to turn hands-on experience into practical knowledge and skills for SE students. This paper describes the design of a Capstone course at Lappeenranta University of Technology...
Provides an abstract for each of the workshop presentations and may include a brief professional biography of each presenter. The complete presentations were not made available for publication as part of the conference proceedings.
As an emerging approach to support fast delivery of software features with reliable quality, DevOps attracts more and more practitioners and shows the potential to become one of the mainstream approaches for software development and operation. Many universities have begun to offer DevOps-related courses to students majoring in software engineering and computer science. However, as a critical part of a...
Background: Static analysis security testing (SAST) tools may be evaluated using synthetic micro benchmarks and benchmarks based on real-world software. Aims: The aim of this study is to address the limitations of existing SAST tool benchmarks: lack of vulnerability realism, uncertain ground truth, and a large number of findings unrelated to the analyzed vulnerabilities. Method: We propose Delta-Bench...
The mining of software repositories has provided significant advances in a multitude of software engineering fields, including defect prediction. Several studies show that the performance of a software engineering technology (e.g., prediction model) differs across different project repositories. Thus, it is important that the project selection is replicable. The aim of this paper is to present STRESS,...
[Context:] Software productivity analysis is an essential activity for software process improvement. It specifies critical factors to be resolved or accepted from project data. As the nature of project data is observational, not experimental, the project data involves bias that can cause spurious relationships among analyzed factors. Analysis methods based on linear regression suffer from the spurious...
Background: Merge conflicts are a common occurrence in software development. Researchers have shown the negative impact of conflicts on the resulting code quality and the development workflow. Thus far, no one has investigated the effect of bad design (code smells) on merge conflicts. Aims: We posit that entities that exhibit certain types of code smells are more likely to be involved in a merge conflict...
Despite the advancement in software build tools such as Maven and Gradle, human involvement is still often required in software building. To enable large-scale advanced program analysis and data mining of software artifacts, software engineering researchers need to have a large corpus of built software, so automatic software building becomes essential to improve research productivity. In this paper,...