Deploying a large number of sensing buoys is a powerful tool for oceanographic, marine biology and climate change research. In this work we address the problem of efficient data collection by a team of Unmanned Aerial Vehicles (UAVs) from small, low-power buoys with previously unknown locations in complex coastal or remote oceanic regions with aerial restrictions. To tackle this problem we propose...
Wireless cameras can be used to gather situation awareness information (e.g., humans in distress) in disaster recovery scenarios. However, blindly sending raw video streams from such cameras to an operations center or controller can be prohibitive in terms of bandwidth. Further, these raw streams may contain redundant or irrelevant information. Thus, we ask "how do we extract accurate...
Discriminating Distributed Denial of Service (DDoS) attacks from Flash Crowds (FC) is a tough and challenging problem, because the two exhibit many similarities at the network layer. In this paper, through an extensive analysis of the user traffic behavior of DDoS and FC, we find that the traffic of bots exhibits abnormalities that distinguish it from that of legitimate users. Accordingly, a behavior-based method...
We propose PathML, an available bandwidth (i.e., unused capacity of an end-to-end path) estimation method based on a data-driven paradigm that uses machine learning with a large amount of data. An experiment over an operational LTE network was performed to compare our method with prior work.
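The excerpt does not specify PathML's features or model, so the following is only a minimal sketch of the data-driven idea: learn a regression from probe-derived features to ground-truth available bandwidth, then apply it to new measurements. The feature (fraction of probe packets that queued) and the training values are hypothetical.

```python
# Minimal sketch of a data-driven available-bandwidth estimator
# (hypothetical feature and model; not the paper's actual architecture).
# We fit a least-squares line mapping a probe-derived feature to
# ground-truth available bandwidth collected on a training path.

def fit_linear(xs, ys):
    """Ordinary least squares for y = a*x + b."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    a = cov / var
    return a, my - a * mx

def predict(model, x):
    a, b = model
    return a * x + b

# Synthetic training data: higher queuing fraction -> less spare capacity.
train_x = [0.1, 0.3, 0.5, 0.7, 0.9]       # queuing fraction per probe train
train_y = [90.0, 70.0, 50.0, 30.0, 10.0]  # available bandwidth (Mbit/s)

model = fit_linear(train_x, train_y)
est = predict(model, 0.4)  # estimate for a new probe measurement -> 60.0
```

A real system would use many features (inter-packet gaps, loss, delay trends) and a non-linear learner trained on large measurement datasets; the closed-form fit above is just the simplest stand-in for that pipeline.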
The measurement of perceptually relevant information about textures has been approached through profilometry, vibrometry, and tribometry. Manfredi et al. [1] used a laser Doppler vibrometer to measure skin surface vibrations as a texture sample slides across a fingertip. In our work, we treat the Manfredi et al. measurements as a gold standard, and assess the performance of a simpler and more portable...
We consider the (n, k, d, l) secure exact-repair regenerating code problem, which generalizes the (n, k, d) exact-repair regenerating code problem with the additional constraint that the stored file needs to be kept information-theoretically secure against an eavesdropper, who can access the data transmitted to regenerate a total of l different failed nodes. For all known results on this problem,...
Web usage mining is a data mining technique. Large amounts of data are stored on the internet, and searching for particular information through a search engine such as Google or Bing is difficult because the complexity of web pages increases day by day. Web usage mining plays an important role in solving this problem. In web usage mining we create a suitable pattern according to the...
WiFi networks play a significant role in providing today's wireless connectivity; therefore, understanding and improving WiFi network performance is important for today's mobile applications and services. Previous studies of WiFi network performance have generally been performed on specific types of WiFi networks in relatively small areas and have been limited by either the...
Recent years have witnessed a sharp increase in the use of smartphones for internet applications, video calling, social networking, email, etc., resulting in an unprecedented increase in worldwide wireless network traffic. When deploying wireless network topologies, the prime focus has to be on the requirements for bandwidth, user capacity and provision of...
Analog-to-information converters and Compressed Sampling (CS) sensor front-ends try to only extract the relevant, information-bearing elements of an incoming data stream. Information extraction and recognition tasks can run directly on the compressed data stream without needing full signal reconstruction. The accuracy of the extracted information or classification is strongly determined by the front-end...
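To illustrate the claim that recognition can run directly on the compressed stream, here is a toy sketch (not the paper's front-end): signals are reduced by a fixed ±1 measurement matrix, and classification is nearest-template entirely in the measurement domain, with no reconstruction. The templates, matrix, and signal are all illustrative.

```python
# Sketch: classification directly on compressively sampled measurements,
# without reconstructing the signal. Phi is a fixed +/-1 measurement
# matrix (illustrative; a real CS front-end would use a hardware-friendly
# pseudo-random sequence). Classification is nearest-template in the
# compressed (measurement) domain.

def measure(phi, x):
    """Compressed measurements y = Phi @ x."""
    return [sum(p * xi for p, xi in zip(row, x)) for row in phi]

def nearest(y, compressed_templates):
    """Index of the compressed template closest to y (squared Euclidean)."""
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(range(len(compressed_templates)),
               key=lambda i: dist2(y, compressed_templates[i]))

# Two 8-sample class templates (e.g., two pulse shapes).
templates = [
    [1, 1, 1, 1, 0, 0, 0, 0],   # class 0: early pulse
    [0, 0, 0, 0, 1, 1, 1, 1],   # class 1: late pulse
]

# 3x8 +/-1 measurement matrix: 8 samples -> 3 measurements.
phi = [
    [1, -1, 1, -1, 1, -1, 1, -1],
    [1, 1, -1, -1, 1, 1, -1, -1],
    [1, 1, 1, 1, -1, -1, -1, -1],
]

compressed = [measure(phi, t) for t in templates]

# A noisy class-1 signal, classified using only 3 measurements.
noisy = [0.1, -0.1, 0.0, 0.1, 0.9, 1.1, 1.0, 0.9]
label = nearest(measure(phi, noisy), compressed)  # -> 1
```

The accuracy of such a scheme depends on how well the measurement matrix separates the classes, which is exactly the front-end design question the abstract raises.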
Cyber-attacks are cheap, easy to conduct and often pose little risk in terms of attribution, but their impact can be lasting. Attribution is low because tracing cyber-attacks is primitive in the current network architecture. Moreover, even when attribution is known, the absence of enforcement provisions in international law makes cyber-attacks tough to litigate, and hence attribution is hardly...
The term Big Data refers to very large data whose volume, variability, and velocity make it very hard to manage, process or analyze. Hadoop is utilized to analyze this kind of data. However, processing is very time-consuming. To resolve this problem and to reduce replication time, one solution is to execute the job partially, where an approximate, early...
Effective bandwidth management in multi-service computer networks, such as university networks, has become a challenge in recent years. The growth of internet traffic and the limitation of bandwidth resources push information technology (IT) managers to focus on effective bandwidth allocation policies. One of the important issues in this domain is how to assign bandwidth fairly...
The problem of multilevel diversity coding with regeneration is considered in this work. Two new outer bounds on the optimal tradeoffs between the normalized storage capacity and repair bandwidth are established, by which the optimality of separate coding at the minimum-bandwidth-regeneration (MBR) point follows immediately. This resolves a question left open in a previous work by Tian and Liu.
Due to the explosive growth of Internet applications, the amount of data has increased enormously. In order to store and process this big data more efficiently, the solid-state drive (SSD) has replaced the hard disk drive (HDD) as the primary storage medium. In spite of its high internal bandwidth, the SSD has a performance bottleneck at the host interface, whose bandwidth is relatively low. To overcome the problem...
The next generation of wireless cellular communication networks will be completely internet protocol (IP) based. This leads to the evolution of cellular networks into software-controlled hierarchical abstract networks. Software control enables the extraction of large volumes of network usage and transaction-related information in the form of log files, configuration files, database entries, etc....
This paper illustrates the nonlinear design and realization of an L-band broadband pulsed high-power amplifier delivering a peak output power of 45 W at a centre frequency of 1.25 GHz with 20% bandwidth. To account for the nonlinear behavior of devices under large-signal conditions, the extracted load-pull data is used to design both stages of the power amplifier. The power amplifier has been realized on a single...
In this paper, a robust blind-extraction audio watermarking scheme based on Improved Spread Spectrum (ISS) and Quadrature Phase Shift Keying (QPSK) is proposed. The frequency of the direct spread spectrum sequence signal is reduced to limit its bandwidth, and the signal is then modulated to a higher frequency band with QPSK so as to lessen the watermark's audible impact on the listener while improving the signal-to-noise ratio...
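For context, the core ISS step the scheme builds on can be sketched as follows (a standard ISS embedding/detection rule for one bit, with illustrative parameters; the abstract's bandwidth reduction and QPSK modulation stages are omitted):

```python
# Sketch of Improved Spread Spectrum (ISS) embedding and blind detection
# of one watermark bit b in {-1, +1}. With lam=1 the projection of the
# host onto the PN sequence is cancelled (host-interference rejection),
# so the sign of the correlation recovers the bit. Parameters and the
# host frame are illustrative, not the paper's.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def iss_embed(x, u, b, alpha=0.5, lam=1.0):
    """s = x + (alpha*b - lam*<x,u>/<u,u>) * u  (ISS embedding rule)."""
    xu = dot(x, u) / dot(u, u)
    g = alpha * b - lam * xu
    return [xi + g * ui for xi, ui in zip(x, u)]

def iss_detect(s, u):
    """Blind detection: the sign of the correlation with u is the bit."""
    return 1 if dot(s, u) >= 0 else -1

pn = [1, -1, -1, 1, 1, 1, -1, -1]                    # PN spreading sequence
host = [0.3, -0.2, 0.5, 0.1, -0.4, 0.2, 0.0, -0.1]   # host audio frame

marked = iss_embed(host, pn, b=-1)
bit = iss_detect(marked, pn)  # recovers -1
```

Lowering lam trades host-interference rejection against embedding distortion, which is the knob ISS adds over plain spread spectrum.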
Linked Data can be published as RDF documents or embedded in HTML documents. A linked data crawler is a program that discovers published linked data on the web by following RDF links. Note that some RDF documents are reachable only through HTML documents. Therefore, linked data crawlers need to follow HTML links in addition to RDF links to be able to discover such RDF documents as well as...
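The crawling behavior the abstract describes can be sketched as a breadth-first traversal that enqueues both HTML and RDF links, so an RDF document linked only from an HTML page is still found. The web is mocked as a dict with made-up URLs; a real crawler would fetch each URL and dispatch on its Content-Type.

```python
# Sketch of a linked-data crawler that follows both RDF links and HTML
# links. WEB mocks the web as url -> (document kind, outgoing links);
# the URLs are hypothetical.

from collections import deque

WEB = {
    "http://ex.org/index.html": ("html", ["http://ex.org/about.html",
                                          "http://ex.org/data1.rdf"]),
    "http://ex.org/about.html": ("html", ["http://ex.org/data2.rdf"]),
    "http://ex.org/data1.rdf":  ("rdf",  ["http://ex.org/data3.rdf"]),
    "http://ex.org/data2.rdf":  ("rdf",  []),
    "http://ex.org/data3.rdf":  ("rdf",  []),
}

def crawl(seed):
    """BFS from seed; return the set of discovered RDF document URLs."""
    seen, found = {seed}, set()
    queue = deque([seed])
    while queue:
        url = queue.popleft()
        kind, links = WEB.get(url, ("html", []))
        if kind == "rdf":
            found.add(url)
        for nxt in links:            # follow HTML *and* RDF links
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return found

rdf_docs = crawl("http://ex.org/index.html")
```

In this mock web, data2.rdf is reachable only via about.html, so a crawler that followed RDF links alone would miss it.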
Logs are typically used for performing post-mortem analysis of abnormal activities. Most Internet service providers keep the history of users' web accesses as proxy logs for investigating misuse or fraud. However, the majority of the logs represent normal behavior and are usually never thoroughly analyzed, yet keeping them in storage consumes a great deal of space. This paper analyzes...