In Edge and Fog Computing environments, it is common to design and test distributed algorithms that implement scheduling and load balancing solutions. The operational paradigm that usually fits this context requires users to call the closest node to execute a task, and since the service...
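The call-to-the-closest-node step can be sketched as follows; this is a minimal illustration, and the 2-D coordinate model and node list are assumptions, not the paper's actual system:

```python
def closest_node(user, nodes):
    """Return the fog node nearest to the user, using squared
    Euclidean distance over assumed 2-D coordinates."""
    return min(nodes, key=lambda n: (n[0] - user[0]) ** 2 + (n[1] - user[1]) ** 2)
```

For example, a user at the origin would be served by the nearer of two candidate nodes.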
01a Journal article
-
-
In this paper, a colorimetric analysis of biochemical samples has been realized by developing an easy-to-use smartphone-based colorimetric sensing Android application that can measure the molar concentration of a biochemical liquid analyte. The designed application can be used for on-site testing and...
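One common way a colorimetric app maps measured color intensity to molar concentration is the Beer-Lambert law; the sketch below assumes that model, with calibration constants (molar absorptivity, path length) that the abstract does not specify:

```python
import math

def molar_concentration(i_sample, i_blank, epsilon, path_cm):
    """Beer-Lambert law: A = log10(I_blank / I_sample) = epsilon * l * c,
    hence c = A / (epsilon * l). Inputs are measured channel intensities;
    epsilon and path_cm come from calibration (assumed known here)."""
    absorbance = math.log10(i_blank / i_sample)
    return absorbance / (epsilon * path_cm)
```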
-
In this paper, we consider a load balancing protocol based on the power of random choices, adapted to a fog deployment in which several independent fog nodes, each equipped with a set of servers or VMs, serve the same geographical area. The protocol is based on a simple but effective mechanism...
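The power-of-random-choices mechanism can be sketched with a minimal queue model: sample `d` nodes uniformly at random and dispatch the task to the least loaded of the sample. The names and the counter-based load model are illustrative assumptions:

```python
import random

def dispatch(queue_lengths, d=2):
    """Sample d nodes uniformly at random and enqueue the task
    at the least loaded of the sampled ones; return its index."""
    sampled = random.sample(range(len(queue_lengths)), d)
    target = min(sampled, key=lambda i: queue_lengths[i])
    queue_lengths[target] += 1
    return target
```

Sampling only `d` nodes keeps the mechanism cheap while still steering most tasks away from congested nodes.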
-
Smart cities represent an archetypal example of infrastructures where the fog computing paradigm can express its potential: a large set of sensors is deployed over a wide geographic area, and their data should be pre-processed (e.g., to extract relevant information or to filter and aggregate data...
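Filtering and aggregation at a fog node can be as simple as windowed averaging of raw sensor readings before forwarding them upstream; the sketch below is an illustrative baseline, not the paper's actual pipeline:

```python
def window_average(readings, window):
    """Reduce a stream of sensor readings to one mean value per
    non-overlapping window, cutting the volume sent to the cloud."""
    return [
        sum(readings[i:i + window]) / window
        for i in range(0, len(readings) - window + 1, window)
    ]
```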
-
We revisit Byzantine-tolerant reliable broadcast algorithms with an honest dealer in multi-hop networks. To tolerate Byzantine faulty nodes arbitrarily spread over the network, previous solutions require a factorial number of messages to be sent over the network if the messages are not authenticated (...
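For intuition: with an honest dealer, a node can safely accept a value once identical copies arrive over at least f+1 node-disjoint paths, since at most f of those paths can contain a Byzantine node. A toy acceptance rule, with the path bookkeeping omitted and the input assumed to be one copy per disjoint path:

```python
from collections import Counter

def accept(copies, f):
    """Accept a broadcast value once some value is supported by at
    least f + 1 copies, each from a distinct node-disjoint path;
    return None while the threshold is not met."""
    if not copies:
        return None
    value, count = Counter(copies).most_common(1)[0]
    return value if count >= f + 1 else None
```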
-
Software vulnerabilities represent one of the main weaknesses of an Information Technology (IT) system with respect to cyber attacks. Nowadays, consolidated official data, such as the Common Vulnerabilities and Exposures (CVE) dictionary, provide precise and reliable details about them. This information, together...
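As a toy example of exploiting such structured vulnerability data, one can filter records by severity score; the record layout below is a hypothetical simplification, not the actual CVE/NVD schema:

```python
def high_risk_ids(records, threshold=7.0):
    """Return the identifiers of vulnerability records whose severity
    score meets the threshold (hypothetical {'id', 'score'} dicts)."""
    return [r['id'] for r in records if r['score'] >= threshold]
```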
-
The imputation of missing values in the detailed data of Educational Institutions is a difficult task. These data contain multivariate time series, which cannot be satisfactorily imputed by many existing imputation techniques. Moreover, almost all the data of an Institution are interconnected: the...
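A baseline that many generic techniques reduce to is per-series linear interpolation; the sketch below shows that baseline, which also illustrates its limit here, since values of other, interconnected series are ignored:

```python
def interpolate(series):
    """Fill interior None gaps of one yearly series by linear
    interpolation between the nearest known values; data from
    other series of the same Institution are not used."""
    known = [i for i, v in enumerate(series) if v is not None]
    out = list(series)
    for a, b in zip(known, known[1:]):
        for j in range(a + 1, b):
            out[j] = series[a] + (series[b] - series[a]) * (j - a) / (b - a)
    return out
```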
-
The heterogeneity of Higher Education (HE) Institutions is one of the main critical issues in the assessment of their performance. This paper adopts a multi-level and multi-dimensional perspective, combining national (macro) and institution (micro) level data, and measuring both research and...
-
This paper describes the definition of data quality procedures for knowledge organizations such as Higher Education Institutions. The main purpose is to present the flexible approach developed for monitoring the data quality of the European Tertiary Education Register (ETER) database, illustrating...
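Data quality monitoring of this kind typically includes rule-based plausibility checks on the collected records; a minimal sketch, where the variable names and bounds are assumptions and not ETER's actual rules:

```python
def flag_records(records, max_ratio=100.0):
    """Flag institution records with invalid counts or an implausibly
    high students-per-staff ratio (hypothetical rule set)."""
    flags = []
    for r in records:
        if r['students'] < 0 or r['staff'] <= 0:
            flags.append((r['id'], 'invalid count'))
        elif r['students'] / r['staff'] > max_ratio:
            flags.append((r['id'], 'ratio out of range'))
    return flags
```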
-
Probabilistic Discrete Choice Models (PDCMs) have been extensively used to interpret the behavior of heterogeneous decision makers who face discrete alternatives. The classification approach of Logical Analysis of Data (LAD) uses discrete optimization to generate patterns, which are logic formulas...
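A LAD pattern is a conjunction of literals over binarized features; a minimal sketch of building and evaluating one such pattern, where the example cut-points are hypothetical:

```python
def make_pattern(literals):
    """Build a LAD-style pattern: a conjunction of boolean literals,
    each a predicate over one (binarized) feature of an observation."""
    def pattern(x):
        return all(lit(x) for lit in literals)
    return pattern

# Hypothetical pattern: feature 0 >= 3 AND feature 1 < 1.
p = make_pattern([lambda x: x[0] >= 3, lambda x: x[1] < 1])
```

An observation is covered by the pattern only when every literal holds.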