Scientific computing communities often run their experiments using complex data- and compute-intensive workflows that utilize high performance computing (HPC), distributed clusters and specialized architectures targeting machine learning and artificial ...
This paper describes how we envision classifying events into the United Nations Sustainable Development Goals (SDGs) by utilizing machine learning techniques on global news data. We propose extracting data from a media intelligence platform using an ...
Knowledge Graphs and semantic technologies allow scientists and domain experts to model complex relations between data in a logically structured and machine-readable format. metaphactory is a platform that enables users to build these kinds of semantic ...
Our society is increasingly digital, and its processes are increasingly digitalized. As an emerging technology for the digital society, graphs provide a universal abstraction to represent concepts and objects, and the relationships between them. However,...
It is our great pleasure to welcome you to the 2023 ACM/SPEC Workshop on Serverless, Extreme-Scale, and Sustainable Graph Processing Systems. This is the first such workshop, aiming to facilitate the exchange of ideas and expertise in the broad field of ...
We propose an incremental change detection method for data center (DC) energy efficiency metrics and consider its application to the power usage efficiency (PUE) metric. In recent years, there has been an increasing focus on the sustainability of DCs and PUE ...
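For context (standard background, not drawn from this abstract), PUE is defined as the ratio of total facility energy to the energy consumed by the IT equipment itself, so values closer to 1 indicate less overhead from cooling and power distribution:

$$\mathrm{PUE} = \frac{E_{\text{total facility}}}{E_{\text{IT equipment}}} \geq 1$$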
Provenance provides data lineage and history of different transformations applied to a dataset. A complete trace of data provenance can enable the reanalysis, reproducibility, and reusability of features, which are essential for validating results and ...
Over the last few years, DevOps methodologies have promoted a more streamlined operationalization of software components in production environments. Infrastructure as Code (IaC) technologies play a key role in the lifecycle management of applications, ...
It is important for developers to understand the performance of a software project as they develop new features, fix bugs, and try to generally improve the product. At MongoDB we have invested in building a performance infrastructure to support our ...
Microbenchmarking is a widely used method for evaluating the performance of a piece of code. However, the results of microbenchmarks for applications that utilize the Java Virtual Machine (JVM) are often unstable during the initial phase of execution, ...
Stable and repeatable measurements are essential for comparing the performance of different systems or applications, and benchmarks are used to ensure accuracy and replication. However, if the corresponding measurements are not stable and repeatable, ...
MongoDB has invested in developing a performance infrastructure and a corresponding performance culture. All development engineers are expected to improve MongoDB performance, through adding performance tests, optimizing code, and fixing performance ...
Change point detection has recently gained popularity as a method of detecting performance changes in software due to its ability to cope with noisy data. In this paper we present Hunter, an open source tool that automatically detects performance ...
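As a rough illustration of change point detection on noisy benchmark timings (a generic sketch, not necessarily the algorithm Hunter implements; the data and variable names below are illustrative assumptions), an off-the-shelf detector such as the ruptures Python package can be applied directly to a series of measurements:

```python
# Generic change point detection sketch (not Hunter's implementation).
# Assumes the `ruptures` package is installed: pip install ruptures
import numpy as np
import ruptures as rpt

# Hypothetical benchmark timings with a mean shift after run 100
rng = np.random.default_rng(0)
timings = np.concatenate([
    rng.normal(100, 5, size=100),   # baseline latency (ms)
    rng.normal(115, 5, size=100),   # after a hypothetical regression
])

# PELT with an RBF cost flags shifts in the distribution of the series
algo = rpt.Pelt(model="rbf").fit(timings)
change_points = algo.predict(pen=10)  # end indices of detected segments
print(change_points)
```

The appeal of such detectors for performance data is that they report distribution-level shifts rather than reacting to individual noisy outliers.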
Information systems that support user interaction, especially social applications with many functions, may omit filtering of user input data due to functional cross-references and improper switch-logic settings. This leads to a series ...
The Head-Related Transfer Function (HRTF) describes the acoustic reflection and diffraction effects caused by the human body (head, torso, etc.) as sound waves travel to the human ear. In Virtual Reality (VR) / Augmented Reality (...
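As standard background (not specific to this paper), spatial audio is typically rendered by convolving a source signal $x$ with the left- and right-ear head-related impulse responses corresponding to the HRTF:

$$y_{L}(t) = (x * h_{L})(t), \qquad y_{R}(t) = (x * h_{R})(t)$$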
Building an effective fashion recommendation system is challenging due to the high level of subjectivity and the semantic complexity of the features involved. Users’ decisions depend largely on their interests and the appearance of the product. Such ...
Team formation is concerned with the identification of a group of experts who have a high likelihood of effectively collaborating with each other in order to satisfy a collection of input skills. Solutions to this task have mainly adopted graph operations ...
Users frequently interact with software systems through data entry forms. However, form filling is time-consuming and error-prone. Although several techniques have been proposed to auto-complete or pre-fill fields in the forms, they provide limited ...
Graph Neural Networks (GNNs) such as Graph Convolutional Networks (GCNs) can effectively learn node representations by aggregating neighbors based on the relation graph. However, with a few exceptions, most of the previous work in this line does not ...
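For reference, the standard GCN propagation rule (Kipf and Welling; general background, not taken from this abstract) aggregates neighbor features as

$$H^{(l+1)} = \sigma\!\left(\tilde{D}^{-1/2}\,\tilde{A}\,\tilde{D}^{-1/2}\,H^{(l)}\,W^{(l)}\right),$$

where $\tilde{A} = A + I$ adds self-loops to the adjacency matrix and $\tilde{D}$ is its degree matrix.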
Collaborative filtering models have undoubtedly dominated the scene of recommender systems in recent years. However, due to their limited use of content information, they focus narrowly on accuracy, disregarding a higher degree of personalization. In the ...