Quadratic and Higher-Order Unconstrained Binary Optimization of Railway Dispatching Problem for Quantum Computing

This paper outlines QUBO and HOBO representations for dispatching problems in rail traffic management. The consequences of disruptions in railway traffic are a primary cause of passenger dissatisfaction. The main result is a hybrid algorithm for dealing with disturbances in rail traffic on single-, double-, and multi-track lines; a demonstrative model illustrates the issue briefly. …
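
To make the QUBO idea concrete, here is a minimal sketch of a toy dispatching QUBO (an assumption-laden illustration, not the paper's model: the variables x[(train, delay)], the delay grid, the penalty weight P, and the single-track conflict rule are all invented for this example):

    # Minimal sketch (assumption: toy encoding, not the paper's exact model) of a
    # QUBO for a two-train, single-track conflict.  x[(t, d)] = 1 iff train t is
    # assigned delay d; one-hot and minimum-headway constraints become penalties.
    from itertools import product

    delays = [0, 1, 2]          # candidate delays (hypothetical discretisation)
    trains = ["A", "B"]
    P = 10.0                    # penalty weight (assumed large vs. objective)
    var = {(t, d): i for i, (t, d) in enumerate(product(trains, delays))}
    n = len(var)
    Q = [[0.0] * n for _ in range(n)]

    def add(i, j, w):           # accumulate an upper-triangular QUBO term
        if i > j:
            i, j = j, i
        Q[i][j] += w

    # objective: minimise total delay (linear terms on the diagonal)
    for (t, d), i in var.items():
        add(i, i, float(d))

    # one-hot: P * (sum_d x_{t,d} - 1)^2 expanded into linear + quadratic penalties
    for t in trains:
        idx = [var[(t, d)] for d in delays]
        for i in idx:
            add(i, i, -P)
        for a in range(len(idx)):
            for b in range(a + 1, len(idx)):
                add(idx[a], idx[b], 2 * P)

    # headway: forbid both trains leaving with the same delay (toy conflict rule)
    for d in delays:
        add(var[("A", d)], var[("B", d)], P)

    def energy(x):
        return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(i, n))

    # a QUBO solver (classical or quantum annealer) would minimise this energy;
    # the toy instance is small enough to enumerate exhaustively
    best = min(product([0, 1], repeat=n), key=energy)
    print([k for k, i in var.items() if best[i]])   # e.g. [('A', 1), ('B', 0)]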

Reshaping Convex Polyhedra

Given a convex polyhedral surface P, we define a tailoring as excising from P a simple polygonal domain that contains one vertex v. We show that P can be reshaped to any convex polyhedral surface Q contained in conv(P) by a sequence of tailorings. …

Geometric averages of partitioned datasets

We introduce a method for jointly registering ensembles of partitioned datasets in a way which is both geometrically coherent and partition-aware. We establish basic theory in this general setting, including Alexandrov curvature bounds and a verifiable characterization of local means in symmetric products of metric spaces. …

IRS-Aided WPCNs: A New Optimization Framework for Dynamic IRS Beamforming

In this paper, we propose a new dynamic IRS beamforming framework to boost the sum throughput of an intelligent reflecting surface (IRS) aided wireless powered communication network (WPCN). Specifically, the IRS phase-shift vectors across time and the resource allocation are jointly optimized to enhance the efficiencies of both downlink wireless power transfer (DL WPT) and uplink wireless information transmission (UL WIT) between a hybrid access point (HAP) and multiple wirelessly powered devices. …

Self-Adversarial Training Incorporating Forgery Attention for Image Forgery Localization

Image editing techniques enable people to modify the content of an image without leaving visual traces and may thus pose serious security risks. Hence, the detection and localization of such forgeries become necessary and challenging. In this paper, we propose a self-adversarial training strategy and a reliable coarse-to-fine network that utilizes a self-attention mechanism to localize forged regions in forgery images. …

On Search Complexity of Discrete Logarithm

In this work, we study the discrete logarithm problem in the context of TFNP, the complexity class of search problems with a syntactically guaranteed existence of a solution for all instances. Our main results establish that suitable variants of the discrete logarithm problem …

On the Hardness of Compressing Weights

We investigate computational problems involving large weights through the lens of kernelization. Our main focus is the Weighted Clique problem, where we are given an edge-weighted graph and the goal is to detect a clique of total weight equal to a prescribed value. …

A Leap among Entanglement and Neural Networks: A Quantum Survey

In recent years, Quantum Computing has witnessed massive improvements in resources and algorithm development. The ability to harness quantum phenomena to solve computational problems is a long-standing dream that has drawn the scientific community’s interest since the late ’80s. In such a context, we pose our contribution to quantum computing. …

OptiMic: A tool to generate optimized polycrystalline microstructures for materials simulations

Polycrystal microstructures, with their distinct physical, chemical, structural, and topological entities, play an important role in determining the effective properties of materials. The software allows for both monodispersive grains and irregular grains, currently obtained via Voronoi tessellations. These initial microstructures can then be optimized to reflect desired statistical features. …
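
As a point of reference for the Voronoi-based starting microstructures mentioned above, here is a minimal sketch (assuming random 2D seed points and scipy's Voronoi; OptiMic's own generation, periodic boundaries, and subsequent optimization are more involved):

    # Minimal sketch of a Voronoi-based starting microstructure (assumption:
    # random seed points in 2D via scipy; purely illustrative of the idea).
    import numpy as np
    from scipy.spatial import Voronoi

    rng = np.random.default_rng(42)
    seeds = rng.uniform(0.0, 1.0, size=(50, 2))   # 50 grain seed points in a unit square
    vor = Voronoi(seeds)

    # each seed owns one Voronoi region; its vertices outline the grain
    n_bounded = sum(1 for r in vor.regions if r and -1 not in r)
    print(f"{len(seeds)} grains, {n_bounded} with fully bounded cells")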

Long-Short Transformer: Efficient Transformers for Language and Vision

Long-Short Transformer (Transformer-LS) is an efficient self-attention mechanism for modeling long sequences with linear complexity. It aggregates a novel long-range attention with dynamic projection to model distant correlations and a short-term attention to capture fine-grained local correlations. The method outperforms state-of-the-art models on multiple tasks in the language and vision domains, including the Long Range Arena benchmark, autoregressive language modeling, and ImageNet classification. …
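
A simplified sketch of the two attention branches described above (this follows the general idea only; the projection weights W_p, the window radius, and the final averaging of the branches are illustrative assumptions rather than the paper's exact layer):

    # Simplified numpy sketch of long-range (dynamically projected) attention
    # plus short-term (windowed) attention; the real layer fuses both inside
    # one softmax with additional normalization.
    import numpy as np

    def softmax(x, axis=-1):
        e = np.exp(x - x.max(axis=axis, keepdims=True))
        return e / e.sum(axis=axis, keepdims=True)

    rng = np.random.default_rng(0)
    n, d, r, w = 16, 8, 4, 2      # sequence length, dim, projection rank, window radius
    Q = rng.normal(size=(n, d)); K = rng.normal(size=(n, d)); V = rng.normal(size=(n, d))

    # long-range branch: dynamically project K, V down to r "landmark" rows
    W_p = rng.normal(size=(d, r))
    P = softmax(K @ W_p, axis=0).T            # (r, n), data-dependent pooling weights
    K_lr, V_lr = P @ K, P @ V                 # (r, d) compressed keys/values
    long_out = softmax(Q @ K_lr.T / np.sqrt(d)) @ V_lr

    # short-term branch: each position attends only to a local window of radius w
    short_out = np.zeros_like(Q)
    for i in range(n):
        lo, hi = max(0, i - w), min(n, i + w + 1)
        a = softmax(Q[i] @ K[lo:hi].T / np.sqrt(d))
        short_out[i] = a @ V[lo:hi]

    # aggregate the two views (a plain mean, used here purely for illustration)
    out = 0.5 * (long_out + short_out)
    print(out.shape)                          # (16, 8)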

MAJORITY-3SAT (and Related Problems) in Polynomial Time

Majority-SAT is the problem of determining whether an input $n$-variable formula in conjunctive normal form (CNF) has at least $2^{n-1}$ satisfying assignments. We prove that Majority-$k$SAT is in P for every constant $k$. In particular, we give an algorithm that can determine whether a given $k$-CNF has at least $2^{n-1}$ satisfying assignments in deterministic linear time (whereas the previous best-known algorithm ran in exponential time). Our algorithms have interesting positive implications for counting complexity and the complexity of inference, significantly reducing the known complexities of related problems. …
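
For intuition about the problem statement, a brute-force check of the Majority-$k$SAT condition on a tiny formula looks as follows (purely illustrative; the paper's contribution is precisely that this exponential enumeration can be avoided for fixed $k$):

    # Brute-force illustration of the Majority-kSAT question (assumption: this
    # exhaustive check is only for intuition, not the paper's linear-time algorithm).
    from itertools import product

    def count_sat(clauses, n):
        """Count assignments of n Boolean variables satisfying a CNF.
        Literal +i means variable i is true, -i means it is false (1-indexed)."""
        total = 0
        for assign in product([False, True], repeat=n):
            if all(any(assign[abs(l) - 1] == (l > 0) for l in clause) for clause in clauses):
                total += 1
        return total

    # a tiny 3-CNF over 3 variables: (x1 v x2 v x3) & (~x1 v x2 v ~x3)
    clauses, n = [(1, 2, 3), (-1, 2, -3)], 3
    sat = count_sat(clauses, n)
    print(sat, sat >= 2 ** (n - 1))   # 6 True -> a "yes" instance of Majority-3SAT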

Decision problems for origin-close top-down tree transducers (full version)

Tree transductions are binary relations over finite trees. For tree transductions defined by non-deterministic top-down tree transducers, the inclusion, equivalence, and synthesis problems are known to be undecidable. We introduce a similarity measure for transducers with origin semantics. We show that the inclusion and equivalence problems become decidable for origin-close non-deterministic top-down tree transducers. …

Garbage, Glitter, or Gold: Assigning Multi-dimensional Quality Scores to Social Media Seeds for Web Archive Collections

The Web is plagued by reference rot, which causes important Web resources to disappear. Web archive collections help reduce the costly effects of reference rot by saving Web resources that chronicle important stories and events before they disappear. The quality of social media content varies widely; therefore, we propose a framework for assigning multi-dimensional quality scores to social media seeds. …

Terminologies, Archaeological Data Models, and Documentary Thesauri

The ISO 25964 standard is flexible enough to permit the comparison and linking of different scientific or documentary “points of view”. The tool is already adapted to Linked Data. The challenges that remain to be met on this path do not prevent the thesaurus tool from already being a suitable support for complete “human-to-machine” interoperability, developed within the framework of the Bibracte Ville Ouverte project and exemplified through research on the ceramics of that archaeological site. …

Lucid: A Language for Control in the Data Plane

Programmable switch hardware makes it possible to move fine-grained control logic inside the network data plane. Lucid introduces abstractions that make it easy to write sophisticated data-plane applications with interleaved packet-handling and control logic. In a stateful firewall written in Lucid, we find that moving control from a switch’s CPU to its data-plane processor using Lucid reduces the latency of performance-sensitive operations by over 300x. …

CoReD: Generalizing Fake Media Detection with Continual Representation using Distillation

Continual learning is a growing field of research that examines how AI systems can learn sequentially from a continuous stream of linked data. Fake media such as deepfakes and synthetic face images have emerged as a significant concern for current multimedia technologies. We design CoReD to perform sequential domain adaptation tasks on new deepfake and GAN-generated synthetic face datasets, while effectively minimizing catastrophic forgetting in a teacher-student model setting. …
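
The teacher-student ingredient can be illustrated with the standard soft-label distillation objective (a generic sketch with assumed temperature and weighting; CoReD's actual loss terms and representation constraints are defined in the paper):

    # Generic teacher-student distillation loss (assumption: the standard KD
    # objective, shown only to illustrate the teacher-student setup).
    import torch
    import torch.nn.functional as F

    def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
        # hard-label term on the current (new-domain) data
        ce = F.cross_entropy(student_logits, labels)
        # soft-label term that keeps the student close to the frozen teacher
        kd = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                      F.softmax(teacher_logits / T, dim=1),
                      reduction="batchmean") * (T * T)
        return alpha * ce + (1 - alpha) * kd

    student_logits = torch.randn(8, 2)   # fake/real scores from the student
    teacher_logits = torch.randn(8, 2)   # scores from the teacher trained earlier
    labels = torch.randint(0, 2, (8,))
    print(distillation_loss(student_logits, teacher_logits, labels))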

Irregular Invertible Bloom Lookup Tables

We consider invertible Bloom lookup tables (IBLTs), probabilistic data structures that allow storing key-value pairs. An IBLT supports insertion and deletion of key-value pairs, as long as the number of key-value pairs stored in the IBLT does not exceed a certain threshold. …
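
A minimal sketch of the data structure (assuming three cells per key, integer keys and values, and a simple peeling decoder; production IBLTs also store a per-cell key checksum to detect false peels):

    # Minimal invertible Bloom lookup table sketch (illustrative, not the paper's
    # irregular construction): each cell keeps a count and XOR sums of keys/values.
    import hashlib

    class IBLT:
        def __init__(self, m, k=3):
            self.m, self.k = m, k
            self.count = [0] * m
            self.key_sum = [0] * m
            self.val_sum = [0] * m

        def _cells(self, key):
            # one cell per sub-table, so the k cells of a key are always distinct
            h = hashlib.sha256(str(key).encode()).digest()
            sub = self.m // self.k
            return [i * sub + int.from_bytes(h[4*i:4*i+4], "big") % sub
                    for i in range(self.k)]

        def _update(self, key, val, sign):
            for c in self._cells(key):
                self.count[c] += sign
                self.key_sum[c] ^= key
                self.val_sum[c] ^= val

        def insert(self, key, val):
            self._update(key, val, +1)

        def delete(self, key, val):
            self._update(key, val, -1)

        def list_entries(self):
            """Peel pure cells (count == 1) until nothing is left or we get stuck."""
            out, progress = [], True
            while progress:
                progress = False
                for c in range(self.m):
                    if self.count[c] == 1:
                        key, val = self.key_sum[c], self.val_sum[c]
                        out.append((key, val))
                        self.delete(key, val)
                        progress = True
            return out

    t = IBLT(m=21)
    t.insert(7, 70); t.insert(8, 80); t.insert(9, 90)
    print(sorted(t.list_entries()))   # [(7, 70), (8, 80), (9, 90)] with high probability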

Furthering a Comprehensive SETI Bibliography

Reyes & Wright used the NASA Astrophysics Data System (ADS) to start a comprehensive bibliography for SETI accessible to the public. Results were curated based on five SETI keyword searches: “SETI”, “technosignature”, “Fermi Paradox”, “Drake Equation”, and “extraterrestrial intelligence”. These keywords returned 553 publications that merited inclusion in the bibliography and were not previously present. …

Practical I/O-Efficient Multiway Separators

We revisit the fundamental problem of I/O-efficiently computing $r$-way separators on planar graphs. We show how our algorithm can be generalized and applied directly to Delaunay triangulations without relying on a Koebe embedding. Our algorithm outperforms traditional sweep-line-based algorithms on the digital elevation model of Denmark. …

Twin-width and polynomial kernels

A polynomial kernel for $k$-Dominating Set on graphs of twin-width at most 4 would contradict a standard complexity-theoretic assumption. On the positive side, we obtain a simple quadratic vertex kernel for Connected $k$-Vertex Cover and Capacitated $k$-Vertex Cover. The kernel applies to graphs of Vapnik-Chervonenkis density 1 and does not require a witness sequence. …

Thread-Modular Analysis of Release-Acquire Concurrency

We present a thread-modular abstract interpretation (TMAI) technique to verify programs under the release-acquire (RA) memory model for safety property violations. We capture the execution order of program statements as an abstract domain, and propose a sound upper approximation over this domain to efficiently reason over RA concurrency. …

Exact Analytical Parallel Vectors

This paper demonstrates that parallel vector curves are piecewise cubic rational curves in 3D piecewise linear vector fields. We discuss how singularities of the rationals lead to different types of intersections with tetrahedral cells. We define the term \emph{generalized and underdetermined eigensystem} in the form of $\mathbf{A}\mathbf{x}+\mathbf{a}=\lambda(\mathbf{B}\mathbf{x}+\mathbf{b})$ in order to derive the piecewise rational representation of 3D parallel vector curves. …
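
For intuition, here is the standard reading of where that eigensystem comes from (a sketch under the assumption that both vector fields are interpolated linearly inside each tetrahedron; the notation follows the reconstructed formula above, not the paper verbatim). Within a single cell both fields are affine in $\mathbf{x}$:

    \[
      v(\mathbf{x}) = \mathbf{A}\mathbf{x} + \mathbf{a}, \qquad
      w(\mathbf{x}) = \mathbf{B}\mathbf{x} + \mathbf{b} ,
    \]
    \[
      v(\mathbf{x}) \parallel w(\mathbf{x})
      \;\Longleftrightarrow\;
      \mathbf{A}\mathbf{x} + \mathbf{a} = \lambda\,(\mathbf{B}\mathbf{x} + \mathbf{b})
      \quad \text{for some } \lambda \in \mathbb{R} \ (w(\mathbf{x}) \neq \mathbf{0}),
    \]

which is generalized (two affine fields rather than one matrix) and underdetermined: three equations in the four unknowns $(\mathbf{x}, \lambda)$, so the solution set is generically a curve.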

CAP-RAM: A Charge-Domain In-Memory Computing 6T-SRAM for Accurate and Precision-Programmable CNN Inference

CAP-RAM is presented for energy-efficient convolutional neural network (CNN) inference. It leverages a novel charge-domain multiply-and-accumulate (MAC) mechanism and circuitry to achieve superior linearity under process variations compared to conventional in-memory computing (IMC) designs. A single 512×128 macro stores a complete pruned and quantized CNN model to achieve 98.8% inference accuracy on the MNIST data set and 89.0% on the CIFAR-10 data set, with a 573.4 giga operations per second (GOPS) peak throughput. …

Accessible Color Cycles for Data Visualization

Data were collected with an online survey, and the results were used to train a machine-learning model. Optimal color cycles containing six, eight, and ten colors were generated using the data-driven aesthetic-preference model and accessibility constraints. Due to the balance of aesthetics and accessibility considerations, the resulting color cycles can serve as reasonable defaults in data-plotting codes for data visualization. …

ATC: an Advanced Tucker Compression library for multidimensional data

ATC is a C++ library for advanced Tucker-based compression of numerical data. It is based on the sequentially truncated higher-order singular value decomposition (ST-HOSVD) and bit-plane truncation. Numerical results show that ATC maintains state-of-the-art Tucker compression rates, while providing average speed-ups of 2.6-3.6 and halving memory usage. …
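
The ST-HOSVD at the core of the library can be sketched in a few lines (assuming plain numpy SVDs and fixed target ranks; ATC adds adaptive rank selection, bit-plane truncation of the factors, and performance engineering on top of this idea):

    # Sketch of a sequentially truncated HOSVD (illustrative only; fixed ranks,
    # dense numpy SVDs, no error control).
    import numpy as np

    def st_hosvd(tensor, ranks):
        """Return (core, factors) with tensor ~= core x_0 U0 x_1 U1 x_2 U2 ..."""
        core, factors = tensor.copy(), []
        for mode, r in enumerate(ranks):
            # unfold the current core along `mode`
            unfolding = np.moveaxis(core, mode, 0).reshape(core.shape[mode], -1)
            U, _, _ = np.linalg.svd(unfolding, full_matrices=False)
            U = U[:, :r]
            factors.append(U)
            # immediately project (truncate) before processing the next mode
            rest = [s for i, s in enumerate(core.shape) if i != mode]
            core = np.moveaxis((U.T @ unfolding).reshape([r] + rest), 0, mode)
        return core, factors

    def reconstruct(core, factors):
        out = core
        for mode, U in enumerate(factors):
            unfolding = np.moveaxis(out, mode, 0).reshape(out.shape[mode], -1)
            rest = [s for i, s in enumerate(out.shape) if i != mode]
            out = np.moveaxis((U @ unfolding).reshape([U.shape[0]] + rest), 0, mode)
        return out

    X = np.random.default_rng(1).normal(size=(20, 18, 16))
    core, factors = st_hosvd(X, ranks=(8, 8, 8))
    err = np.linalg.norm(X - reconstruct(core, factors)) / np.linalg.norm(X)
    print(core.shape, round(err, 3))   # (8, 8, 8) and the relative reconstruction error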

TransformerFusion: Monocular RGB Scene Reconstruction using Transformers

TransformerFusion is a transformer-based 3D scene reconstruction approach. From an input monocular RGB video, the video frames are processed by a transformer network that fuses the observations into a volumetric feature grid representing the scene. This feature grid is then decoded into a higher-resolution scene reconstruction, using an MLP-based surface occupancy prediction from interpolated coarse-to-fine 3D features. …

State-Efficient QFA Algorithm for Quantum Computers

The Moore-Crutchfield quantum finite automaton (MCQFA) is proven to be exponentially more succinct than classical finite automata models. In this paper, we present a modified MCQFA algorithm for the language $\mathtt{MOD}_p$, the operators of which are selected based on the basis gates of the available real quantum computers. …
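
A toy simulation of the underlying MCQFA idea for $\mathtt{MOD}_p$ (an assumption-laden, single-qubit illustration: one Ry rotation per input letter and acceptance on measuring $|0\rangle$; bounded-error constructions combine several rotation angles, and the paper's point is choosing operators that map onto the native basis gates of real devices):

    # Toy single-qubit MCQFA simulation for MOD_p (illustrative only).
    import numpy as np

    def accept_probability(word_length, p, k=1):
        theta = 2 * np.pi * k / p            # rotation applied for each letter 'a'
        ry = np.array([[np.cos(theta / 2), -np.sin(theta / 2)],
                       [np.sin(theta / 2),  np.cos(theta / 2)]])
        state = np.array([1.0, 0.0])         # start in |0>
        for _ in range(word_length):
            state = ry @ state
        return state[0] ** 2                 # measure; accept on outcome |0>

    p = 5
    for i in range(1, 11):
        print(i, round(accept_probability(i, p), 3))
    # the acceptance probability is exactly 1 whenever i is a multiple of p = 5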

Demonstration of Faceted Search on Scholarly Knowledge Graphs

Traditional scholarly search systems list documents instead of providing direct answers to search queries. As the underlying data are not represented semantically, they are not machine-readable. Instead, a search over scholarly knowledge ends up as a full-text search, not a search in the content of the scholarly literature. …

A Data-Driven Method for Recognizing Automated Negotiation Strategies

Understanding an opponent agent helps in negotiating with it. We propose a novel data-driven approach for recognizing an opponent’s negotiation strategy. The approach includes a data generation method for an agent to generate domain-independent sequences by negotiating with a variety of opponents across domains, a feature engineering method for representing negotiation data as time series with time-step features and overall features, and a hybrid (recurrent neural network-based) deep learning method. …
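
An illustrative hybrid model matching that description might look as follows (the GRU-plus-dense architecture, feature dimensions, and label count are assumptions made for this sketch, not the paper's exact network):

    # Hybrid classifier over negotiation time series: a GRU encodes per-round
    # (time-step) features, a dense branch encodes whole-session (overall)
    # features, and a linear head predicts the opponent's strategy label.
    import torch
    import torch.nn as nn

    class StrategyRecognizer(nn.Module):
        def __init__(self, step_dim, overall_dim, n_strategies, hidden=32):
            super().__init__()
            self.rnn = nn.GRU(step_dim, hidden, batch_first=True)
            self.overall = nn.Sequential(nn.Linear(overall_dim, hidden), nn.ReLU())
            self.head = nn.Linear(2 * hidden, n_strategies)

        def forward(self, steps, overall):
            _, h = self.rnn(steps)                     # h: (1, batch, hidden)
            combined = torch.cat([h[-1], self.overall(overall)], dim=1)
            return self.head(combined)                 # logits over strategy labels

    model = StrategyRecognizer(step_dim=6, overall_dim=4, n_strategies=5)
    steps = torch.randn(2, 20, 6)      # 2 sessions, 20 negotiation rounds, 6 features each
    overall = torch.randn(2, 4)        # 4 whole-session features per session
    print(model(steps, overall).shape) # torch.Size([2, 5])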

Recovering the Unbiased Scene Graphs from the Biased Ones

Scene graph generation (SGG) aims to produce comprehensive, graphical representations describing visual relationships among objects. However, the imbalance in the fraction of missing labels across classes, or reporting bias, which exacerbates the long tail, is rarely considered and cannot be solved by existing debiasing methods. …

Analyzing a Knowledge Graph of Industry 4.0 Standards

In this article, we tackle the problem of standard interoperability across different standardization frameworks and devise a knowledge-driven approach. The STO ontology represents properties of standards and standardization. The I40KG integrates more than 200 standards and four standardization frameworks. We analyze both the number of discovered relations between standards and the accuracy of these relations. …

A Lottery Ticket Hypothesis Framework for Low-Complexity Device-Robust Neural Acoustic Scene Classification

We propose a novel neural model compression strategy combining data augmentation, knowledge transfer, pruning, and quantization for device-robust acoustic scene classification (ASC). Acoustic Lottery could compress an ASC model by a factor of more than $10^{4}$ while attaining superior performance (validation accuracy of 74.01% and log loss of 0.76) compared to its uncompressed seed model. …