By Hendrik Decker, Lenka Lhotská, Sebastian Link, Marcus Spies, Roland R. Wagner
This two-volume set, LNCS 8644 and LNCS 8645, constitutes the refereed proceedings of the 25th International Conference on Database and Expert Systems Applications, DEXA 2014, held in Munich, Germany, September 1-4, 2014. The 37 revised full papers presented together with 46 short papers and 2 keynote talks were carefully reviewed and selected from 159 submissions. The papers discuss a range of topics including: data quality; social web; XML keyword search; skyline queries; graph algorithms; information retrieval; XML; security; semantic web; classification and clustering; queries; social computing; similarity search; ranking; data mining; big data; approximations; privacy; data exchange; data integration; web semantics; repositories; partitioning; and business applications.
Read or Download Database and Expert Systems Applications: 25th International Conference, DEXA 2014, Munich, Germany, September 1-4, 2014. Proceedings, Part II PDF
Similar data mining books
The LNCS journal Transactions on Rough Sets is devoted to the entire spectrum of rough sets related issues, from logical and mathematical foundations, through all aspects of rough set theory and its applications, such as data mining, knowledge discovery, and intelligent information processing, to relations between rough sets and other approaches to uncertainty, vagueness, and incompleteness, such as fuzzy sets and theory of evidence.
Recent developments have greatly increased the volume and complexity of data available to be mined, leading researchers to explore new ways to glean non-trivial information automatically. Knowledge Discovery Practices and Emerging Applications of Data Mining: Trends and New Domains introduces the reader to recent research activities in the field of data mining.
This book constitutes the proceedings of the Second Asia Pacific Requirements Engineering Symposium, APRES 2015, held in Wuhan, China, in October 2015. The 9 full papers presented together with 3 tool demo papers and one short paper were carefully reviewed and selected from 18 submissions. The papers deal with various aspects of requirements engineering in the big data era, such as automated requirements analysis, requirements acquisition via crowdsourcing, requirement processes and specifications, and requirements engineering tools.
- Geographic Information Systems and Health Applications
- Practical Approaches to Causal Relationship Exploration
- Data Mining the Web: Uncovering Patterns in Web Content, Structure, and Usage
- Multi-objective evolutionary algorithms for knowledge discovery from databases
Extra info for Database and Expert Systems Applications: 25th International Conference, DEXA 2014, Munich, Germany, September 1-4, 2014. Proceedings, Part II
The results show that participating in activities to achieve social sustainability is most important for increasing awareness of sustainability. 5 Case Study III: Crime Mapping via Social Media A crime map is a tool that visualizes crime information based on the geographical location of crimes. In the earliest days, police used crime maps to recognize the inherent geographical component of crime by sticking pins into maps displayed on walls, where each pin represented a crime incident.
Although we cannot conclude that two authors in a similar domain correspond to the same person, it is unlikely for the same person to publish in very distinct research areas. Information such as the title, journal/conference name, and keywords depicts the domain characteristics of a given author. We concatenate all these text features to obtain a virtual document. By generating a virtual document for each individual author, we obtain a virtual corpus with which to model domain features. We utilize two classic approaches in text modeling: TF/IDF-weighted vectors and topic vectors inferred by Latent Dirichlet Allocation.
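A minimal sketch of the virtual-document idea with TF/IDF weighting; the author IDs and paper texts here are hypothetical toy data, and the excerpt's LDA topic vectors are omitted:

```python
import math
from collections import Counter

# Hypothetical author records: each author's titles/venue names/keywords
# are concatenated into one "virtual document" per author.
authors = {
    "a1": ["skyline queries over graph data", "graph algorithms for big data"],
    "a2": ["privacy preserving data integration", "secure data exchange"],
}
virtual_docs = {a: " ".join(texts).split() for a, texts in authors.items()}

def tfidf(docs):
    """Return a TF/IDF-weighted vector (dict of word -> weight) per document."""
    n = len(docs)
    df = Counter()                      # document frequency of each word
    for words in docs.values():
        df.update(set(words))
    vectors = {}
    for name, words in docs.items():
        tf = Counter(words)
        vectors[name] = {w: (c / len(words)) * math.log(n / df[w])
                         for w, c in tf.items()}
    return vectors

vecs = tfidf(virtual_docs)
```

With this plain IDF, a word appearing in every virtual document (here "data") gets zero weight, so the vectors emphasize domain-specific terms, which is what makes them useful for comparing authors' research areas.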
Each feature vector corresponds to a pair of virtual publications. However, not all papers have complete information. We give each attribute an identity and encode the feature vector using the combination of identities of the attributes shared by the two virtual publications. For example, if publication A has title, year, venue, and keywords, and publication B has title, year, venue, and affiliation, their feature vector is encoded as "124". Although there are 31 possible encodings in theory, we find only 20 feature encodings in the practical academic social network.
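The identity-based pair encoding can be sketched as follows. The specific attribute-to-identity mapping (title=1, year=2, venue=4, keywords=8, affiliation=16) is an assumption chosen to reproduce the "124" example; with 5 attributes there are 2^5 - 1 = 31 non-empty combinations, matching the 31 theoretical encodings:

```python
# Assumed identity assignment; the excerpt only shows the "124" example.
IDENTITY = {"title": 1, "year": 2, "venue": 4, "keywords": 8, "affiliation": 16}

def encode(attrs_a, attrs_b):
    """Encode a publication pair by the identities of their shared attributes."""
    shared = sorted(IDENTITY[a] for a in attrs_a & attrs_b)
    return "".join(str(i) for i in shared)

pub_a = {"title", "year", "venue", "keywords"}
pub_b = {"title", "year", "venue", "affiliation"}
code = encode(pub_a, pub_b)  # shared attributes: title, year, venue
```

Here the shared attributes are title, year, and venue, whose identities 1, 2, and 4 concatenate to the encoding "124" from the excerpt.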