004 Data processing; Computer science
Refine
Year of publication
Document Type
- Article (80)
Language
- English (80)
Keywords
- Computer Science Education (4)
- Competence Measurement (3)
- Secondary Education (3)
- Big Data (2)
- Competence Modelling (2)
- Computational thinking (2)
- Informatics Education (2)
- Informatics Modelling (2)
- Informatics System Application (2)
- Informatics System Comprehension (2)
- Machine learning (2)
- computational thinking (2)
- knowledge representation and nonmonotonic reasoning (2)
- (FPGA) (1)
- 21st century skills (1)
- ABRACADABRA (1)
- Achievement (1)
- Activity Theory (1)
- Activity-orientated Learning (1)
- Advanced Video Codec (AVC) (1)
- Animal building (1)
- Arduino (1)
- Assessment (1)
- Augmented and virtual reality (1)
- Austria (1)
- Automated Theorem Proving (1)
- Automatically controlled windows (1)
- Automatisches Beweisen (1)
- Bean (1)
- Bloom’s Taxonomy (1)
- CS concepts (1)
- Capability approach (1)
- Challenges (1)
- Clause Learning (1)
- Cognitive Skills (1)
- Competences (1)
- Competencies (1)
- Computational Thinking (1)
- Computer Science (1)
- Computer Science in Context (1)
- Computing (1)
- Contest (1)
- Contextualisation (1)
- Contradictions (1)
- Convolution (1)
- Curriculum (1)
- Curriculum Development (1)
- Customer ownership (1)
- DPLL (1)
- Data Analysis (1)
- Data Management (1)
- Data Privacy (1)
- Databases (1)
- Defining characteristics of physical computing (1)
- Digital Competence (1)
- Digital Education (1)
- Digital Revolution (1)
- Digital image analysis (1)
- Digitalization (1)
- Dynamic assessment (1)
- Early Literacy (1)
- Educational Standards (1)
- Educational software (1)
- Embedded Systems (1)
- Euclid’s algorithm (1)
- FPGA (1)
- Facebook (1)
- Feature extraction (1)
- Fibonacci numbers (1)
- Field programmable gate arrays (1)
- Finite automata (1)
- Function (1)
- Fundamental Ideas (1)
- Graphensuche (1)
- H.264 (1)
- Hardware accelerator (1)
- Histograms (1)
- ICT Competence (1)
- ICT competencies (1)
- ICT skills (1)
- Image resolution (1)
- Imperative calculi (1)
- Improving classroom (1)
- Inference (1)
- Informatics (1)
- Inquiry-based Learning (1)
- Insurance industry (1)
- Interface design (1)
- Kernel (1)
- Key Competencies (1)
- Klausellernen (1)
- Learners (1)
- Learning Fields (1)
- Learning ecology (1)
- Learning interfaces development (1)
- Learning with ICT (1)
- Lindenmayer systems (1)
- Logarithm (1)
- Loss (1)
- Low Latency (1)
- Lower Secondary Level (1)
- MOOCs (1)
- Machine Learning (1)
- Massive Open Online Courses (1)
- Measurement (1)
- Media in education (1)
- Multi-sided platforms (1)
- Music Technology (1)
- NUI (1)
- Natural Science Education (1)
- Natural ventilation (1)
- NoSQL (1)
- Norway (1)
- Novice programmers (1)
- Optimization (1)
- Pedagogical content knowledge (1)
- Pedagogical issues (1)
- Physical Science (1)
- Plant identification (1)
- Preprocessing (1)
- Problem Solving (1)
- Random access memory (1)
- Recommendations for CS-Curricula in Higher Education (1)
- Region of Interest (1)
- Relevanz (1)
- Reversibility (1)
- SAT (1)
- Scale-invariant feature transform (SIFT) (1)
- Sensors (1)
- Sharing (1)
- Signal processing (1)
- Simulations (1)
- Single event upsets (1)
- Small Private Online Courses (1)
- Social (1)
- Systems of parallel communicating (1)
- Tasks (1)
- Teacher perceptions (1)
- Teachers (1)
- Teaching information security (1)
- Technology proficiency (1)
- Terminology (1)
- Tests (1)
- Theorembeweisen (1)
- Theory (1)
- Type and effect systems (1)
- UX (1)
- Unifikation (1)
- VGG16 (1)
- Value network (1)
- Vocational Education (1)
- Young People (1)
- abstraction (1)
- action and change (1)
- algorithms (1)
- analogical thinking (1)
- answer set programming (1)
- architecture (1)
- automata (1)
- automated planning (1)
- bibliometric analysis (1)
- binary representation (1)
- binary search (1)
- citation analysis (1)
- classroom language (1)
- co-citation analysis (1)
- co-occurrence analysis (1)
- cognitive modifiability (1)
- combined task and motion planning (1)
- competence (1)
- competencies (1)
- competency (1)
- complexity (1)
- comprehension (1)
- computer science education (1)
- computer science teachers (1)
- computer vision (1)
- cs4fn (1)
- curriculum theory (1)
- determinism (1)
- developmental systems (1)
- digitally-enabled pedagogies (1)
- divide and conquer (1)
- e-mentoring (1)
- education (1)
- education and public policy (1)
- educational programming (1)
- educational systems (1)
- edutainment (1)
- environments (1)
- exponentiation (1)
- field-programmable gate array (1)
- formal languages (1)
- fun (1)
- functions (1)
- graph-search (1)
- hardware accelerator (1)
- hardware architecture (1)
- high school (1)
- higher (1)
- image processing (1)
- informal and formal learning (1)
- informatics education (1)
- innovation (1)
- interactive course (1)
- interactive workshop (1)
- key competences in physical computing (1)
- key competencies (1)
- kinaesthetic teaching (1)
- learning (1)
- logic programming methodology and applications (1)
- machine learning (1)
- machine learning algorithms (1)
- manipulation planning (1)
- mediated learning experience (1)
- mobile learning (1)
- mobile technologies and apps (1)
- monitoring (1)
- networks (1)
- online learning (1)
- operating system (1)
- organisational evolution (1)
- paper prototyping (1)
- parallel processing (1)
- parallel rewriting (1)
- parameter (1)
- pedagogy (1)
- personal (1)
- personal response systems (1)
- philosophical foundation of informatics pedagogy (1)
- physical computing tools (1)
- pre-primary level (1)
- predictive models (1)
- preprocessing (1)
- primary education (1)
- primary level (1)
- problem-solving (1)
- professional development (1)
- programming (1)
- programming in context (1)
- real-time (1)
- relevance (1)
- reliability (1)
- restricted parallelism (1)
- secondary computer science education (1)
- secondary education (1)
- self-adaptive multiprocessing system (1)
- self-efficacy (1)
- single event upset (1)
- social media (1)
- solar particle event (1)
- space missions (1)
- student activation (1)
- student experience (1)
- student perceptions (1)
- students’ conceptions (1)
- students’ knowledge (1)
- teacher competencies (1)
- teaching (1)
- teaching informatics in general education (1)
- technical notes and rapid communications (1)
- theorem (1)
- theory (1)
- tools (1)
- tracing (1)
- unification (1)
- user experience (1)
- user-centred (1)
- virtual reality (1)
- ‘unplugged’ computing (1)
Institute
- Institut für Informatik und Computational Science (80)
Current curricular trends require teachers in Baden-Wuerttemberg (Germany) to integrate Computer Science (CS) into traditional subjects such as Physical Science. However, concrete guidelines are missing. To fill this gap, we outline an approach in which a microcontroller is used to perform and evaluate measurements in the Physical Science classroom. Using the open-source Arduino platform, we expect students to acquire and develop both CS and Physical Science competencies by using a self-programmed microcontroller. In addition to this combined development of competencies in Physical Science and CS, the subject matter will be embedded in suitable contexts and learning environments, such as weather and climate.
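The kind of computation students would program on such a microcontroller can be sketched as follows; the sensor model (a TMP36 analog temperature sensor) and the conversion constants are illustrative assumptions, not taken from the paper:

```python
# Hypothetical helper mirroring what students would program on the Arduino:
# converting a raw 10-bit ADC reading from an analog temperature sensor
# (a TMP36 is assumed here) into degrees Celsius.
def adc_to_celsius(reading, vref=5.0, resolution=1024):
    """Convert a raw ADC value (0..1023) to a TMP36 temperature in Celsius."""
    voltage = reading * vref / resolution   # one ADC step in volts
    return (voltage - 0.5) * 100.0          # TMP36: 10 mV per degree, 500 mV offset

# A reading of 153 corresponds to roughly 0.75 V, i.e. about 24.7 degrees C.
print(round(adc_to_celsius(153), 1))
```

On the actual device the same formula would be applied to the value returned by `analogRead()`.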
How Things Work
(2015)
Recognizing and defining functionality is a key competence required in all kinds of programming projects. This study investigates how far students without specific informatics training are able to identify and verbalize functions and parameters. It presents observations from classroom activities on functional modelling in high-school chemistry lessons with a total of 154 students. Finally, it discusses the potential of functional modelling to improve the comprehension of scientific content.
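The functional view described above can be made concrete with a small sketch; the chemistry example (the ideal gas law) and all names are chosen here for illustration, not taken from the study:

```python
# The ideal gas law expressed as a function: the pressure depends on the
# inputs n (amount of substance), T (temperature) and V (volume), while
# R is a fixed parameter of the model.
R = 8.314  # J/(mol*K), universal gas constant

def pressure(n, T, V):
    """Return the pressure in pascal of n mol of ideal gas at T kelvin in V cubic metres."""
    return n * R * T / V

# 1 mol at 273.15 K in 0.0224 m^3 gives roughly standard pressure (~101 kPa).
print(round(pressure(1.0, 273.15, 0.0224)))
```

Identifying which quantities are inputs and which are fixed parameters is exactly the verbalization skill the study examines.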
ProtoSense
(2015)
This paper originated from discussions about the need for important changes in the curriculum for Computing, including two focus-group meetings at IFIP conferences over the last two years. The paper examines how recent developments in curriculum, together with insights from curriculum thinking in other subject areas, especially mathematics and science, can inform curriculum design for Computing. The analysis presented in the paper provides insights into the complexity of curriculum design and identifies important constraints and considerations for the ongoing development of a vision and framework for a Computing curriculum.
Exploratory Data Analysis
(2014)
In bioinformatics, the term exploratory data analysis refers to various methods for getting an overview of large biological data sets; it thus helps to create a framework for further analysis and hypothesis testing. The workflow facilitates this first, important step in the analysis of data produced by high-throughput technologies. The results are various plots showing the structure of the measurements. The goal of the workflow is to automate the exploratory data analysis while retaining flexibility. The basic tool is the free software R.
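The paper's workflow is built in R; purely as an illustration, a comparable "first overview" step might look like this in Python, computing the per-variable summary statistics that typically feed boxplot-style overview plots:

```python
import statistics

def summarize(values):
    """Return the five-point summary used in a boxplot-style data overview."""
    s = sorted(values)
    q1, q2, q3 = statistics.quantiles(s, n=4)  # the three quartile cut points
    return {"min": s[0], "q1": q1, "median": q2, "q3": q3, "max": s[-1]}

# Toy measurement vector standing in for one high-throughput variable.
expression = [2.1, 3.5, 2.8, 9.9, 3.0, 2.7, 3.2]
print(summarize(expression))
```

The large maximum relative to the quartiles is the kind of structural feature such an overview is meant to expose before hypothesis testing.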
Geocoder accuracy ranking
(2014)
Finding an address on a map is sometimes tricky: the chosen map application may be unfamiliar with the region in question. There are several geocoders on the market; they use different databases and algorithms to answer a query. Consequently, the geocoding results differ in quality. Fortunately, the geocoders provide a rich set of metadata. The workflow described in this paper compares this metadata with the aim of finding out which geocoder offers the best-fitting coordinate for a given address.
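The ranking idea can be sketched as follows; the metadata field names below are invented for illustration, and the actual metadata keys of real geocoders will differ:

```python
# Each geocoder returns its coordinate plus metadata; the best-fitting
# result is chosen by comparing the metadata. "Level" and "confidence"
# are hypothetical fields standing in for real geocoder metadata.
MATCH_LEVELS = {"house": 3, "street": 2, "city": 1, "unknown": 0}

def rank_results(results):
    """Order geocoder results: finer match level first, then higher confidence."""
    return sorted(results,
                  key=lambda r: (MATCH_LEVELS.get(r["level"], 0), r["confidence"]),
                  reverse=True)

results = [
    {"geocoder": "A", "level": "street", "confidence": 0.9},
    {"geocoder": "B", "level": "house",  "confidence": 0.7},
    {"geocoder": "C", "level": "city",   "confidence": 1.0},
]
print(rank_results(results)[0]["geocoder"])  # the house-level match wins
```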
In recent years, many efforts have been made to apply image processing techniques to plant leaf identification. However, categorizing leaf images at the cultivar/variety level is still a challenging task because of the very low inter-class variability. In this research, we propose an automatic discriminative method based on convolutional neural networks (CNNs) for classifying 12 different cultivars of common beans that belong to three different species. We show that employing advanced loss functions, such as Additive Angular Margin Loss and Large Margin Cosine Loss, instead of the standard softmax loss function can yield better discrimination between classes and thereby mitigate the problem of low inter-class variability. The method was evaluated by classifying species (level I), cultivars from the same species (level II), and cultivars from different species (level III), based on images of the leaf foreside and backside. The results indicate that the performance of the classification algorithm is superior on the leaf backside image dataset. The maximum mean classification accuracies of 95.86, 91.37 and 86.87% were obtained at levels I, II and III, respectively. The proposed method outperforms the previous relevant works and provides a reliable approach for plant cultivar identification.
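The Large Margin Cosine Loss (CosFace) mentioned above modifies the classification logits before the usual softmax cross-entropy: features and class weights are L2-normalised, a margin is subtracted from the cosine of the target class, and the result is scaled. A minimal sketch, with illustrative values for the scale s and margin m:

```python
import numpy as np

def cosface_logits(features, weights, labels, s=30.0, m=0.35):
    """Compute Large Margin Cosine Loss logits (illustrative s and m)."""
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    w = weights / np.linalg.norm(weights, axis=0, keepdims=True)
    cos = f @ w                                # cosine similarity to each class
    cos[np.arange(len(labels)), labels] -= m   # margin penalises the target class
    return s * cos                             # scaled logits for cross-entropy

rng = np.random.default_rng(0)
feats = rng.normal(size=(4, 8))   # 4 samples, 8-dim embeddings
wts = rng.normal(size=(8, 3))     # 3 classes (e.g. the three bean species)
print(cosface_logits(feats, wts, labels=np.array([0, 1, 2, 0])).shape)
```

Subtracting the margin forces the network to place same-class embeddings closer together in angle, which is what mitigates the low inter-class variability of cultivar images.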
Think logarithmically!
(2015)
We discuss here a number of algorithmic topics which we use in our teaching and learning of mathematics and informatics to illustrate and document the power of the logarithm in designing very efficient algorithms and computations; logarithmic thinking is one of the most important key competencies for solving real-world practical problems. We also demonstrate how to introduce the logarithm independently of mathematical formalism, using a conceptual model of reducing a problem's size by at least half. It is quite surprising that the idea which leads to the logarithm is present in Euclid's algorithm, described almost 2000 years before John Napier invented logarithms.
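The halving model can be put directly into code: counting how many halving steps it takes to shrink a problem of size n to 1 yields the base-2 logarithm, which is why halving-based algorithms such as binary search run in logarithmic time.

```python
def halving_steps(n):
    """Count how often a problem of size n can be halved before reaching 1."""
    steps = 0
    while n > 1:
        n //= 2      # one "reduce by at least half" step
        steps += 1
    return steps

print(halving_steps(1000))  # floor(log2(1000)) = 9
```

A problem of size 1000 is solved in only 9 halving steps; this conceptual model introduces the logarithm without any formal definition.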
The use of neural networks is considered the state of the art in the field of image classification. A large number of different networks are available for this purpose which, appropriately trained, permit a high level of classification accuracy. Typically, these networks are applied to uncompressed image data, since the corresponding training was also carried out on image data of similarly high quality. However, if image data contains image errors, the classification accuracy deteriorates drastically. This applies in particular to coding artifacts which occur due to image and video compression. Typical application scenarios for video compression are narrowband transmission channels for which video coding is required but a subsequent classification is to be carried out on the receiver side. In this paper we present a special H.264/Advanced Video Codec (AVC) based video codec that allows certain regions of a picture to be coded with near-constant picture quality in order to allow a reliable classification using neural networks, whereas the remaining image is coded at a constant bit rate. We have combined this feature with the ability to run with lowest-latency properties, which is usually also required in remote-control application scenarios. The codec has been implemented as a fully hardwired, High-Definition-capable video hardware architecture which is suitable for Field Programmable Gate Arrays.
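The region-of-interest coding described above can be sketched as a per-macroblock quantisation-parameter (QP) map; the QP values and frame layout below are illustrative only, not taken from the actual codec:

```python
# Macroblocks inside the region of interest get a fixed, low QP
# (near-constant, high picture quality for the classifier), while the
# rest of the frame uses a higher, rate-controlled QP. The values 22
# and 38 are illustrative assumptions.
def qp_map(width_mb, height_mb, roi, roi_qp=22, background_qp=38):
    """roi = (x0, y0, x1, y1) in macroblock coordinates, end-exclusive."""
    x0, y0, x1, y1 = roi
    return [[roi_qp if x0 <= x < x1 and y0 <= y < y1 else background_qp
             for x in range(width_mb)]
            for y in range(height_mb)]

qps = qp_map(8, 4, roi=(2, 1, 6, 3))
print(qps[2][3], qps[0][0])  # an ROI block vs a background block
```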
Geometric generalization is a fundamental concept in the digital mapping process. An increasing amount of spatial data is provided on the web, along with a range of tools to process it. This jABC workflow is used for the automatic testing of web-based generalization services such as mapshaper.org by executing their functionality, overlaying the datasets from before and after the transformation, and displaying them visually in a .tif file. Mostly web services and command-line tools are used to build an environment in which ESRI shapefiles can be uploaded, processed by a chosen generalization service, and finally visualized in IrfanView.
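Generalization services of this kind typically perform line simplification. A minimal sketch of the classic Douglas-Peucker algorithm, one of the methods offered by tools such as mapshaper (assumed here purely for illustration):

```python
def simplify(points, tolerance):
    """Douglas-Peucker: drop points closer than `tolerance` to the chord."""
    if len(points) < 3:
        return list(points)
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    length = (dx * dx + dy * dy) ** 0.5 or 1.0
    # perpendicular distance of each interior point to the start-end chord
    dists = [abs(dy * (x - x0) - dx * (y - y0)) / length for x, y in points[1:-1]]
    i = max(range(len(dists)), key=dists.__getitem__)
    if dists[i] <= tolerance:
        return [points[0], points[-1]]     # all interior points are droppable
    split = i + 1                          # keep the farthest point, recurse
    return simplify(points[:split + 1], tolerance)[:-1] + simplify(points[split:], tolerance)

line = [(0, 0), (1, 0.1), (2, -0.1), (3, 5), (4, 6), (5, 7), (6, 8.1), (7, 9)]
print(simplify(line, tolerance=1.0))
```

Overlaying `line` and its simplified version is exactly the before/after comparison the workflow automates for a remote generalization service.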