For students, the introductory phase of study represents a key stage of tertiary education. Subject knowledge is taught with little connection to practice, and students fail to recognise the relationships between the topics of the various lectures. To improve this situation, a workshop was developed that deepens the connection between programming and data structures. In it, the students independently develop the game Go-Moku as an Android app. The combination of software (Java, Android SDK) and hardware (tablet computer) for a small, realistic software project is a new experience for the students.
We investigate the ergodic properties of a random walker performing (anomalous) diffusion on a random fractal geometry. Extensive Monte Carlo simulations of the motion of tracer particles on an ensemble of realisations of percolation clusters are performed for a wide range of percolation densities. Single trajectories of the tracer motion are analysed to quantify the time-averaged mean squared displacement (MSD) and to compare this with the ensemble-averaged MSD of the particle motion. Other complementary physical observables associated with ergodicity are studied as well. It turns out that the time-averaged MSD of individual realisations exhibits non-vanishing fluctuations even in the limit of very long observation times as the percolation density approaches the critical value. This apparent non-ergodic behaviour concurs with the ergodic behaviour on the ensemble-averaged level. We demonstrate how the non-vanishing fluctuations in single-particle trajectories are analytically expressed in terms of the fractal dimension and the cluster size distribution of the random geometry, thus being of purely geometrical origin. Moreover, we reveal that the convergence scaling law to ergodicity, which is known to be inversely proportional to the observation time T for ergodic diffusion processes, follows a power law ∼T^(−h) with h < 1 due to the fractal structure of the accessible space. These results provide useful measures for differentiating subdiffusion on random fractals from an otherwise closely related process, namely fractional Brownian motion. Implications of our results for the analysis of single-particle tracking experiments are provided.
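The time-averaged MSD described in this abstract can be illustrated with a minimal sketch (lattice size, percolation density, and walk length below are arbitrary illustrative choices, not values from the study): a site-percolation lattice is generated, a "blind ant" tracer walks on it, and the time-averaged MSD of a single trajectory is evaluated.

```python
import numpy as np

rng = np.random.default_rng(1)

def percolation_lattice(L=64, p=0.65):
    """Random site-percolation lattice; True marks an accessible site."""
    return rng.random((L, L)) < p

def blind_ant_walk(lattice, T=5000):
    """'Blind ant' tracer: a move onto a blocked or out-of-bounds site
    is rejected and the walker stays put for that time step."""
    L = lattice.shape[0]
    steps = np.array([[1, 0], [-1, 0], [0, 1], [0, -1]])
    occupied = np.argwhere(lattice)
    pos = occupied[rng.integers(len(occupied))].copy()  # random occupied start
    traj = np.empty((T, 2), dtype=int)
    for t in range(T):
        traj[t] = pos
        new = pos + steps[rng.integers(4)]
        if (new >= 0).all() and (new < L).all() and lattice[new[0], new[1]]:
            pos = new
    return traj

def time_averaged_msd(traj, lag):
    """Time-averaged MSD at lag Delta: mean of |r(t+Delta) - r(t)|^2 over t."""
    d = traj[lag:] - traj[:-lag]
    return float((d ** 2).sum(axis=1).mean())
```

Averaging `time_averaged_msd` over many cluster realisations, versus over many lags of one trajectory, is the ensemble/time comparison the abstract refers to.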
The evolution of massive stars in very low metallicity galaxies is less well observationally
constrained than in environments more similar to the Milky Way, M33, or the LMC. We discuss
in this contribution the current state of our program to search for and characterize Wolf-Rayet stars (and other massive emission line stars) in low metallicity galaxies in the Local Volume.
PopIII-star siblings in IZw18 and metal-poor WR galaxies unveiled from integral field spectroscopy
(2015)
Here, we highlight our recent results from the IFS study of Mrk178, the closest metal-poor WR galaxy, and of IZw18, the most metal-poor star-forming galaxy known in the local Universe. The IFS data of Mrk178 show the importance of aperture effects on the search for WR features, and the extent to which physical variations in the ISM properties can be detected. Our IFS data of IZw18 reveal its entire nebular HeIIλ4686-emitting region, and indicate for the very first time that peculiar, hot (nearly) metal-free ionizing stars (called here PopIII-star siblings) might hold the key to the HeII-ionization in IZw18.
The main objective of this work is to investigate the evolution of massive stars, and the interplay between them and the ionized gas for a sample of local metal-poor Wolf-Rayet galaxies.
Optical integral field spectroscopy was used in combination with multi-wavelength radio data.
Combining optical and radio data, we locate Wolf-Rayet stars and supernova remnants across the Wolf-Rayet galaxies to study the spatial correlation between them. This study will shed light on massive star formation and its feedback, and will help us to better understand
distant star-forming galaxies.
We highlight the basic physics that allows fundamental parameters, such as the effective
temperature, luminosity, abundances, and mass-loss rate, of Wolf-Rayet (W-R) stars to be
determined. Since the temperature deduced from the spectrum of a W-R star is an ionization
temperature, a detailed discussion of the ionization structure of W-R winds, and how it is set, is given. We also provide an overview of line and continuum formation in W-R stars. Mechanisms that contribute to the strength of different emission lines, such as collisional excitation, radiative recombination, dielectronic recombination, and continuum fluorescence, are discussed.
For some years now, spectroscopic measurements of massive stars in the amateur domain have been fulfilling professional requirements. Various groups in the northern and southern hemispheres have been established, running successful professional-amateur (ProAm) collaborative campaigns, e.g., on WR, O, and B type stars. Today, high-quality data (echelle and long-slit) are regularly delivered and the corresponding results published. Night-to-night long-term observations over months to years open a new opportunity for massive-star research. We introduce recent and ongoing sample campaigns (e.g. ∊ Aur, WR 134, ζ Pup), show respective results, and highlight the vast amount of data collected in various databases. Ultimately, it is in the time-dependent domain where amateurs can shine most.
In this paper, I review observational evidence from spectroscopy and polarimetry for the presence of small and large scale structure in the winds of Wolf-Rayet (WR) stars. Clumping is known to be ubiquitous in the winds of these stars and many of its characteristics can be deduced from spectroscopic time-series and polarisation lightcurves. Conversely, a much smaller fraction of WR stars (∼1/5) have been shown to harbour larger scale structures in their wind, while such structures are thought to be present in the winds of most of their O-star ancestors. The reason for this difference is still unknown.
Because most massive stars have been or will be affected by a companion during the course of their evolution, we cannot afford to neglect binaries when discussing the progenitors of supernovae and GRBs. Analyzing linear polarization in the emission lines of close binary systems allows us to probe the structures of these systems' winds and mass flows, making it possible to map the complex morphologies of the mass loss and mass transfer structures that shape their subsequent evolution. In Wolf-Rayet (WR) binaries, line polarization variations with orbital phase distinguish polarimetric signatures arising from lines that scatter near the stars from those that scatter far from the orbital plane. These far-scattering lines may form the basis for a "binary line-effect method" of identifying rapidly rotating WR stars (and hence GRB progenitor candidates) in binary systems.
Before Gaia improves on the Hipparcos survey, direct determination of the distance via parallax is only possible for γ Vel, but the analysis of the cluster or association to which a WR star belongs can give distances with 50% to 10% accuracy. The list of Galactic clusters, associations, and cluster/association candidates has grown significantly in the last decade thanks to the numerous deep, high-resolution surveys of the Milky Way. In this work, we revisit the fundamental parameters of known clusters with WR stars, and we present the search for new ones. All our work is based on the catalogs from the VVV (from the VISTA telescope) and the UKIDSS (from the UKIRT telescope) near-infrared surveys. Finally, the relations between the fundamental parameters of clusters with WR stars are explored.
The Westerlund 1 (Wd1) cluster hosts a rich and varied collection of massive stars. Its dynamical youth and the absence of ongoing star formation indicate a coeval population. As such, the simultaneous presence of both late-type supergiants and Wolf-Rayet stars has defied explanation in the context of single-star evolution. Observational evidence points to a high binary fraction, hence this stellar population offers a robust test for stellar models accounting for both single-star and binary evolution. We present an optical to near-IR (VLT & NTT) spectroscopic analysis of 22 WR stars in Wd 1, delivering physical properties for the WR stars.
We discuss how these differ from the Galactic field population, and how they may be reconciled with the predictions of single and binary evolutionary models.
I review our current understanding of the interaction between a Wolf-Rayet star's fast wind and the surrounding medium, and discuss to what extent the predictions of numerical simulations coincide with multiwavelength observations of Wolf-Rayet nebulae. Through a series of examples, I illustrate how changing the input physics affects the results of the numerical simulations. Finally, I discuss how numerical simulations together with multiwavelength observations of these objects allow us to unpick the previous mass-loss history of massive stars.
We analyse whether a stellar atmosphere model computed with the code CMFGEN provides an optimal description of the stellar observations of WR 136 and simultaneously reproduces the nebular observations of NGC 6888, such as the ionization degree, which is modelled with the pyCloudy code. All the observational material available (far and near UV and optical spectra) was used to constrain such models. We found that the stellar temperature T∗, at τ = 20, can lie in a range between 70 000 and 110 000 K, but when using the nebula as an additional constraint, we found that stellar models with T∗ ∼ 70 000 K represent the best solution for both the star and the nebula.
The Galactic Center (GC) hosts three of the most massive WR rich, resolved young clusters in the Local Group as well as a large number of apparently isolated massive stars. Therefore, it constitutes a test bed to study the star formation history of the region, to probe a possible top-heavy scenario and to address massive star formation (clusters vs isolation) in such a dense and harsh environment. We present results from our ongoing infrared spectroscopic studies of WRs and other massive stars at the Center of the Milky Way.
The supermassive binary system η Car experienced periastron passage in the summer of 2014. We observed the star twice, around the maximum (ϕorb = 0.97, 2014 June 6) and just before the minimum (ϕorb = 0.99, 2014 July 28) of its wind-wind colliding (WWC) X-ray emission, using the XMM-Newton and NuSTAR observatories, the latter of which is equipped with extremely hard X-ray (>10 keV) focusing mirrors. In both observations, NuSTAR detected the thermal X-ray tail up to 40-50 keV. The hard slope is consistent with an electron temperature of ∼6 keV, which is significantly higher than the ionization temperature (kT ∼4 keV) measured from the Fe K emission lines, assuming collisional equilibrium plasma. The spectrum did not show a hard power-law component above this energy range, unlike earlier detections with INTEGRAL and Suzaku. In the second NuSTAR observation, the X-ray flux above 5 keV declined gradually over ∼1 day. This result suggests that the WWC apex was gradually hidden behind the optically thick primary wind around conjunction.
Helium stars
(2015)
There are outstanding problems in trying to reproduce the observed nature of Wolf–Rayet stars from theoretical stellar models. We have investigated the effects of uncertainties, such as composition and mass-loss rate, on the evolution and structure of Wolf–Rayet stars and their lower mass brethren. We find that the normal Conti scenario needs to be altered, with different WR types being due to different initial masses as well as different stages of evolution.
Wolf-Rayet stars are very hot stars close to the Eddington limit. In the conditions encountered in their radiation-pressure-dominated outer layers, several instabilities are expected to arise. These instabilities could influence both the dynamics of their optically thick winds and the observed spectral lines, introducing small- and large-scale variability. We investigate the conditions in the convective envelopes of our helium star models and relate them to the appearance of a high number of stochastic density inhomogeneities, i.e. clumping in the optically thick wind. We also investigate the pulsational stability of these envelopes, considering the effect of the high stellar-wind mass-loss rates.
We suggest several ideas which when combined could lead to a new mechanism for long-term pulsations of very hot and luminous stars. These involve the interplay between convection, radiation, atmospheric clumping and winds, which collectively feed back to stellar expansion and contraction. We discuss these ideas and point out the future work required in order to fill in the blanks.
We first give a short historical overview with some key facts of massive-star population synthesis with binaries. We then discuss binary population codes and focus on two ingredients which are important for massive-star population synthesis and which may differ between codes. The third part covers population simulations with binaries, where we consider the initial massive-binary frequency, the RSG/WR, WC/WN, and SNII/SNIbc number ratios, and the probable initial rotational velocity distribution of massive stars.
We investigate the rarity of the Wolf-Rayet X-ray binaries (WRXRBs) in contrast to their predecessors, the high-mass X-ray binaries (HMXRBs). Recent studies suggest that common envelope (CE) mergers during the evolution of HMXRBs may be responsible (Linden et al. 2012). We conduct a binary population synthesis to generate a population of HMXRBs mimicking the Galactic sample, and vary the efficiency parameter during the CE phase to match the current WRXRB-to-HMXRB ratio. We find that ∼50% of systems must merge to match observational constraints.
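The role of the CE efficiency parameter in such a population synthesis can be sketched with a toy Monte Carlo. Everything below is an illustrative placeholder, not the study's actual code or input physics: a system is counted as merged when the efficiency-scaled orbital-energy reservoir cannot unbind the envelope.

```python
import numpy as np

rng = np.random.default_rng(2)

def merger_fraction(alpha_ce, n=100_000):
    """Toy CE outcome statistics: draw an envelope binding energy and an
    orbital energy reservoir (both in arbitrary units from placeholder
    log-normal distributions); the system merges when even the full
    reservoir, scaled by the efficiency alpha_ce, cannot unbind the
    envelope."""
    e_bind = rng.lognormal(mean=0.0, sigma=0.5, size=n)
    e_orb = rng.lognormal(mean=0.0, sigma=0.5, size=n)
    return float(np.mean(e_bind > alpha_ce * e_orb))
```

Sweeping `alpha_ce` and comparing the resulting merger fraction against an observed WRXRB-to-HMXRB ratio is, schematically, the calibration step the abstract describes.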
Wolf-Rayet (WR) stars, as they are advanced stages of the life of massive stars, provide a good test for various physical processes involved in the modelling of massive stars, such as rotation and mass loss. In this paper, we show the outputs of the latest grids of single massive stars computed with the Geneva stellar evolution code, and compare them with some observations. We present a short discussion on the shortcomings of single-star models and we also briefly discuss the impact of binarity on the WR populations.
We compute spectral libraries for populations of coeval stars using state-of-the-art massive-star evolutionary tracks that account for different astrophysics, including rotation and close binarity. Our synthetic spectra account for stellar and nebular contributions. We use our models to obtain E(B−V), age, and mass for six clusters in the spiral galaxy NGC 1566, which have ages of < 50 Myr and masses of > 5 × 10^4 M⊙ according to standard models. NGC 1566 was observed from the NUV to the I-band as part of the imaging Treasury HST program LEGUS: Legacy Extragalactic UV Survey. We aim to establish i) if the models provide reasonable fits to the data, ii) how well the models and photometry are able to constrain the cluster properties, and iii) how different the properties obtained with different models are.
The interaction between massive star formation and gas is a key ingredient in galaxy evolution. Given the level of observational detail currently achievable in nearby starbursts, they constitute ideal laboratories to study the interaction processes that contribute to global evolution in all types of galaxies. Wolf-Rayet (WR) stars, as an observational marker of high-mass star formation, play a pivotal role, and their winds can strongly influence the surrounding gas. Imaging spectroscopy of two nearby (<4 Mpc) starbursts, both of which show multiple regions with WR stars, is discussed. The relation between the WR content and the physical and chemical properties of the surrounding ionized gas is explored.
Using ESPaDOnS optical spectra of WR6, we search for variations in the stellar wind parameters during the different phases of the spectral variations. We use the radiative transfer code CMFGEN (Hillier & Miller 1998) to determine the wind parameters. Our work gives mean parameters for WR6 of Teff = 55 kK, Ṁ = 2.7 × 10^-5 M⊙/yr, and v∞ = 1700 km/s. Furthermore, the line-profile variations at different phases are the consequence of variations of the mass-loss rate and temperature in the wind. The effective temperature reaches 59 kK at the highest intensity, whereas the mass-loss rate decreases to 2.5 × 10^-5 M⊙/yr in that case. On the other hand, the effective temperature decreases to 52.5 kK and the mass-loss rate increases to 3 × 10^-5 M⊙/yr when the line profile reaches its minimum intensity. These results confirm the variable nature of the stellar wind, reflected in this case in two of its fundamental parameters, temperature and mass loss, which could be used to constrain the nature of the instability at the base of the wind.
The X-ray observations of the colliding-wind binary WR 21a are reported. The first monitoring was performed by Swift/XRT in order to reveal the phase-locked variation. Our observations cover 201 different epochs from 2013 October 1 to 2015 January 30, for a total exposure of about 306 ks. We find, for the first time, that the luminosity varies roughly in inverse proportion to the separation of the two stars before the X-ray maximum, but later drops rapidly toward periastron.
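The inverse scaling of luminosity with separation reported here can be sketched numerically. The eccentricity below is an illustrative placeholder (not WR 21a's measured orbit), and L ∝ 1/D is the generic adiabatic colliding-wind scaling; the sketch just converts orbital phase to separation via Kepler's equation.

```python
import numpy as np

def separation(phase, e):
    """Binary separation in units of the semi-major axis at a given
    orbital phase (0 = periastron), from Kepler's equation solved by
    fixed-point iteration (converges for e < 1)."""
    M = 2.0 * np.pi * np.asarray(phase, dtype=float)  # mean anomaly
    E = M
    for _ in range(60):          # E = M + e*sin(E), iterated to convergence
        E = M + e * np.sin(E)
    return 1.0 - e * np.cos(E)   # r/a

# Illustrative phase-locked light curve under the adiabatic L ∝ 1/D scaling
phases = np.linspace(0.0, 0.5, 6)
D = separation(phases, e=0.7)    # placeholder eccentricity
L = 1.0 / D
```

At periastron (phase 0) the separation is 1−e and the model luminosity peaks; toward apastron (phase 0.5) it falls as the separation grows to 1+e, matching the rough inverse-proportion trend before the observed rapid drop.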
Based on the planning, implementation, evaluation, and revision of a joint seminar in media education and didactics of computer science, this paper sets out where the deficits of classical media education lie with respect to digital and interactive media, and which computer science content, in the sense of general education, appears relevant from this perspective for student teachers of all school types.
The following article describes the evaluation of an instructional video on problem solving in computer science, which was developed on the basis of a comparative study of strong and weak problem solvers. In the film, as an example, a fictitious high performer solves a colouring problem while thinking aloud; the individual steps are commented on and explained section by section. Whether students accept this learning concept and whether watching the video actually produces a learning effect was investigated by means of a survey and a first comparative study.
In teaching HCI (human-computer interaction), a recurring challenge is to run practical exercises with exciting results that nevertheless do not get lost in technical details but remain focused on HCI. In the teaching module "Interaktionsdesign" at the University of Hamburg, students design and implement prototype interaction concepts for the game Neverball within three weeks. Unlike in most introductory HCI courses, the students develop not mock-ups but working software. To make this possible within the project time, Neverball was extended with a TCP-based interface. This removes the laborious familiarisation with the game's source code, and the students can concentrate on their interaction prototypes. We describe the experiences from running the project several times and explain our approach to its implementation. The results are intended to help teachers in the HCI field design similar practice-oriented exercises with tangible results.
We describe a computer science competition for upper secondary school students that presents the working world of a computer scientist as realistically as possible over several weeks. In the competition, the student teams develop an Android app and organise its development using project management methods modelled on professional agile processes. The paper presents the theoretical background on competitions, the organisational and didactic decisions, a first evaluation, as well as reflections and an outlook.
The growing impact of globalisation and the development of
a ‘knowledge society’ have led many to argue that 21st century skills are
essential for life in 21st century society and that ICT is central
to their development. This paper describes how 21st century skills, in
particular digital literacy, critical thinking, creativity, communication
and collaboration skills, have been conceptualised and embedded in the
resources developed for teachers in iTEC, a four-year, European project.
The effectiveness of this approach is considered in light of the data collected through the evaluation of the pilots, which considers both the potential benefits of using technology to support the development of 21st century skills and the challenges of doing so. Finally, the paper
discusses the learning support systems required in order to transform
pedagogies and embed 21st century skills. It is argued that support is
required in standards and assessment; curriculum and instruction; professional
development; and learning environments.
In the project MoKoM, which is funded by the German
Research Foundation (DFG) from 2008 to 2012, a test instrument
measuring students’ competences in computer science was developed.
This paper presents the results of an expert rating of the levels of
students’ competences done for the items of the instrument.
First, we describe the difficulty-relevant features that were used for the evaluation; these were deduced from findings and resources in computer science, psychology, and didactics. The potential and desiderata of this research method are then discussed. Finally
we will present our conclusions on the results and give an outlook on
further steps.
BugHunt
(2015)
Competencies related to operating systems and computer
security are usually taught systematically. In this paper we present
a different approach, in which students have to remove virus-like
behaviour on their respective computers, which has been induced by
software developed for this purpose. They have to develop appropriate
problem-solving strategies and thereby explore essential elements of
the operating system. The approach was implemented as an example in two computer science courses at a regional general upper secondary school and elicited great motivation and interest among the participating students.
The objectives of this study were to examine (a) the effect
of dynamic assessment (DA) in a 3D Immersive Virtual Reality
(IVR) environment as compared with computerized 2D and noncomputerized
(NC) situations on cognitive modifiability, and (b) the
transfer effects of these conditions on more difficult problem solving
administered two weeks later in a non-computerized environment. A
sample of 117 children aged 6:6–9:0 years was randomly assigned to three experimental groups of DA conditions: 3D, 2D, and NC, and
one control group (C). All groups received the pre- and post-teaching
Analogies subtest of the Cognitive Modifiability Battery (CMB-AN).
The experimental groups received a teaching phase in conditions similar
to the pre-and post-teaching phases. The findings showed that cognitive
modifiability, in a 3D IVR, was distinctively higher than in the two
other experimental groups (2D computer group and NC group). It was
also found that the 3D group showed significantly higher performance
in transfer problems than the 2D and NC groups.
This article presents a discussion of the key competencies in informatics and ICT, viewed from a philosophical foundation presented by Martha Nussbaum and known as the ‘ten central capabilities’. Firstly, the outline of ‘The Capability Approach’, which has been presented by Amartya Sen and Nussbaum as a theoretical framework for assessing the state of social welfare, will be explained. Secondly, the body of Nussbaum’s ten central capabilities, and the reason for applying it as the basis of the discussion, will be shown. Thirdly, the relationship between the concepts of ‘capability’ and ‘competency’ is discussed. After that, the author’s view of the key competencies in informatics and ICT, derived from the examination of Nussbaum’s ten capabilities, will be presented.
This paper originated from discussions about the need for
important changes in the curriculum for Computing including two focus
group meetings at IFIP conferences over the last two years. The
paper examines how recent developments in curriculum, together with
insights from curriculum thinking in other subject areas, especially mathematics
and science, can inform curriculum design for Computing.
The analysis presented in the paper provides insights into the complexity
of curriculum design as well as identifying important constraints and
considerations for the ongoing development of a vision and framework
for a Computing curriculum.
IT EnGAGES!
(2015)
The use of games and game elements in learning contexts is an attempt to motivate learners to engage with the learning content. Game elements do not only have positive motivational effects, however: they can, for example, affect intrinsic motivation negatively, and not every learner likes to play. To counteract negative effects of gamification, a toolkit for adaptable learning environments was developed. Learning environments created with it allow students to determine the degree of gamification themselves by switching game elements on and off. In an introductory programming lecture, learning-game tasks were developed from the existing optional interactive e-tests and offered to students as an additional learning opportunity. A first exploratory study confirms the assumption that acceptance of the adaptable learning game is very high, but that there are nevertheless students who work through the learning environment without game elements. Adaptable gamification thus offers different students the possibility of creating additional motivational incentives for themselves by switching on game elements, without being "forced" to play.
Peer assessment is a method in which participants not only work on and submit a given task but, in a second phase, also check, comment on, and grade one another's submissions. This method makes practice with individual grading and individual feedback possible even in very large courses.
In the winter semester 2013/14, this approach was tried out in the first-semester programming course at the Technische Hochschule Nürnberg, with 340 students, as a mandatory online lab course accompanying the semester. With performance requirements unchanged, students who successfully took part in the lab showed a reduction of the failure rate by 60% on average and an improvement of the average grade by 0.6 to 0.9 grade steps. In addition, the participating students learned more continuously, consolidated the course content better, and arrived at a predominantly positive assessment of the lab and the method. In the e-learning system Moodle, peer assessment can be realised with the Workshop activity at moderate implementation and supervision effort. The paper discusses the key elements of a successful use of peer assessment.
We present a comprehensive mentoring concept in the computer science degree programme at RWTH Aachen that supports the transition from school to university and at the same time offers efficient and competent advice when difficulties arise during the course of study. The programme consistently achieves high acceptance ratings among students despite mandatory participation in the first semester. Because of the many influencing variables, the effectiveness of the programme can hardly be measured purely quantitatively, but the ability to react to organisational and subject-related problems of a cohort, and the insight gained into the reasons for dropping out, confirm the necessity of the measure.
The paper introduces the concept of semantic positioning as a way of teaching basic forms of scientific work and elementary forms of discursive engagement without requiring students to contribute to current research in terms of content. The implementation of this concept in the computer science bachelor's programme shows that with this approach students can acquire the competences both for the transition to the more research-driven master's programme and for professional knowledge work.
Computational thinking is a fundamental skill set that is learned
by studying Informatics and ICT. We argue that its core ideas can
be introduced in an inspiring and integrated way to both teachers and
students using fun and contextually rich cs4fn ‘Computer Science for
Fun’ stories combined with ‘unplugged’ activities including games and
magic tricks. We also argue that understanding people is an important
part of computational thinking. Computational thinking can be fun for
everyone when taught in kinaesthetic ways away from technology.
As a result of the Bologna reform of educational systems in
Europe the outcome orientation of learning processes, competence-oriented
descriptions of the curricula and competence-oriented assessment
procedures became standard also in Computer Science Education
(CSE). The following keynote addresses important issues of shaping
a CSE competence model especially in the area of informatics system
comprehension and object-oriented modelling. Objectives and research
methodology of the project MoKoM (Modelling and Measurement
of Competences in CSE) are explained. Firstly, the CSE competence
model was derived based on theoretical concepts and then secondly the
model was empirically examined and refined using expert interviews.
Furthermore, the paper depicts the development and examination of
a competence measurement instrument, which was derived from the
competence model. To this end, the instrument was applied to a large sample of students at the upper secondary (Gymnasium) level. Subsequently,
efforts to develop a competence level model, based on the retrieved empirical
results and on expert ratings are presented. Finally, further demands
on research on competence modelling in CSE will be outlined.
Regardless of what is intended by government curriculum
specifications and advised by educational experts, the competencies
taught and learned in and out of classrooms can vary considerably.
In this paper, we discuss in particular how we can investigate the
perceptions that individual teachers have of competencies in ICT,
and how these and other factors may influence students’ learning. We
report case study research which identifies contradictions within the
teaching of ICT competencies as an activity system, highlighting issues
concerning the object of the curriculum, the roles of the participants and
the school cultures. In a particular case, contradictions in the learning
objectives between higher order skills and the use of application tools
have been resolved by a change in the teacher’s perceptions which
have not led to changes in other aspects of the activity system. We look
forward to further investigation of the effects of these contradictions in
other case studies and on forthcoming curriculum change.
The paper presents two approaches to the development of
a Computer Science Competence Model for the needs of curriculum
development and evaluation in Higher Education. A normative-theoretical
approach is based on the AKT and ACM/IEEE curriculum
and will be used within the recommendations of the German
Informatics Society (GI) for the design of CS curricula. An empirically
oriented approach refines the categories of the first one with regard to
specific subject areas by conducting content analysis of the CS curricula of
important universities from several countries. The refined model will be
used for students' e-assessment and for subsequent corrective
action by the CS departments.
The paper discusses the issue of supporting informatics
(computer science) education through competitions for lower and
upper secondary school students (8–19 years old). Competitions play
an important role for learners as a source of inspiration, innovation,
and attraction. Having run informatics contests for school students
for many years, we have noticed that students find the contest
experience engaging and exciting, as well as a valuable learning experience.
A contest is an excellent instrument to involve students in problem
solving activities. An overview of the infrastructure and development
of an informatics contest from the international to the national level
(the Bebras contest on informatics and computer fluency, which originated
in Lithuania) is presented. The performance of Bebras contests in 23
countries over the last 10 years has shown an unexpectedly
high acceptance by school students and teachers. Many thousands of
students have participated and gained valuable input in addition to their regular
informatics lessons at school. In the paper, the main attention is paid
to the developed tasks and analysis of students’ task solving results in
Lithuania.
Social networks are currently at the forefront of tools that
lend themselves to Personal Learning Environments (PLEs). This study aimed to
observe how students perceived PLEs, what they believed were the
integral components of social presence when using Facebook as part
of a PLE, and to describe students' preferences for types of interactions
when using Facebook as part of their PLE. This study used mixed
methods to analyze the perceptions of graduate and undergraduate
students on the use of social networks, more specifically Facebook as a
learning tool. Fifty surveys were returned, representing a 65 % response
rate. Survey questions included both closed and open-ended questions.
Findings suggested that even though students rated themselves relatively
well in having requisite technology skills, and 94 % of students used
Facebook primarily for social use, they were hesitant to transfer these
skills to academic use because of privacy concerns, the belief that
other platforms could fulfil the same purpose, and doubts about the
validity of using Facebook to establish social presence. At odds
with these beliefs, when asked to identify strategies in
Facebook that enabled social presence in academic work, the
majority of students identified strategies in five categories that led to
the establishment of social presence on Facebook during their coursework.
Teaching Data Management
(2015)
Data management is a central topic in computer science as
well as in computer science education. In recent years, this topic has been
changing tremendously, as its impact on daily life has become increasingly
visible. Nowadays, everyone not only needs to manage data of various
kinds, but also continuously generates large amounts of data. In
addition, Big Data and data analysis are intensively discussed in public
dialogue because of their influences on society. For the understanding of
such discussions and for being able to participate in them, fundamental
knowledge on data management is necessary. Especially, being aware
of the threats accompanying the ability to analyze large amounts of
data in near real-time becomes increasingly important. This raises the
question of which key competencies are necessary for dealing with
data and data management in daily life.
In this paper, we will first point out the importance of data management
and of Big Data in daily life. On this basis, we will analyze which key
competencies everyone needs concerning data management in order
to handle data properly in daily life. Afterwards, we will
discuss the impact of these changes in data management on computer
science education and in particular database education.
The Student Learning Ecology
(2015)
Educational research on social media has shown that
students use it for socialisation, personal communication, and informal
learning. Recent studies have argued that students to some degree use
social media to carry out formal schoolwork. This article gives an
exploratory account of how a small sample of Norwegian high school
students use social media to self-organise formal schoolwork. This
user pattern can be called a “student learning ecology”, which is a
user perspective on how participating students gain access to learning
resources.