Content: Introduction: Do the Arts Really Matter?; Aesthetic Cognition and Human Development; The Significance of Arts in Everyday Life: Evidence from Case Studies; Arts and Quality of Experience: A Systematic Analysis; The Conditions of Optimal Experience; The Representation of Experience in Personality; Consequences for Teaching the Arts
This study investigated the relation between interest in four different subject areas (mathematics, biology, English, history) and the quality of experience in class. The strength of interest as a predictor of experience was contrasted with that of achievement motivation and scholastic ability. A total of 208 highly able freshmen and sophomores completed interest ratings, an achievement motivation questionnaire, and the Preliminary Scholastic Aptitude Test (PSAT). These assessments were followed by one week of experience sampling. In addition, grades were available for the subject areas involved. The results showed that interest was a significant predictor of the experience of potency, intrinsic motivation, self-esteem, and perception of skill. Controlling for ability and achievement motivation did not decrease the strength of these relations. Achievement motivation and ability proved to be considerably weaker predictors of the quality of experience than was interest. In addition, interest contributed significantly to the prediction of grades in mathematics, biology, and history, but not English. The main results and some limitations of the study are discussed, and suggestions for future research are made.
Habsburg Central Europe
(2024)
Central Europe is characterized by linguistic and cultural density as well as by endogenous and exogenous cultural influences. These constellations were especially visible in the former Habsburg Empire, where they influenced the formation of individual and collective identities. This led not only to continual crises and conflicts, but also to an equally enormous creative potential as became apparent in the culture of the fin-de-siècle.
Computational thinking is a fundamental skill set that is learned by studying Informatics and ICT. We argue that its core ideas can be introduced in an inspiring and integrated way to both teachers and students using fun and contextually rich cs4fn ‘Computer Science for Fun’ stories combined with ‘unplugged’ activities including games and magic tricks. We also argue that understanding people is an important part of computational thinking. Computational thinking can be fun for everyone when taught in kinaesthetic ways away from technology.
In Search of Belonging
(2021)
More than 200,000 Jews left the Habsburg province of Galicia between 1881 and 1910. No longer living in the places of their childhood, they settled in urban centers, such as in New York’s Lower East Side. In this neighborhood, Galician Jews began to search for new relationships that linked the places they left and the ones where they arrived and settled. By looking at Galicia through the lens of autobiographical writings by former Jewish immigrants who became established residents of New York, this article emphasizes the role of regionalism in the context of transnational conceptions of a new American Jewish self-understanding. It argues that the key to analyzing the evolution of “eastern Europe” as a common place of origin for American Jewry is the constant dialogue between the places of origin and arrival. Specifically, philanthropic efforts during and after the First World War and the proliferation of tourism both enabled these settled immigrants to gradually replace regional notions, such as the idea of Galicia, with a mythical image of eastern Europe to create a sense of community as American Jews.
We present an algorithm that computes a function assigning consecutive integers to the trees recognized by a deterministic, acyclic, finite-state, bottom-up tree automaton. Such a function is called a minimal perfect hash function. It can be used to identify trees recognized by the automaton, and its value may serve as an index into other data structures. We also present an algorithm for inverted hashing.
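To make the ranking idea concrete, here is a minimal sketch in the simpler string-automaton setting (the analogue of the tree automata in the abstract; the automaton, state names, and function names are illustrative, not the paper's construction). Each state is annotated with the number of accepted words reachable from it; the hash of an accepted word is the count of accepted words that precede it lexicographically.

```python
# Minimal perfect hashing over an acyclic DFA (illustrative string analogue
# of the tree-automaton construction; names are hypothetical).

def count_right_languages(delta, accepting, state, memo=None):
    """Number of accepted words readable from `state` (automaton is acyclic)."""
    if memo is None:
        memo = {}
    if state in memo:
        return memo[state]
    n = 1 if state in accepting else 0
    for sym in sorted(delta.get(state, {})):
        n += count_right_languages(delta, accepting, delta[state][sym], memo)
    memo[state] = n
    return n

def perfect_hash(delta, accepting, start, word):
    """Rank of `word` among all accepted words (0-based), or None if rejected."""
    memo = {}
    h, state = 0, start
    for ch in word:
        if state in accepting:
            h += 1  # the empty continuation sorts before any longer word
        for sym in sorted(delta.get(state, {})):
            if sym < ch:
                h += count_right_languages(delta, accepting, delta[state][sym], memo)
        if ch not in delta.get(state, {}):
            return None
        state = delta[state][ch]
    return h if state in accepting else None
```

Inverting the hash works the same way in reverse: starting from the rank, repeatedly subtract subtree counts to recover the word symbol by symbol.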
A comparison of current trends within computer science teaching in school in Germany and the UK
(2013)
In the last two years, CS as a school subject has gained a lot of attention worldwide, although different countries have differing approaches to and experiences of introducing CS in schools. This paper reports on a study comparing current trends in CS at school, with a major focus on two countries, Germany and the UK. A survey was carried out of a number of teaching professionals and experts from the UK and Germany with regard to the content and delivery of CS in school. An analysis of the quantitative data reveals a difference in foci between the two countries; putting this into the context of curricular developments, we are able to offer interpretations of these trends and suggest ways in which curricula in CS at school should be moving forward.
The paper discusses the issue of supporting informatics (computer science) education through competitions for lower and upper secondary school students (8–19 years old). Competitions play an important role for learners as a source of inspiration, innovation, and attraction. Having run contests in informatics for school students for many years, we have noticed that students consider the contest experience very engaging and exciting, as well as a learning experience. A contest is an excellent instrument for involving students in problem-solving activities. An overview of the infrastructure and development of an informatics contest from the international level to the national one (the Bebras contest on informatics and computer fluency, which originated in Lithuania) is presented. The performance of Bebras contests in 23 countries during the last 10 years has shown an unexpectedly high acceptance by school students and teachers. Many thousands of students have participated and gained valuable input in addition to their regular informatics lectures at school. In the paper, the main attention is paid to the developed tasks and to the analysis of students’ task-solving results in Lithuania.
We analyze the notions of monotonicity and complete monotonicity for Markov Chains in continuous-time, taking values in a finite partially ordered set. Similarly to what happens in discrete-time, the two notions are not equivalent. However, we show that there are partially ordered sets for which monotonicity and complete monotonicity coincide in continuous time but not in discrete-time.
Messianic Jews are Jewish individuals who syncretically accept both the messianic character of Jesus and the ritual cultic practices provided by traditional Judaism. The present article examines the emergence of this marginal syncretic movement in contemporary Israel, and maintains that it represents a radical development in the bimillenary history of Jewish-Christian relations. This article offers a general introduction to the notion of Jewish-Christian identity, a brief history of the first group of Messianic Jews in the Land of Israel, the cultural influence and religious syncretism of the Messianic Jews in modern Israel, and, finally, the implication that Messianic Judaism is supposed to become the new paradigm within the various branches of Judaism.
Reviewed work: Grossman, David: Eine Frau flieht vor einer Nachricht. - Munich: Hanser, 2009. - 728 pp. ISBN 978-3-446-23397-3
We examined relations between eye movements (single-fixation durations) and RSVP-based event-related potentials (ERPs; N400’s) recorded during reading the same sentences in two independent experiments. Longer fixation durations correlated with larger N400 amplitudes. Word frequency and predictability of the fixated word as well as the predictability of the upcoming word accounted for this covariance in a path-analytic model. Moreover, larger N400 amplitudes entailed longer fixation durations on the next word, a relation accounted for by word frequency. This pattern offers a neurophysiological correlate for the lag-word frequency effect on fixation durations: Word processing is reliably expressed not only in fixation durations on currently fixated words, but also in those on subsequently fixated words.
Many hot stars exhibit stochastic polarimetric variability, thought to arise from clumping low in the wind. Here we investigate the wind properties required to reproduce this variability using analytic models, with particular emphasis on Luminous Blue Variables. We find that the winds must be highly structured, consisting of a large number of optically-thin clumps; while we find that the overall level of polarization should scale with mass-loss rate – consistent with observations of LBVs. The models also predict variability on very short timescales, which is supported by the results of a recent polarimetric monitoring campaign.
The genus-dependence of multi-loop superstring amplitudes is estimated at large orders in perturbation theory using the super-Schottky group parameterization of supermoduli space. Restriction of the integration region to a subset of supermoduli space and a single fundamental domain of the super-modular group suggests an exponential dependence on the genus. Upper bounds for these estimates are obtained for arbitrary N-point superstring scattering amplitudes and are shown to be consistent with exact results obtained for special type II string amplitudes for orbifold or Calabi-Yau compactifications. The genus-dependence is then obtained by considering the effect of the remaining contribution to the superstring amplitudes after the coefficients of the formally divergent parts of the integrals vanish as a result of a sum over spin structures. The introduction of supersymmetry therefore leads to the elimination of large-order divergences in string perturbation theory, a result which is based only on the supersymmetric generalization of the Polyakov measure and not on the gauge group of the string model.
The derivation of the standard model from a higher-dimensional action suggests a further study of the fibre bundle formulation of gauge theories to determine the variations in the choice of structure group that are allowed in this geometrical setting. The action of transformations on the projection of fibres to their submanifolds is characteristic of theories with fewer gauge vector bosons, and specific examples are given, which may have phenomenological relevance. The spinor space for the three generations of fermions in the standard model is described algebraically.
On the existence of a non-zero lower bound for the number of Goldbach partitions of an even integer
(2002)
The Goldbach partitions of an even number greater than 2, given by the sums of two prime addends, form a non-empty set for all integers 2n with 2 ≤ n ≤ 2 × 10^14. It will be shown how to determine by the method of induction the existence of a non-zero lower bound for the number of Goldbach partitions of all even integers greater than or equal to 4. The proof depends on contour arguments for complex functions in the unit disk.
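The quantity the abstract studies can be illustrated with a brute-force count (this only enumerates partitions; it is not the paper's inductive proof method):

```python
# Count Goldbach partitions g(2n): unordered pairs of primes p <= q
# with p + q = 2n. Brute force, for illustration only.

def is_prime(n):
    """Trial-division primality test, adequate for small n."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def goldbach_partitions(even_n):
    """Number of unordered prime pairs (p, q), p <= q, with p + q = even_n."""
    assert even_n >= 4 and even_n % 2 == 0
    return sum(1 for p in range(2, even_n // 2 + 1)
               if is_prime(p) and is_prime(even_n - p))
```

For example, 10 = 3 + 7 = 5 + 5, so `goldbach_partitions(10)` returns 2; the paper's claim is that such counts never drop to zero.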
The quantum cosmological wavefunction for a quadratic gravity theory derived from the heterotic string effective action is obtained near the inflationary epoch and during the initial Planck era. Neglecting derivatives with respect to the scalar field, the wavefunction would satisfy a third-order differential equation near the inflationary epoch which has a solution that is singular in the scale factor limit a(t) → 0. When scalar field derivatives are included, a sixth-order differential equation is obtained for the wavefunction and the solution by Mellin transform is regular in the a → 0 limit. It follows that inclusion of the scalar field in the quadratic gravity action is necessary for consistency of the quantum cosmology of the theory at very early times.
Cloud computing is a model for enabling on-demand access to a shared pool of computing resources. With virtually limitless on-demand resources, a cloud environment enables the hosted Internet application to quickly cope when there is an increase in the workload. However, the overhead of provisioning resources exposes the Internet application to periods of under-provisioning and performance degradation. Moreover, the performance interference due to consolidation in the cloud environment complicates the performance management of Internet applications. In this dissertation, we propose two approaches to mitigate the impact of the resource provisioning overhead. The first approach employs control theory to scale resources vertically and cope quickly with the workload. This approach assumes that the provider has knowledge of and control over the platform running in the virtual machines (VMs), which limits it to Platform as a Service (PaaS) and Software as a Service (SaaS) providers. The second approach is a customer-side one that deals with horizontal scalability in an Infrastructure as a Service (IaaS) model. It addresses the trade-off between cost and performance with a multi-goal optimization solution. This approach finds the scale thresholds that achieve the highest performance with the lowest increase in cost. Moreover, the second approach employs a proposed time series forecasting algorithm to scale the application proactively and avoid under-utilization periods. Furthermore, to mitigate the interference impact on the Internet application performance, we developed a system which finds and eliminates the VMs suffering from performance interference. The developed system is a lightweight solution which does not imply provider involvement. To evaluate our approaches and the designed algorithms at a large-scale level, we developed a simulator called ScaleSim.
In the simulator, we implemented scalability components acting as the scalability components of Amazon EC2. The current scalability implementation in Amazon EC2 is used as a reference point for evaluating the improvement in the scalable application performance. ScaleSim is fed with realistic models of the RUBiS benchmark extracted from the real environment. The workload is generated from the access logs of the 1998 World Cup website. The results show that optimizing the scalability thresholds and adopting proactive scalability can mitigate 88% of the resource provisioning overhead impact with only a 9% increase in cost.
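The two ideas described above, threshold-based horizontal scaling and proactive forecasting, can be sketched as follows. This is a minimal illustration under assumed parameters (the window size and the 0.8/0.3 thresholds are invented for the example, not the dissertation's tuned values):

```python
# Sketch of proactive threshold-based auto-scaling: forecast the next
# utilization sample with a naive moving average, then adjust the VM count.
# Thresholds and window size are illustrative assumptions.

def forecast_next(history, window=3):
    """Naive moving-average forecast of the next utilization sample."""
    recent = history[-window:]
    return sum(recent) / len(recent)

def scaling_decision(history, vms, scale_out=0.8, scale_in=0.3):
    """Return the new VM count based on forecast utilization per VM."""
    predicted = forecast_next(history)
    if predicted > scale_out:
        return vms + 1          # provision ahead of the predicted spike
    if predicted < scale_in and vms > 1:
        return vms - 1          # release an under-utilized VM
    return vms
```

Acting on the forecast rather than the last observed sample is what lets the application absorb the provisioning delay before the spike actually arrives.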
We present the latest results on the observational dependence of the mass-loss rate in stellar winds of O and early-B stars on the metal content of their atmospheres, and compare these with predictions. Absolute empirical rates for the mass loss of stars brighter than 10$^{5.2} L_{\odot}$, based on H$\alpha$ and ultraviolet (UV) wind lines, are found to be about a factor of two higher than predictions. If this difference is attributed to inhomogeneities in the wind this would imply that luminous O and early-B stars have clumping factors in their H$\alpha$ and UV line forming regime of about a factor of 3--5. The investigated stars cover a metallicity range $Z$ from 0.2 to 1 $Z_{\odot}$. We find a hint towards smaller clumping factors for lower $Z$. The derived clumping factors, however, presuppose that clumping does not impact the predictions of the mass-loss rate. We discuss this assumption and explain how we intend to investigate its validity in more detail.
Underpinning a legal system with certain values and helping to resolve norm conflicts is in domestic legal systems usually achieved through hierarchical superiority of certain norms of a constitutional nature. The present paper examines the question whether jus cogens can discharge this function within the traditionally horizontal and decentralized international legal order. In so doing, it commences with an overview of the historical origins of peremptory norms in legal scholarship, followed by its endorsement by positive law and courts and tribunals. This analysis illustrates that there are lingering uncertainties pertaining to the process of identification of peremptory norms. Even so, the concept has been invoked in State executive practice (although infrequently) and has been endorsed by various courts. However, such invocation thus far has had a limited impact from a legal perspective. It was mainly confined to a strengthened moral appeal and did in particular not facilitate the resolution of norm conflicts. The contribution further suggests that this limited impact results from the fact that the content of peremptory obligations is either very narrow or very vague. This, in turn, implies a lack of consensus amongst States regarding the content (scope) of jus cogens, including the values underlying these norms. As a result, it is questionable whether the construct of jus cogens is able to provide meaningful legal protection against the erosion of legal norms. It is too rudimentary in character to entrench and stabilize core human rights values as the moral foundation of the international legal order.
In this paper, by a new constructive method, the authors reprove the global exact boundary controllability of a class of quasilinear hyperbolic systems of conservation laws with linearly degenerate fields. It is shown that the system with nonlinear boundary conditions is globally exactly boundary controllable in the class of piecewise C¹ functions. In particular, the authors give the optimal control time of the system. Finally, a new application is also given.
Under Brazil's ex-president Bolsonaro, deforestation of the Amazon increased dramatically. An Austrian NGO filed a complaint to the Prosecutor of the International Criminal Court (ICC) against Bolsonaro in October 2021, accusing him of crimes against humanity against the backdrop of his involvement in environmental destruction. This paper deals with the question of whether this initiative constitutes a promising means of juridification to mitigate conflicts revolving around mass deforestation in Brazil. It thematizes attempts to juridify environmental destruction in international criminal law and examines the Climate Fund Case at the Brazilian Supreme Court. Finally, emerging problems and arguments in favour of starting preliminary examinations at the ICC against Bolsonaro are illuminated. This paper provides arguments as to why the initiative might be a promising undertaking, even though it is unlikely that Bolsonaro will be arrested.
A lot has been published about the competencies needed by students in the 21st century (Ravenscroft et al., 2012). However, equally important are the competencies needed by educators in the new era of digital education. We review the key competencies for educators in light of the new methods of teaching and learning proposed by Massive Open Online Courses (MOOCs) and their on-campus counterparts, Small Private Online Courses (SPOCs).
1. Developing lesson plans and choosing strategies 2. The aims of the lesson plans in general 3. Strategies as a means to achieve the aims of the lesson plans 4. Evaluating the quality of lesson plans 5. Difficulties during lessons and adaptations afterwards 6. Student teachers’ overall feeling about their work 7. Using the strategies in future classes 8. Conclusion
It is shown that an elliptic scattering operator A on a compact manifold with boundary, with operator-valued coefficients in the morphisms of a bundle of Banach spaces of class (HT) and Pisier’s property (α), has maximal regularity (up to a spectral shift), provided that the spectrum of the principal symbol of A on the scattering cotangent bundle avoids the right half-plane. This is accomplished by representing the resolvent in terms of pseudodifferential operators with R-bounded symbols, yielding by an iteration argument the R-boundedness of λ(A − λ)⁻¹ in Re(λ) ≥ τ for some τ ∈ ℝ. To this end, elements of a symbolic and operator calculus of pseudodifferential operators with R-bounded symbols are introduced. The significance of this method for proving maximal regularity results for partial differential operators is underscored by considering also a more elementary situation of anisotropic elliptic operators on ℝ^d with operator-valued coefficients.
We analyse different Gibbsian properties of interacting Brownian diffusions X indexed by the lattice $\mathbb{Z}^d$: $X = (X_i(t),\ i \in \mathbb{Z}^d,\ t \in [0, T])$, $0 < T < +\infty$. In a first part, these processes are characterized as Gibbs states on path spaces of the form $C([0, T], \mathbb{R})^{\mathbb{Z}^d}$. In a second part, we study the Gibbsian character on $\mathbb{R}^{\mathbb{Z}^d}$ of $\nu^t$, the law at time t of the infinite-dimensional diffusion X(t), when the initial law $\nu = \nu^0$ is Gibbsian.
Universitat Politècnica de València’s Experience with EDX MOOC Initiatives During the Covid Lockdown
(2021)
In March 2020, when massive lockdowns started to be enforced around the world to contain the spread of the COVID-19 pandemic, edX launched two initiatives to help students around the world by providing free certificates for its courses: RAP, for member institutions, and OCE, for any accredited academic institution. In this paper we analyze how Universitat Politècnica de València (UPV) contributed with its courses to both initiatives, providing almost 14,000 free certificate codes in total, and how UPV used the RAP initiative as a customer, describing the mechanism used to distribute more than 22,000 codes for free certificates to more than 7,000 UPV community members, which led to the achievement of more than 5,000 free certificates. We also comment on the results of a post-initiative survey answered by 1,612 UPV members about 3,241 edX courses, in which they reported a satisfaction of 4.69 out of 5 with the initiative.
Open edX is an incredible platform to deliver MOOCs and SPOCs, designed to be robust and support hundreds of thousands of students at the same time. Nevertheless, it lacks a lot of the fine-grained functionality needed to handle students individually in an on-campus course. This short session will present the ongoing project undertaken by the 6 public universities of the Region of Madrid plus the Universitat Politècnica de València, in the framework of a national initiative called UniDigital, funded by the Ministry of Universities of Spain within the Plan de Recuperación, Transformación y Resiliencia of the European Union. This project, led by three of these Spanish universities (UC3M, UPV, UAM), is investing more than half a million euros with the purpose of bringing the Open edX platform closer to the functionalities required for an LMS to support on-campus teaching. The aim of the project is to coordinate what is going to be developed with the Open edX development community, so these developments are incorporated into the core of the Open edX platform in its next releases. Features like a complete redesign of platform analytics to make them real-time, the creation of dashboards based on these analytics, the integration of a system for customized automatic feedback, improvement of exams and tasks and the extension of grading capabilities, improvements in the graphical interfaces for both students and teachers, the extension of the emailing capabilities, redesign of the file management system, integration of H5P content, the integration of a tool to create mind maps, the creation of a system to detect students at risk, or the integration of an advanced voice assistant and a gamification mobile app, among others, are part of the functionalities to be developed. The idea is to transform a first-class MOOC platform into the next on-campus LMS.
In this review, I discuss the suitability of massive star progenitors, evolved in isolation or in interacting binaries, for the production of observed supernovae (SNe) IIb, Ib, and Ic. These SN types can be explained through variations in composition. The critical need for non-thermal effects to produce He I lines favours low-mass He-rich ejecta (in which ^56 Ni can be more easily mixed with He) for the production of SNe IIb/Ib, which thus may arise preferentially from moderate-mass donors in interacting binaries. SNe Ic may instead arise from higher-mass progenitors, He-poor or not, because their larger CO cores prevent efficient non-thermal excitation of He I lines. However, current single-star evolution models tend to produce Wolf-Rayet (WR) stars at death that have a final mass of > 10 M⊙. Single WR star explosion models produce ejecta that are too massive to match the observed light curve widths and rise times of SNe IIb/Ib/Ic, unless their kinetic energy is systematically far greater than the canonical value of 10^51 erg. Future work is needed to evaluate the energy/mass degeneracy in light curve properties. Alternatively, a greater mass loss during the WR phase, perhaps in the form of eruptions, as evidenced in SNe Ibn, may reduce the final WR mass. If viable, such explosions would nonetheless favour a SN Ic, not a Ib.
Measuring the metabolite profile of plants can be a strong phenotyping tool, but the changes of metabolite pool sizes are often difficult to interpret, not least because metabolite pool sizes may stay constant while carbon flows are altered and vice versa. Hence, measuring the carbon allocation of metabolites enables a better understanding of the metabolic phenotype. The main challenge of such measurements is the in vivo integration of a stable or radioactive label into a plant without perturbation of the system. To follow the carbon flow of a precursor metabolite, a method is developed in this work that is based on metabolite profiling of primary metabolites measured with a mass spectrometer preceded by a gas chromatograph (Wagner et al. 2003; Erban et al. 2007; Dethloff et al. submitted). This method generates stable isotope profiling data, besides conventional metabolite profiling data. In order to allow the feeding of a 13C sucrose solution into the plant, a petiole and a hypocotyl feeding assay are developed. To enable the processing of large numbers of single leaf samples, their preparation and extraction are simplified and optimised. The metabolite profiles of primary metabolites are measured, and a simple relative calculation is done to gain information on carbon allocation from 13C sucrose. This method is tested examining single leaves of one rosette in different developmental stages, both metabolically and regarding carbon allocation from 13C sucrose. It is revealed that some metabolite pool sizes and 13C pools are tightly associated with relative leaf growth, i.e. with the developmental stage of the leaf. Fumaric acid turns out to be the most interesting candidate for further studies because pool size and 13C pool diverge considerably.
In addition, the analyses are also performed on plants grown in the cold, and the initial results show a different metabolite pool size pattern across single leaves of one Arabidopsis rosette compared to plants grown under normal temperatures. Lastly, in situ expression of REIL genes in the cold is examined using promoter-GUS plants. Initial results suggest that single leaf metabolite profiles of reil2 differ from those of the WT.
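The "simple relative calculation" mentioned above can be illustrated as follows. This is a hypothetical sketch, not the study's exact pipeline: it estimates mean 13C enrichment per carbon from mass-isotopologue intensities (M+0, M+1, ..., M+n for a metabolite with n carbons), a standard normalization in stable-isotope profiling:

```python
# Illustrative 13C enrichment calculation from mass-isotopologue intensities.
# Input: intensities [M+0, M+1, ..., M+n] for a metabolite with n carbons.
# Assumed simplification: intensities are already corrected for natural
# isotope abundance.

def carbon13_enrichment(isotopologue_intensities):
    """Mean fraction of 13C per carbon atom of the metabolite pool."""
    n_carbons = len(isotopologue_intensities) - 1
    total = sum(isotopologue_intensities)
    # Each M+k isotopologue carries k labeled carbons.
    labeled = sum(k * i for k, i in enumerate(isotopologue_intensities))
    return labeled / (n_carbons * total)
```

An unlabeled pool ([1, 0, 0]) gives 0.0, a fully labeled two-carbon pool ([0, 0, 1]) gives 1.0, and mixtures fall in between.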
This paper focuses on one particular issue which has arisen in the course of the ongoing debate on the reform of investor-State dispute settlement (ISDS), namely that of the appointment of arbitrators. Taking as its starting point that there now exists tentative consensus that the present system for the appointment of arbitrators either causes or exacerbates certain problematic aspects of the current ISDS system, the paper explores one option for reform, namely the introduction of an independent panel for the selection of investment arbitrators. In doing so, it is argued that a shift in the normative basis of the rules governing appointments is required in order to accommodate the principles of party autonomy and the international rule of law. Such reform, while not completely removing the initiative that parties presently enjoy, is the most efficient way to introduce rule of law considerations such as a measure of judicial independence into the current appointments system. This, it is argued, would in turn help to address some of the problematic features of the appointment of arbitrators in ISDS.
Received views of utterance context in pragmatic theory characterize the occurrent subjective states of interlocutors using notions like common knowledge or mutual belief. We argue that these views are not compatible with the uncertainty and robustness of context-dependence in human-human dialogue. We present an alternative characterization of utterance context as objective and normative. This view reconciles the need for uncertainty with received intuitions about coordination and meaning in context, and can directly inform computational approaches to dialogue.
In single photon emission computed tomography (SPECT) one is interested in reconstructing the activity distribution f of some radiopharmaceutical. The data gathered suffer from attenuation due to the tissue density µ. Each imaged slice incorporates noisy sample values of the nonlinear attenuated Radon transform (formula at this place in the original abstract). Traditional theory for SPECT reconstruction treats µ as a known parameter. In practical applications, however, µ is not known, but either crudely estimated, determined in costly additional measurements or plainly neglected. We demonstrate that an approximation of both f and µ from SPECT data alone is feasible, leading to quantitatively more accurate SPECT images. The result is based on nonlinear Tikhonov regularization techniques for parameter estimation problems in differential equations combined with Gauss-Newton-CG minimization.
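The formula missing from the abstract is the attenuated Radon transform; one common textbook convention (supplied here as background, not reproduced from the original) writes it as:

```latex
% Attenuated Radon transform of activity f with attenuation map \mu,
% along the line with direction \omega^{\perp} at signed distance s:
(R_{\mu} f)(s,\omega)
  = \int_{\mathbb{R}} f(s\omega + t\omega^{\perp})
      \exp\!\Big(-\int_{t}^{\infty} \mu(s\omega + \tau\omega^{\perp})\,
      \mathrm{d}\tau\Big)\,\mathrm{d}t
```

The nonlinearity the abstract refers to is the joint dependence on the pair (f, µ): the transform is linear in f for fixed µ, but the inner exponential makes simultaneous recovery of both a nonlinear parameter estimation problem.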
This paper describes the key aspects of the system SynCoP (Syntactic Constraint Parser) developed at the Berlin-Brandenburgische Akademie der Wissenschaften. The parser makes it possible to combine syntactic tagging and chunking by means of constraint grammar using weighted finite-state transducers (WFSTs). Chunks are interpreted as local dependency structures within syntactic tagging. The linguistic theories are formulated by criteria which are formalized by a semiring; these criteria allow structural preferences and gradual grammaticality. The parser is essentially a cascade of WFSTs. To find the most likely syntactic readings, a best-path search is used.
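The best-path search over a weighted transition graph can be sketched as follows, here in the tropical (min, +) semiring, where the best reading is the one with the lowest accumulated weight. The graph shape, state numbers, and weights are illustrative, not SynCoP's actual transducers:

```python
# Dijkstra-style best-path search in the tropical (min, +) semiring:
# weights accumulate by addition, alternatives compete by minimum.
import heapq

def best_path(arcs, start, final):
    """Lowest-cost path; arcs maps state -> [(next_state, weight), ...]."""
    queue = [(0.0, start, [start])]
    seen = set()
    while queue:
        cost, state, path = heapq.heappop(queue)
        if state == final:
            return cost, path
        if state in seen:
            continue
        seen.add(state)
        for nxt, weight in arcs.get(state, []):
            if nxt not in seen:
                heapq.heappush(queue, (cost + weight, nxt, path + [nxt]))
    return None  # final state unreachable
```

Swapping the semiring (e.g. to (max, ×) over probabilities, via negative log weights) changes which reading wins without changing the search itself, which is why formulating the criteria over a semiring is convenient.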
A new method is used in an eye-tracking pilot experiment which shows that it is possible to detect differences in common ground associated with the use of minimally different types of indefinite anaphora. Following Richardson and Dale (2005), cross recurrence quantification analysis (CRQA) was used to show that the tandem eye movements of two Swedish-speaking interlocutors are slightly more coupled when they are using fully anaphoric indefinite expressions than when they are using less anaphoric indefinites. This shows the potential of CRQA to detect even subtle processing differences in ongoing discourse.
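The core of the CRQA measure used above can be sketched in a few lines. This is a deliberately reduced illustration (real CRQA adds delay embedding, lagged recurrence profiles, and categorical gaze regions; the function and its radius parameter are illustrative):

```python
# Minimal cross recurrence sketch: compare two gaze series point by point;
# the cross recurrence rate is the fraction of time-pairs (i, j) at which
# the two series fall within a given radius of each other.

def cross_recurrence_rate(series_a, series_b, radius):
    """Fraction of (i, j) pairs with |a_i - b_j| <= radius."""
    hits = sum(1 for a in series_a
                 for b in series_b
                 if abs(a - b) <= radius)
    return hits / (len(series_a) * len(series_b))
```

More tightly coupled interlocutors produce more near-coincident gaze samples and hence a higher rate, which is the kind of subtle difference the pilot experiment detects between anaphora types.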
Let’s talk about CS!
(2015)
To communicate about a science is the most important key competence in education for any science. Without communication we cannot teach, so teachers should properly reflect on the language they use in class. But the language students and teachers use to communicate about their CS courses is very heterogeneous, inconsistent, and deeply influenced by tool names. There is a significant lack of research and discussion in CS education regarding terminology and the role of concepts and tools in our science. We do not have a consistent set of terminology that we agree on as helpful for learning our science, and this makes it nearly impossible to do research on CS competencies as long as we have not agreed on the names we use to describe them. This workshop intends to provide room for discussion and first ideas for future research in this field.
“How can a course structure be redesigned based on empirical data to enhance learning effectiveness through a student-centered approach using objective criteria?” was the research question we asked. “Digital Twins for Virtual Commissioning of Production Machines” is a course using several innovative concepts, including an in-depth practical part with online experiments, called virtual labs. The teaching-learning concept is continuously evaluated. Card sorting is a popular method for designing information architectures (IA), “a practice of effectively organizing, structuring, and labeling the content of a website or application into a structure that enables efficient navigation” [11]. In the presented higher education context, a so-called hybrid card sort was used, in which each participant had to sort 70 cards into seven predefined categories or create new categories themselves. Twelve out of 28 students voluntarily participated in the process, and short interviews were conducted after the activity. The analysis of the category mapping yields a quantitative measure of the (dis-)similarity of the keywords in specific categories using hierarchical cluster analysis (HCA). The learning designer could then interpret the results to make decisions about the number, labeling, and order of sections in the course.
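The clustering step described above can be sketched as follows. The data layout, dissimilarity definition, and use of average-linkage clustering are our own assumptions for illustration, not necessarily the authors' exact HCA variant:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def card_sort_clusters(sorts, n_clusters):
    """Cluster cards by how often participants co-sorted them.

    sorts: int array of shape (participants, cards); sorts[p, c] is the
    category participant p assigned to card c. The dissimilarity between
    two cards is the fraction of participants who placed them in
    different categories.
    """
    diff = (sorts[:, :, None] != sorts[:, None, :]).mean(axis=0)
    z = linkage(squareform(diff, checks=False), method="average")
    return fcluster(z, t=n_clusters, criterion="maxclust")
```

Inspecting the dendrogram from `linkage` (rather than cutting at a fixed number of clusters) is what lets a learning designer judge how many course sections the sorted cards naturally support.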
Microsaccades are very small, involuntary flicks in eye position that occur on average once or twice per second during attempted visual fixation. Microsaccades give rise to EMG eye muscle spikes that can distort the spectrum of the scalp EEG and mimic increases in gamma band power. Here we demonstrate that microsaccades are also accompanied by genuine and sizeable cortical activity, manifested in the EEG. In three experiments, high-resolution eye movements were co-recorded with the EEG: during sustained fixation of checkerboard and face stimuli and in a standard visual oddball task that required the counting of target stimuli. Results show that microsaccades as small as 0.15° generate a field potential over occipital cortex and midcentral scalp sites 100–140 ms after movement onset, which resembles the visual lambda response evoked by larger voluntary saccades. This challenges the standard assumption of human brain imaging studies that saccade-related brain activity is precluded by fixation, even when fixation is fully complied with. Instead, additional cortical potentials from microsaccades were present in 86% of the oddball task trials and were of similar amplitude to the visual response to stimulus onset. Furthermore, microsaccade probability varied systematically according to the proportion of target stimuli in the oddball task, causing modulations of late stimulus-locked event-related potential (ERP) components. Microsaccades present an unrecognized source of visual brain signal that is of interest for vision research and may have influenced the data of many ERP and neuroimaging studies.
Mixed elliptic boundary value problems are characterised by conditions which have a jump along an interface of codimension 1 on the boundary. We study such problems in weighted edge Sobolev spaces and show the Fredholm property and the existence of parametrices under additional conditions of trace and potential type on the interface. Our methods from the calculus of boundary value problems on a manifold with edges will be illustrated by the Zaremba problem and other mixed problems for the Laplace operator.
The ellipticity of operators on a manifold with edge is defined as the bijectivity of the components of a principal symbolic hierarchy σ = (σψ, σ∧), where the second component takes values in operators on the infinite model cone of the local wedges. In the general understanding of edge problems there are two basic aspects: quantisation of edge-degenerate operators in weighted Sobolev spaces, and verifying the ellipticity of the principal edge symbol σ∧, which includes the (in general not explicitly known) number of additional conditions on the edge of trace and potential type. We focus here on these questions and give explicit answers for a wide class of elliptic operators that are connected with the ellipticity of edge boundary value problems and reductions to the boundary. In particular, we study the edge quantisation and ellipticity for Dirichlet-Neumann operators with respect to interfaces of some codimension on a boundary. We show analogues of the Agranovich-Dynin formula for edge boundary value problems, and we establish relations of elliptic operators for different weights, via the spectral flow of the underlying conormal symbols.
We construct a class of elliptic operators in the edge algebra on a manifold M with an embedded submanifold Y interpreted as an edge. The ellipticity refers to a principal symbolic structure consisting of the standard interior symbol and an operator-valued edge symbol. Given a differential operator A on M for every (sufficiently large) s we construct an associated operator As in the edge calculus. We show that ellipticity of A in the usual sense entails ellipticity of As as an edge operator (up to a discrete set of reals s). Parametrices P of A then correspond to parametrices Ps of As, interpreted as Mellin-edge representations of P.
The annotation guidelines introduced in this chapter present an attempt to create a unique infrastructure for the encoding of data from very different languages. The ultimate aim of these annotations is to allow data retrieval for the study of information structure, and since information structure interacts with all levels of grammar, the present guidelines cover all levels of grammar as well. After introducing the guidelines, the chapter also presents an evaluation by means of measurements of inter-annotator agreement.
ANNIS
(2004)
In this paper, we discuss the design and implementation of our first version of the database "ANNIS" ("ANNotation of Information Structure"). For research based on empirical data, ANNIS provides a uniform environment for storing this data together with its linguistic annotations. A central database promotes standardized annotation, which facilitates interpretation and comparison of the data. ANNIS is used through a standard web browser and offers tier-based visualization of data and annotations, as well as search facilities that allow for cross-level and cross-sentential queries. The paper motivates the design of the system, characterizes its user interface, and provides an initial technical evaluation of ANNIS with respect to data size and query processing.
Elderly adults (N = 116; average age = 73 years) were randomly assigned to one of four treatment groups varying in the amount of training and testing on fluid intelligence tests. They were compared before and after treatment on self-efficacy and utility beliefs for intelligence tests and everyday competence. Although both ability training and extended retest practice resulted in significant gains in objective test performance (Baltes, Kliegl, & Dittmann-Kohli, 1988), only ability training resulted in positive changes in self-efficacy. However, these changes were restricted to test-related self-efficacy. Training had no impact on perceived utility or on everyday self-efficacy beliefs. Implications of the results are discussed with regard to interventions to increase intellectual self-efficacy in elderly persons.
The main aim of this article is to explore how learning analytics and synchronous collaboration could improve course completion and learner outcomes in MOOCs, which traditionally have been delivered asynchronously. Based on our experience with developing BigBlueButton, a virtual classroom platform that provides educators with live analytics, this paper explores three scenarios with business focused MOOCs to improve outcomes and strengthen learned skills.