Backup validation Validation Backup_validation > Validation Validation is the process of finding out whether a backup attempt succeeded or not, or whether the data is backed up enough to consider it "protected". This process usually involves the examination of log files, the "smoking gun" often left behind after a backup attempt takes place, as well as media databases, data traffic and even magnetic tapes. Patterns can be detected, key error messages identified and statistics extracted in order to determine which backups worked and which did not. According to the 2014 Veeam Availability Report, organizations test their backups for recoverability on average every eight days.
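As a sketch of the log-examination idea, the Python below scans a made-up log excerpt for completion and error markers. The log format, field names, and the `classify_runs` helper are all invented for illustration; real backup tools each have their own log conventions.

```python
import re

# Hypothetical log excerpt; real backup software uses its own format.
log = """\
2024-05-01 02:00:01 job=nightly status=started
2024-05-01 02:41:17 job=nightly status=completed bytes=1073741824
2024-05-02 02:00:01 job=nightly status=started
2024-05-02 02:03:55 job=nightly ERROR: tape drive not ready
"""

def classify_runs(text):
    """Scan log lines and flag each finished run as success or failure."""
    results = []
    for line in text.splitlines():
        if "status=completed" in line:
            results.append(("success", line))
        elif re.search(r"\bERROR\b", line):
            results.append(("failure", line))
    return results

runs = classify_runs(log)
print([outcome for outcome, _ in runs])  # → ['success', 'failure']
```

From such per-run classifications, the statistics mentioned above (success rates, recurring error messages) can be aggregated.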
Analog Signal Processing Convolution Analog_Signal_Processing > Tools used in analog signal processing > Convolution Convolution is a basic concept in signal processing which states that an input signal can be combined with the system's function to find the output signal. It is the integral of the product of two waveforms after one has been reversed and shifted; the symbol for convolution is *. y ( t ) = ( x ∗ h ) ( t ) = ∫ a b x ( τ ) h ( t − τ ) d τ {\displaystyle y(t)=(x*h)(t)=\int _{a}^{b}x(\tau )h(t-\tau )\,d\tau } That is the convolution integral and is used to find the convolution of a signal and a system; typically a = -∞ and b = +∞.
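A minimal numerical sketch of the convolution integral, approximating it with NumPy's discrete convolution sum; the signals and step size here are arbitrary examples, not part of the article.

```python
import numpy as np

# Discrete approximation of y(t) = ∫ x(τ) h(t−τ) dτ on a grid with step dt.
dt = 0.01
t = np.arange(0, 1, dt)
x = np.exp(-t)                   # example input signal
h = np.where(t < 0.5, 1.0, 0.0)  # example system impulse response

# np.convolve computes the discrete convolution sum; multiplying by dt
# approximates the continuous integral.
y = np.convolve(x, h) * dt
```

Each output sample `y[k]` equals `dt * Σ_j x[j] h[k−j]`, the Riemann-sum analogue of the integral above.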
Kelvin–Planck statement Account given by Clausius Second_law_of_thermodynamics > History > Account given by Clausius In terms of time variation, the mathematical statement of the second law for an isolated system undergoing an arbitrary transformation is: d S d t ≥ 0 {\displaystyle {\frac {dS}{dt}}\geq 0} where S is the entropy of the system and t is time. The equality sign applies after equilibration. An alternative way of formulating the second law for isolated systems is: d S d t = S ˙ i {\displaystyle {\frac {dS}{dt}}={\dot {S}}_{i}} with S ˙ i ≥ 0 {\displaystyle {\dot {S}}_{i}\geq 0} with S ˙ i {\displaystyle {\dot {S}}_{i}} the sum of the rate of entropy production by all processes inside the system. The advantage of this formulation is that it shows the effect of the entropy production.
Islanding Passive methods Islanding > Islanding detection methods > Passive methods Passive methods include any system that attempts to detect transient changes on the grid, and use that information as the basis for a probabilistic determination of whether the grid has failed or some other condition has resulted in a temporary change.
Fast algorithms Second definition Quadratic_time > Sub-exponential time > Second definition Some authors define sub-exponential time as running times in 2 o ( n ) {\displaystyle 2^{o(n)}} . This definition allows larger running times than the first definition of sub-exponential time. An example of such a sub-exponential time algorithm is the best-known classical algorithm for integer factorization, the general number field sieve, which runs in time about 2 O ~ ( n 1 / 3 ) {\displaystyle 2^{{\tilde {O}}(n^{1/3})}} , where the length of the input is n. Another example was the graph isomorphism problem, which the best known algorithm from 1982 to 2016 solved in 2 O ( n log ⁡ n ) {\displaystyle 2^{O\left({\sqrt {n\log n}}\right)}} .
Continuous Fourier transform Signal processing Fourier_transforms > Applications > Signal processing The Fourier transform is used for the spectral analysis of time-series. The subject of statistical signal processing does not, however, usually apply the Fourier transformation to the signal itself. Even if a real signal is indeed transient, it has been found in practice advisable to model a signal by a function (or, alternatively, a stochastic process) which is stationary in the sense that its characteristic properties are constant over all time. The Fourier transform of such a function does not exist in the usual sense, and it has been found more useful for the analysis of signals to instead take the Fourier transform of its autocorrelation function.
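The last point is the Wiener–Khinchin relation; a short NumPy sketch of its circular, discrete form shows that the FFT of a signal's autocorrelation reproduces its power spectrum. The random signal below is a stand-in for a sampled stationary process.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 256
x = rng.standard_normal(n)  # stand-in for a stationary signal

# Circular autocorrelation r[k] = (1/n) Σ_m x[m] x[(m+k) mod n]
r = np.array([np.mean(x * np.roll(x, -k)) for k in range(n)])

# Wiener–Khinchin (circular form): the DFT of the autocorrelation is the
# power spectrum, which equals |FFT(x)|^2 / n for this definition of r.
psd_from_r = np.fft.fft(r).real
psd_direct = np.abs(np.fft.fft(x)) ** 2 / n
assert np.allclose(psd_from_r, psd_direct)
```

For a stochastic process one would additionally average over realizations (or use Welch-style segment averaging) to estimate the spectrum of the underlying process rather than of one sample path.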
Twin-width Summary Twin-width The twin-width of an undirected graph is a natural number associated with the graph, used to study the parameterized complexity of graph algorithms. Intuitively, it measures how similar the graph is to a cograph, a type of graph that can be reduced to a single vertex by repeatedly merging together twins, vertices that have the same neighbors. The twin-width is defined from a sequence of repeated mergers where the vertices are not required to be twins, but have nearly equal sets of neighbors.
Process monitoring Development of modern process control operations Process_Control > Development of modern process control operations Effectively this was the centralisation of all the localised panels, with the advantages of lower manning levels and easier overview of the process. Often the controllers were behind the control room panels, and all automatic and manual control outputs were transmitted back to the plant. However, whilst providing a central control focus, this arrangement was inflexible as each control loop had its own controller hardware, and continual operator movement within the control room was required to view different parts of the process.
Generative grammar Historical development of models of transformational grammar Generative_grammar > Frameworks > Historical development of models of transformational grammar Leonard Bloomfield, an influential linguist in the American Structuralist tradition, saw the ancient Indian grammarian Pāṇini as an antecedent of structuralism. However, in Aspects of the Theory of Syntax, Chomsky writes that "even Panini's grammar can be interpreted as a fragment of such a 'generative grammar'", a view that he reiterated in an award acceptance speech delivered in India in 2001, where he claimed that "the first 'generative grammar' in something like the modern sense is Panini's grammar of Sanskrit". Military funding to generativist research was influential in its early success in the 1960s. Generative grammar has been under development since the mid-1950s, and has undergone many changes in the types of rules and representations that are used to predict grammaticality. In tracing the historical development of ideas within generative grammar, it is useful to refer to the various stages in the development of the theory:
The speed of light in vacuum Measurement Speed_of_Light > Measurement There are different ways to determine the value of c. One way is to measure the actual speed at which light waves propagate, which can be done in various astronomical and Earth-based setups. However, it is also possible to determine c from other physical laws where it appears, for example, by determining the values of the electromagnetic constants ε0 and μ0 and using their relation to c. Historically, the most accurate results have been obtained by separately determining the frequency and wavelength of a light beam, with their product equalling c. This is described in more detail in the "Interferometry" section below. In 1983 the metre was defined as "the length of the path travelled by light in vacuum during a time interval of 1⁄299792458 of a second", fixing the value of the speed of light at 299792458 m/s by definition, as described below. Consequently, accurate measurements of the speed of light yield an accurate realization of the metre rather than an accurate value of c.
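A quick numerical check of the relation between c and the electromagnetic constants, c = 1/√(ε0μ0), using CODATA values; this is a sketch of the arithmetic, not a metrological determination.

```python
import math

eps0 = 8.8541878128e-12  # vacuum permittivity, F/m (CODATA)
mu0 = 1.25663706212e-6   # vacuum permeability, N/A^2 (CODATA)

# c = 1 / sqrt(eps0 * mu0)
c = 1 / math.sqrt(eps0 * mu0)
print(round(c))  # close to the defined value 299792458 m/s
```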
Computer-automated design Exhaustive search Computer-automated_design > Exhaustive search In theory, this adjustment process can be automated by computerised search, such as exhaustive search. As this is an exponential algorithm, it may not deliver solutions in practice within a limited period of time.
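A toy illustration of exhaustive search over a small design space; the cost function and parameter ranges are invented for the example, and the 10³ combinations hint at the exponential growth that makes the method impractical for larger problems.

```python
from itertools import product

# Toy design problem: choose three integer parameters in 0..9 to minimize
# a cost function. Exhaustive search tries all 10**3 combinations, which is
# tractable here but grows exponentially with the number of parameters.
def cost(a, b, c):
    return (a - 3) ** 2 + (b - 7) ** 2 + (c - 1) ** 2

best = min(product(range(10), repeat=3), key=lambda p: cost(*p))
print(best)  # → (3, 7, 1)
```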
Renewable energy commercialization Economic trends Renewable_energy_policy > Background > Economic trends As the cost of renewable power falls, the scope of economically viable applications increases. Renewable technologies are now often the most economic solution for new generating capacity. Where "oil-fired generation is the predominant power generation source (e.g. on islands, off-grid and in some countries) a lower-cost renewable solution almost always exists today". As of 2012, renewable power generation technologies accounted for around half of all new power generation capacity additions globally. In 2011, additions included 41 gigawatt (GW) of new wind power capacity, 30 GW of PV, 25 GW of hydro-electricity, 6 GW of biomass, 0.5 GW of CSP, and 0.1 GW of geothermal power.
Relativistic Mass Popular science and textbooks Relativistic_Mass > History of the relativistic mass concept > Popular science and textbooks ... The sound and rigorous approach to relativistic dynamics is through direct development of that expression for momentum that ensures conservation of momentum in all frames, rather than through relativistic mass. C. Alder takes a similarly dismissive stance on mass in relativity. Writing on the subject, he says that "its introduction into the theory of special relativity was much in the way of a historical accident", pointing to the widespread knowledge of E = mc2 and how the public's interpretation of the equation has largely informed how it is taught in higher education.
Genetically modified organisms Plants Genetically_Modified_Organism > Plants Its small genome and short life cycle make it easy to manipulate, and it contains many homologues to important crop species. It was the first plant sequenced, has a host of online resources available and can be transformed by simply dipping a flower in a solution of transformed Agrobacterium. In research, plants are engineered to help discover the functions of certain genes. The simplest way to do this is to remove the gene and see what phenotype develops compared to the wild type form.
Skin temperature (of an atmosphere) Background Skin_temperature_(of_an_atmosphere) > Background The concept of a skin temperature builds on a radiative-transfer model of an atmosphere, in which the atmosphere of a planet is divided into an arbitrary number of layers. Each layer is transparent to the visible radiation from the Sun but acts as a blackbody in the infrared, fully absorbing and fully re-emitting infrared radiation originating from the planet's surface and from other atmospheric layers. Layers are warmer near the surface and colder at higher altitudes. If the planet's atmosphere is in radiative equilibrium, then the uppermost of these opaque layers should radiate infrared radiation upwards with a flux equal to the incident solar flux.
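A minimal sketch of that balance under the standard layered-greenhouse assumption: the topmost opaque layer absorbs the upwelling flux σT_eff⁴ and re-emits it equally upward and downward, so 2σT_skin⁴ = σT_eff⁴ and T_skin = T_eff/2^(1/4). The Earth-like effective temperature used below is illustrative.

```python
# Under the assumption that the skin layer absorbs sigma*Teff**4 from below
# and re-emits equally up and down: 2*sigma*Tskin**4 = sigma*Teff**4.
def skin_temperature(t_eff):
    return t_eff / 2 ** 0.25

# Illustrative value: an effective temperature of ~255 K (Earth-like).
print(round(skin_temperature(255), 1))  # ≈ 214.4 K
```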
GUT theories Motivation GUT_theories > Motivation This would automatically predict the quantized nature and values of all elementary particle charges. Since this also results in a prediction for the relative strengths of the fundamental interactions which we observe, in particular, the weak mixing angle, grand unification ideally reduces the number of independent input parameters but is also constrained by observations. Grand unification is reminiscent of the unification of electric and magnetic forces by Maxwell's field theory of electromagnetism in the 19th century, but its physical implications and mathematical structure are qualitatively different.
TL431 Construction and operation TL431 > Construction and operation The TL431 is a three-terminal bipolar transistor switch, functionally equivalent to an ideal n-type transistor with a stable 2.5 V switching threshold and no apparent hysteresis. "Base", "collector" and "emitter" of this "transistor" are traditionally called reference (R or REF), cathode (C) and anode (A). The positive control voltage, VREF, is applied between reference input and the anode; the output current, ICA, flows from the cathode to the anode.On a functional level the TL431 contains a 2.5 V voltage reference, and an open-loop operational amplifier that compares the input control voltage with the reference. This, however, is merely an abstraction: both functions are inextricably linked inside the TL431's front end.
Premack principle Origin and description Premack_principle > Origin and description An individual will be more motivated to perform a particular activity if they know that they will partake in a more desirable activity as a consequence. Stated objectively, if high-probability behaviors (more desirable behaviors) are made contingent upon lower-probability behaviors (less desirable behaviors), then the lower-probability behaviors are more likely to occur.
Atkinson-Shiffrin memory model Summary Atkinson-Shiffrin_memory_model > Summary A summary of the evidence given for the distinction between long-term and short-term stores is given below. Additionally, Atkinson and Shiffrin included a sensory register alongside the previously theorized primary and secondary memory, as well as a variety of control processes which regulate the transfer of memory. Following its first publication, multiple extensions of the model have been put forth such as a precategorical acoustic store, the search of associative memory model, the perturbation model, and permastore. Additionally, alternative frameworks have been proposed, such as procedural reinstatement, a distinctiveness model, and Baddeley and Hitch's model of working memory, among others.
Domain (function) Summary Domain_(function) In mathematics, the domain of a function is the set of inputs accepted by the function. It is sometimes denoted by dom ⁡ ( f ) {\displaystyle \operatorname {dom} (f)} or dom ⁡ f {\displaystyle \operatorname {dom} f} , where f is the function. In layman's terms, the domain of a function can generally be thought of as "what x can be".More precisely, given a function f: X → Y {\displaystyle f\colon X\to Y} , the domain of f is X. In modern mathematical language, the domain is part of the definition of a function rather than a property of it. In the special case that X and Y are both subsets of R {\displaystyle \mathbb {R} } , the function f can be graphed in the Cartesian coordinate system.
Chemical reaction network theory Overview Chemical_reaction_network > Overview A chemical reaction network (often abbreviated to CRN) comprises a set of reactants, a set of products (often intersecting the set of reactants), and a set of reactions. For example, the pair of combustion reactions 2 H2 + O2 → 2 H2O and C + O2 → CO2 form a reaction network. The reactions are represented by the arrows. The reactants appear to the left of the arrows; in this example they are H 2 {\displaystyle {\ce {H2}}} (hydrogen), O 2 {\displaystyle {\ce {O2}}} (oxygen) and C (carbon).
Heat transfer physics Length and time scales Heat_transfer_physics > Length and time scales Thermophysical properties of matter and the kinetics of interaction and energy exchange among the principal carriers are based on the atomic-level configuration and interaction. Transport properties such as thermal conductivity are calculated from these atomic-level properties using classical and quantum physics. Quantum states of principal carriers (e.g., momentum, energy) are derived from the Schrödinger equation (called first principle or ab initio) and the interaction rates (for kinetics) are calculated using the quantum states and the quantum perturbation theory (formulated as the Fermi golden rule).
Surround suppression Stimulus and attention dependence Surround_suppression > Characteristics: effects on neural responses > Stimulus and attention dependence Surround suppression is also modulated by attention. By training monkeys to attend to certain areas of their visual field, researchers have studied how directed attention can enhance the suppressive effects of stimuli surrounding the area of attention. Similar perceptual studies have been performed on human subjects as well.
Runoff model Linear reservoir Runoff_model_(reservoir) > Linear reservoir The availability of the foregoing runoff equation eliminates the necessity of calculating the total hydrograph by the summation of partial hydrographs using the IUH, as is done with the more complicated convolution method. Determining the response factor A: when the response factor A can be determined from the characteristics of the watershed (catchment area), the reservoir can be used as a deterministic model or analytical model, see hydrological modelling. Otherwise, the factor A can be determined from a data record of rainfall and runoff using the method explained below under non-linear reservoir. With this method the reservoir can be used as a black box model. Conversions: 1 mm/day corresponds to 10 m3/day per ha of the watershed; 1 L/s per ha corresponds to 8.64 mm/day or 86.4 m3/day per ha.
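The stated conversions can be checked directly in Python, using 1 ha = 10,000 m²:

```python
HA_M2 = 10_000.0  # one hectare in square metres

def mm_per_day_to_m3_per_day_per_ha(mm):
    # depth (m) × area (m^2) = volume (m^3)
    return (mm / 1000.0) * HA_M2

def l_per_s_per_ha_to_mm_per_day(lps):
    m3_per_day = lps * 86_400 / 1000.0  # L/s → m^3/day
    return m3_per_day / HA_M2 * 1000.0  # volume/area → depth in mm

assert mm_per_day_to_m3_per_day_per_ha(1) == 10.0
assert abs(l_per_s_per_ha_to_mm_per_day(1) - 8.64) < 1e-9
```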
Anderson acceleration Summary Anderson_acceleration In mathematics, Anderson acceleration, also called Anderson mixing, is a method for the acceleration of the convergence rate of fixed-point iterations. Introduced by Donald G. Anderson, this technique can be used to find the solution to fixed point equations f ( x ) = x {\displaystyle f(x)=x} often arising in the field of computational science.
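A minimal sketch of Anderson mixing with memory m = 1, the simplest nontrivial case; the test function and starting point are illustrative, and production implementations use a longer history with a least-squares solve.

```python
import numpy as np

def anderson_m1(f, x0, tol=1e-10, max_iter=100):
    """Anderson mixing with memory m=1 for the fixed point f(x) = x.

    The next iterate is the affine combination of the two most recent
    function values that minimizes the norm of the mixed residual.
    """
    x_prev, x = x0, f(x0)
    for _ in range(max_iter):
        g_prev, g = f(x_prev), f(x)
        r_prev, r = g_prev - x_prev, g - x  # residuals g(x) - x
        denom = np.dot(r - r_prev, r - r_prev)
        alpha = np.dot(r, r - r_prev) / denom if denom else 0.0
        x_prev, x = x, (1 - alpha) * g + alpha * g_prev
        if np.linalg.norm(f(x) - x) < tol:
            return x
    return x

# Example: x = cos(x) has the fixed point ≈ 0.739085
root = anderson_m1(np.cos, np.array([1.0]))
```

With `alpha = 0` the update reduces to the plain Picard iteration `x ← f(x)`; the mixing step is what accelerates convergence.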
Complement activation Alternative pathway Complement_proteins > Overview > Alternative pathway Accordingly, the alternative complement pathway is one element of innate immunity.Once the alternative C3 convertase enzyme is formed on a pathogen or cell surface, it may bind covalently another C3b, to form C3bBbC3bP, the C5 convertase. This enzyme then cleaves C5 to C5a, a potent anaphylatoxin, and C5b. The C5b then recruits and assembles C6, C7, C8 and multiple C9 molecules to assemble the membrane attack complex. This creates a hole or pore in the membrane that can kill or damage the pathogen or cell.
Topological phase transition Potential impact Topological_phase > Potential impact Landau symmetry-breaking theory is a cornerstone of condensed matter physics. It is used to define the territory of condensed matter research. The existence of topological order appears to indicate that nature is much richer than Landau symmetry-breaking theory has so far indicated. So topological order opens up a new direction in condensed matter physics—a new direction of highly entangled quantum matter.
Rhombohedral symmetry Crystal systems Hexagonal_crystal_family > Crystal systems Hence, the trigonal crystal system is the only crystal system whose point groups have more than one lattice system associated with their space groups. The hexagonal crystal system consists of the 7 point groups that have a single six-fold rotation axis. These 7 point groups have 27 space groups (168 to 194), all of which are assigned to the hexagonal lattice system.
Green's function (many-body theory) Hilbert transform Green's_function_(many-body_theory) > Spatially uniform case > Spectral representation > Hilbert transform The similarity of the spectral representations of the imaginary- and real-time Green functions allows us to define the function G ( k , z ) = ∫ d x ρ ( k , x ) z − x {\displaystyle G(\mathbf {k} ,z)=\int \mathrm {d} x\,{\frac {\rho (\mathbf {k} ,x)}{z-x}}} , which is related to G {\displaystyle {\mathcal {G}}} and G R {\displaystyle G^{\mathrm {R} }} by G ( k , i ω n ) = G ( k , i ω n ) {\displaystyle {\mathcal {G}}(\mathbf {k} ,\mathrm {i} \omega _{n})=G(\mathbf {k} ,\mathrm {i} \omega _{n})} and G R ( k , x ) = G ( k , x + i 0 + ) {\displaystyle G^{\mathrm {R} }(\mathbf {k} ,x)=G(\mathbf {k} ,x+\mathrm {i} 0^{+})} . A similar expression obviously holds for G A {\displaystyle G^{\mathrm {A} }} . The relation between G ( k , z ) {\displaystyle G(\mathbf {k} ,z)} and ρ ( k , x ) {\displaystyle \rho (\mathbf {k} ,x)} is referred to as a Hilbert transform.
Cross-sectional study Advantages Cross‐sectional_design > Healthcare > Advantages The use of routinely collected data allows large cross-sectional studies to be made at little or no expense. This is a major advantage over other forms of epidemiological study. A natural progression has been suggested from cheap cross-sectional studies of routinely collected data which suggest hypotheses, to case-control studies testing them more specifically, then to cohort studies and trials which cost much more and take much longer, but may give stronger evidence. In a cross-sectional survey, a specific group is looked at to see if an activity, say alcohol consumption, is related to the health effect being investigated, say cirrhosis of the liver. If alcohol use is correlated with cirrhosis of the liver, this would support the hypothesis that alcohol use may be associated with cirrhosis.
Stress–strain analysis Load transfer Stress–strain_analysis > Load transfer The evaluation of loads and stresses within structures is directed to finding the load transfer path. Loads will be transferred by physical contact between the various component parts and within structures. The load transfer may be identified visually or by simple logic for simple structures.
Normal maps Normal mapping in video games Normal_maps > Normal mapping in video games Much of this efficiency is made possible by distance-indexed detail scaling, a technique which selectively decreases the detail of the normal map of a given texture (cf. mipmapping), meaning that more distant surfaces require less complex lighting simulation. Many authoring pipelines use high resolution models baked into low/medium resolution in-game models augmented with normal maps. Basic normal mapping can be implemented in any hardware that supports palettized textures.
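A sketch of the usual tangent-space decoding step, under one common convention (RGB in [0, 255] remapped to a unit normal via n = 2·(c/255) − 1; formats and channel conventions vary between engines): the perturbed normal then replaces the geometric one in the lighting model.

```python
import numpy as np

# Decode a tangent-space normal-map texel under the convention
# n = 2*(c/255) - 1, then renormalize.
def decode_normal(rgb):
    n = 2.0 * (np.asarray(rgb, dtype=float) / 255.0) - 1.0
    return n / np.linalg.norm(n)

# The "flat" normal-map color (128, 128, 255) decodes to roughly (0, 0, 1).
n = decode_normal((128, 128, 255))

# Simple Lambertian shading then uses the perturbed normal:
# intensity = max(0, n · light_direction).
light = np.array([0.0, 0.0, 1.0])
intensity = max(0.0, float(n @ light))
```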
Garbage (computer science) Effects Garbage_(computer_science) > Effects Syntactic garbage can be collected automatically, and garbage collectors have been extensively studied and developed. Semantic garbage cannot be automatically collected in general, and thus causes memory leaks even in garbage-collected languages. Detecting and eliminating semantic garbage is typically done using a specialized debugging tool called a heap profiler, which allows one to see which objects are live and how they are reachable, enabling one to remove the unintended reference.
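A small Python illustration of semantic garbage; the cache and usage pattern are invented for the example.

```python
# Semantic garbage: an object that is still reachable (so no garbage
# collector will free it) but will never actually be used again.
cache = {}

def expensive(n):
    if n not in cache:
        cache[n] = [n] * 1_000  # stays reachable via `cache` forever
    return cache[n]

expensive(1)
expensive(2)
# Suppose the program never asks for n=1 again: cache[1] is semantic
# garbage, live by reachability but dead by usage. A heap profiler would
# show it as reachable from the module-level `cache` dict, which is the
# clue needed to evict it or remove the reference.
print(len(cache))  # → 2
```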
Darwinian anthropology Selection pressure on genes or strategy of individuals Darwinian_anthropology > Theoretical background > Selection pressure on genes or strategy of individuals But the idea of the inclusive fitness of an individual is nevertheless a useful one. Just as in the sense of classical selection we may consider whether a given character expressed in an individual is adaptive in the sense of being in the interest of his personal fitness or not, so in the present sense of selection we may consider whether the character or trait of behaviour is or is not adaptive in the sense if being in the interest of his inclusive fitness.” (Hamilton 1996 , 38) It is clear here that the formal treatment is of the selection pressures on types (genes or traits), whilst the notion of individual inclusive fitness may serve as a guide to the adaptiveness of the trait; just as consideration of effects of a trait on an individual's fitness can be instructive when considering classical selection on traits. At the same time, it is understandable that Alexander took the inclusive fitness of individuals as a heuristic device.
Minimum-heap property Applications Minimum-heap_property > Applications The heap data structure has many applications. Heapsort: one of the best sorting methods, being in-place and having no quadratic worst-case scenarios. Selection algorithms: a heap allows access to the min or max element in constant time, and other selections (such as median or kth-element) can be done in sub-linear time on data that is in a heap. Graph algorithms: by using heaps as internal traversal data structures, run time can be reduced by a polynomial order.
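The first two applications can be sketched with Python's `heapq` module, which implements a binary min-heap on a plain list:

```python
import heapq

data = [9, 4, 7, 1, 8, 2]

# Selection: the heap gives constant-time access to the minimum, and
# heapq builds selections like "k smallest" on top of it.
heap = list(data)
heapq.heapify(heap)           # O(n)
assert heap[0] == 1           # min in O(1)
assert heapq.nsmallest(3, data) == [1, 2, 4]

# Heapsort: build a heap, then repeatedly pop the minimum.
def heapsort(xs):
    h = list(xs)
    heapq.heapify(h)
    return [heapq.heappop(h) for _ in range(len(h))]

assert heapsort(data) == sorted(data)
```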
Tumor associated antigen Mechanism of tumor antigenesis Tumor-specific_antigen > Mechanism of tumor antigenesis Normal proteins in the body are not antigenic because of self-tolerance, a process in which self-reacting cytotoxic T lymphocytes (CTLs) and autoantibody-producing B lymphocytes are culled "centrally" in primary lymphatic tissue (BM) and "peripherally" in secondary lymphatic tissue (mostly thymus for T-cells and spleen/lymph nodes for B cells). Thus any protein that is not normally exposed to the immune system can trigger an immune response. This may include normal proteins that are well sequestered from the immune system, proteins that are normally produced in extremely small quantities, proteins that are normally produced only in certain stages of development, or proteins whose structure is modified due to mutation.
View camera Lenses Bellows_camera > Lenses A view camera lens typically consists of: a front lens element, sometimes referred to as a cell; and a shutter, an electronic or spring-actuated mechanism that controls exposure duration (some early shutters were air-actuated).
Brain biopsy Interpretation Brain_biopsy > Interpretation If infection is suspected, the infectious organism can be cultured from the tissue and identified. Classification of tumors is also possible after biopsy.
Optical telescope Principles Optical_telescope > Principles The basic scheme is that the primary light-gathering element, the objective (1) (the convex lens or concave mirror used to gather the incoming light), focuses that light from the distant object (4) to a focal plane where it forms a real image (5). This image may be recorded or viewed through an eyepiece (2), which acts like a magnifying glass. The eye (3) then sees an inverted, magnified virtual image (6) of the object.
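The familiar thin-lens result for this arrangement, angular magnification M = f_objective / f_eyepiece, is not stated in the excerpt but follows from it; a trivial sketch with illustrative focal lengths:

```python
# Angular magnification of a simple telescope: the ratio of the
# objective's focal length to the eyepiece's focal length.
def magnification(f_objective_mm, f_eyepiece_mm):
    return f_objective_mm / f_eyepiece_mm

# Illustrative values: a 1200 mm objective with a 25 mm eyepiece.
print(magnification(1200, 25))  # → 48.0
```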
Fluid physics Summary Continuum_assumption Fluid mechanics, especially fluid dynamics, is an active field of research, typically mathematically complex. Many problems are partly or wholly unsolved and are best addressed by numerical methods, typically using computers. A modern discipline, called computational fluid dynamics (CFD), is devoted to this approach. Particle image velocimetry, an experimental method for visualizing and analyzing fluid flow, also takes advantage of the highly visual nature of fluid flow.
Sensitive information Public information Sensitive_data > Non-sensitive information > Public information This refers to information that is already a matter of public record or knowledge. With regard to government and private organizations, access to or release of such information may be requested by any member of the public, and there are often formal processes laid out for how to do so. The accessibility of government-held public records is an important part of government transparency, accountability to its citizens, and the values of democracy. Public records may furthermore refer to information about identifiable individuals that is not considered confidential, including but not limited to: census records, criminal records, sex offender registry files, and voter registration.
Iterated map Invariant measure Iterated_function > Invariant measure Smaller eigenvalues correspond to unstable, decaying states. In general, because repeated iteration corresponds to a shift, the transfer operator and its adjoint, the Koopman operator, can both be interpreted as shift operators acting on a shift space. The theory of subshifts of finite type provides general insight into many iterated functions, especially those leading to chaos.
Bohmian Mechanics Heisenberg's uncertainty principle Bohmian_quantum_mechanics > Results > Heisenberg's uncertainty principle What one can know about a particle at any given time is described by the wavefunction. Since the uncertainty relation can be derived from the wavefunction in other interpretations of quantum mechanics, it can be likewise derived (in the epistemic sense mentioned above) on the de Broglie–Bohm theory. To put the statement differently, the particles' positions are only known statistically.
Laser Cooling Doppler cooling Laser_cooled > Methods > Doppler cooling Thus if one applies light from two opposite directions, the atoms will always scatter more photons from the laser beam pointing opposite to their direction of motion. In each scattering event the atom loses a momentum equal to the momentum of the photon. If the atom, which is now in the excited state, then emits a photon spontaneously, it will be kicked by the same amount of momentum, but in a random direction.
Meniscus (optics) Construction of simple lenses Meniscus_lens > Construction of simple lenses They have a different focal power in different meridians. This forms an astigmatic lens. An example is eyeglass lenses that are used to correct astigmatism in someone's eye.
Garbage bag Summary Trash_bag Plastic bags are often used for lining litter or waste containers or bins. This keeps the container sanitary by avoiding container contact with the garbage.
Photoemission electron microscopy Initial development Photoemission_electron_microscopy > History > Initial development In 1933, Ernst Brüche reported images of cathodes illuminated by UV light. This work was extended by two of his colleagues, H. Mahl and J. Pohl. Brüche made a sketch of his photoelectron emission microscope in his 1933 paper (Figure 1). This is evidently the first photoelectron emission microscope (PEEM).
Priority R-tree Performance Priority_R-tree > Performance Arge et al. writes that the priority tree always answers window-queries with O ( ( N B ) 1 − 1 d + T B ) {\displaystyle O\left(\left({\frac {N}{B}}\right)^{1-{\frac {1}{d}}}+{\frac {T}{B}}\right)} I/Os, where N is the number of d-dimensional (hyper-) rectangles stored in the R-tree, B is the disk block size, and T is the output size.
Show control Subsystems Show_control > Subsystems The programmable logic controller, or its cousin the small logic controller, would be the next increase in scale. These are commonly used to control subsystems which vary from moderately-sized to very large pieces of equipment.
Alternatives to the Standard Higgs Model List of alternative models Higgsless_model > List of alternative models A partial list of proposed alternatives to a Higgs field as a source for symmetry breaking includes: Technicolor models break electroweak symmetry through new gauge interactions, which were originally modeled on quantum chromodynamics. Extra-dimensional Higgsless models use the fifth component of the gauge fields to play the role of the Higgs fields. It is possible to produce electroweak symmetry breaking by imposing certain boundary conditions on the extra dimensional fields, increasing the unitarity breakdown scale up to the energy scale of the extra dimension. Through the AdS/QCD correspondence this model can be related to technicolor models and to "UnHiggs" models in which the Higgs field is of unparticle nature.
Distribution on a linear algebraic group The Lie algebra of a linear algebraic group Distribution_on_a_linear_algebraic_group > Construction > The Lie algebra of a linear algebraic group Let k be an algebraically closed field and G a linear algebraic group (that is, affine algebraic group) over k. By definition, Lie(G) is the Lie algebra of all derivations of the coordinate ring k[G] that commute with the left action of G. As in the Lie group case, it can be identified with the tangent space to G at the identity element.
Gear housing Summary Gear_housing The gear housing is a mechanical housing that surrounds the mechanical components of a gear box. It provides mechanical support for the moving components, a mechanical protection from the outside world for those internal components, and a fluid-tight container to hold the lubricant that bathes those components.
SimRank Introduction SimRank > Introduction On the Web, for example, two pages are related if there are hyperlinks between them. A similar approach can be applied to scientific papers and their citations, or to any other document corpus with cross-reference information. In the case of recommender systems, a user’s preference for an item constitutes a relationship between the user and the item.
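A naive sketch of the standard SimRank recurrence, s(a, b) = C/(|I(a)||I(b)|) · Σ s(i, j) over in-neighbours, with s(a, a) = 1; the tiny graph and parameters are illustrative, and real implementations use much faster formulations.

```python
import itertools

def simrank(nodes, in_neighbors, C=0.8, iterations=20):
    """Fixed-point iteration of the SimRank recurrence on a small graph."""
    s = {(a, b): 1.0 if a == b else 0.0 for a in nodes for b in nodes}
    for _ in range(iterations):
        new = {}
        for a, b in itertools.product(nodes, nodes):
            if a == b:
                new[(a, b)] = 1.0
                continue
            ia, ib = in_neighbors[a], in_neighbors[b]
            if not ia or not ib:
                new[(a, b)] = 0.0  # no in-links: similarity defined as 0
                continue
            total = sum(s[(i, j)] for i in ia for j in ib)
            new[(a, b)] = C / (len(ia) * len(ib)) * total
        s = new
    return s

# Two pages both linked from the same page score C on the first iteration.
in_nb = {"u": [], "p1": ["u"], "p2": ["u"]}
s = simrank(["u", "p1", "p2"], in_nb)
```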
Graph Style Sheets The GSS Language Graph_Style_Sheets > The GSS Language GSS is a stylesheet language for styling data modeled in RDF and features a cascading mechanism. Its transformation model is loosely based on that of XSLT and its instructions resemble some existing W3C Recommendations such as CSS and SVG. In particular most of the GSS properties accept all values defined by the CSS 2 and SVG 1.0 Recommendations. Any transformation rule of GSS is made of a selector-instruction pair. The left-hand side of a rule is called the selector and the right-hand side is called the instruction. Such sets of rules are collected in a stylesheet (or several cascading stylesheets), and the application responsible for styling the RDF model (a GSS engine) evaluates the relevant rules on the data model (resources, literals and properties) while walking it; that is, if the selector of a rule matches a node (or edge) in the data model, its set of styling instructions is applied to that node (or edge). Conflicts between rules matching the same node (or edge) are resolved by giving different priorities to rules in different stylesheets, and by preferring the most specific selector when the conflicting rules are in the same stylesheet.
Quantum theory of measurement A counterexample Principle_of_uncertainty > Robertson–Schrödinger uncertainty relations > A counterexample These states are normalizable, unlike the eigenstates of the momentum operator on the line. Also the operator A ^ {\displaystyle {\hat {A}}} is bounded, since θ {\displaystyle \theta } ranges over a bounded interval.
Expression of genes Summary Receptor_expression In genetics, gene expression is the most fundamental level at which the genotype gives rise to the phenotype, i.e., the observable traits. The genetic information stored in DNA represents the genotype, whereas the phenotype results from the "interpretation" of that information. Such phenotypes are often displayed by the synthesis of proteins that control the organism's structure and development, or that act as enzymes catalyzing specific metabolic pathways.
Proper orthogonal decomposition POD and PCA Proper_orthogonal_decomposition > POD and PCA The main use of POD is to decompose a physical field (such as pressure or temperature in fluid dynamics, or stress and deformation in structural analysis) according to the different variables that influence its physical behavior. As its name suggests, it performs an orthogonal decomposition along the principal components of the field. It is therefore closely related to the principal component analysis of Pearson in statistics and to the singular value decomposition in linear algebra, since it relies on the eigenvalues and eigenvectors of a physical field. In those domains, it is associated with the work of Karhunen and Loève, and their Karhunen–Loève theorem.
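As a minimal illustration of the eigenvalue/eigenvector connection described above, the dominant POD mode of a set of field snapshots can be extracted from the spatial correlation matrix. This is a pure-Python sketch on a hypothetical rank-one "field" (power iteration stands in for a full eigendecomposition; the snapshot values are invented for the example):

```python
import math

def pod_dominant_mode(snapshots, iters=100):
    """Sketch of the method of snapshots: `snapshots` is a list of
    spatial field samples (each a list of values at the grid points).
    Returns the dominant POD mode via power iteration on the spatial
    correlation matrix C = sum_k u_k u_k^T."""
    n = len(snapshots[0])
    # Build the spatial correlation matrix
    C = [[sum(u[i] * u[j] for u in snapshots) for j in range(n)]
         for i in range(n)]
    # Power iteration for the dominant eigenvector
    v = [1.0] * n
    for _ in range(iters):
        w = [sum(C[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return v

# A rank-one "field": every snapshot is a multiple of (1, 2, 2)/3
base = [1 / 3, 2 / 3, 2 / 3]
snaps = [[a * x for x in base] for a in (0.5, -1.0, 2.0, 3.0)]
mode = pod_dominant_mode(snaps)
```

Since every snapshot is proportional to the same spatial shape, the computed dominant mode recovers that shape (up to sign), which is exactly what POD/PCA promises for data dominated by one component.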
Loop representation The loop transform and the loop representation Loop_representation > The loop representation of Yang–Mills theory > The loop transform and the loop representation The inverse loop transform is defined by Ψ [ A ] = ∫ [ d γ ] Ψ [ γ ] W γ [ A ] {\displaystyle \Psi [A]=\int [d\gamma ]\,\Psi [\gamma ]\,W_{\gamma }[A].} This defines the loop representation.
Interactionism Rejection of positivist methods Interactionism > Methodology > Rejection of positivist methods Interactionists reject statistical (quantitative) data, a method preferred by post-positivists. These methods include: experiments, structured interviews, questionnaires, non-participant observation, and secondary sources. This rejection is based on a few basic criticisms, namely: statistical data is not "valid"; in other words, these methods do not provide a true picture of society on the topic being researched. Quantitative research is biased and therefore not objective: whilst the sociologist would be distant, it is argued that the existence of a hypothesis implies that the research is biased towards a pre-set conclusion (e.g., the Rosenhan experiment in 1973). Therefore, such research is rejected by interactionists, who claim that it is artificial and also raises ethical issues to experiment on people.
Ego defenses Theories and classifications Defence_mechanisms > Theories and classifications According to his theory, reaction formation relates to joy (and manic features), denial relates to acceptance (and histrionic features), repression to fear (and passivity), regression to surprise (and borderline traits), compensation to sadness (and depression), projection to disgust (and paranoia), displacement to anger (and hostility) and intellectualization to anticipation (and obsessionality).The Diagnostic and Statistical Manual of Mental Disorders (DSM-IV) published by the American Psychiatric Association (1994) includes a tentative diagnostic axis for defence mechanisms. This classification is largely based on Vaillant's hierarchical view of defences, but has some modifications. Examples include: denial, fantasy, rationalization, regression, isolation, projection, and displacement.
Consistency model Relaxed write to read and write to write Consistency_model > Relaxed memory consistency models > Relaxed write to read and write to write A counter is used to determine when all the writes before the STBAR instruction have completed: each write issued to the memory system increments the counter, and a write acknowledgement decrements it; when the counter reaches 0, all previous writes have completed. In the examples A and B, PSO allows both of these non-sequentially-consistent results. The safety net that PSO provides is similar to TSO's: it imposes program order from a write to a read and enforces write atomicity. As with the previous models, the relaxations allowed by PSO are not sufficiently flexible to be useful for compiler optimisation, which requires much more flexible reordering.
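The counter mechanism described above can be sketched in a few lines. This is a toy model only (the class and method names are invented for illustration), showing how outstanding writes are tracked and when an STBAR-style barrier is considered satisfied:

```python
class WriteCounterMembar:
    """Toy sketch of the counter mechanism: each write issued to the
    memory system increments a counter, each acknowledgement
    decrements it, and a barrier is considered complete only when
    the counter returns to zero."""

    def __init__(self):
        self.pending = 0

    def issue_write(self):
        self.pending += 1   # write handed to the memory system

    def acknowledge(self):
        assert self.pending > 0
        self.pending -= 1   # memory system confirms completion

    def stbar_complete(self):
        # True once all writes before the barrier have completed
        return self.pending == 0

m = WriteCounterMembar()
m.issue_write()
m.issue_write()
before = m.stbar_complete()   # False: two writes still outstanding
m.acknowledge()
m.acknowledge()
after = m.stbar_complete()    # True: counter back to zero
```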
Membrane potential Summary Cell_excitability The membrane potential has two basic functions. First, it allows a cell to function as a battery, providing power to operate a variety of "molecular devices" embedded in the membrane. Second, in electrically excitable cells such as neurons and muscle cells, it is used for transmitting signals between different parts of a cell.
Unit Disc The open unit disk, the plane, and the upper half-plane Open_unit_disc > The open unit disk, the plane, and the upper half-plane Considered as a Riemann surface, the open unit disk is therefore different from the complex plane. There are conformal bijective maps between the open unit disk and the open upper half-plane. So considered as a Riemann surface, the open unit disk is isomorphic ("biholomorphic", or "conformally equivalent") to the upper half-plane, and the two are often used interchangeably.
Buffer zone Providing aesthetic value Buffer_zone > Ecological functions of conservation > Providing aesthetic value As an important part of riparian zone, the vegetation buffer zones form a variety of landscape, and the landscape pattern of combining land and water improves the aesthetic value of river basin landscape. The riparian buffer is rich in plant resources, and the wetland, grassland and forest ecosystem make the landscape more beautiful. In addition, some recreational facilities can be built in the buffer zone to provide better living conditions for residents or tourists and improve people's quality of life.
AI-assisted reverse engineering Unsupervised learning AI-assisted_reverse_engineering > Techniques > Unsupervised learning Unsupervised learning is used to detect concealed patterns and structures in untagged data. It is useful for understanding complex systems where there is no evident labeling or mapping of components.
Classical Physics Computer modeling and manual calculation, modern and classic comparison Classical_theory > Computer modeling and manual calculation, modern and classic comparison For example, in many formulations from special relativity, a correction factor (v/c)2 appears, where v is the velocity of the object and c is the speed of light. For velocities much smaller than that of light, one can neglect the terms in (v/c)2 and higher powers. These formulas then reduce to the standard definitions of Newtonian kinetic energy and momentum.
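The reduction to the Newtonian formulas can be verified numerically: at everyday speeds the relativistic kinetic energy (γ − 1)mc² is indistinguishable from ½mv², while at a tenth of light speed the (v/c)² correction becomes visible. The specific masses and speeds below are arbitrary illustration values:

```python
import math

def relativistic_ke(m, v, c=299_792_458.0):
    """Exact relativistic kinetic energy, (gamma - 1) m c^2."""
    gamma = 1.0 / math.sqrt(1.0 - (v / c) ** 2)
    return (gamma - 1.0) * m * c ** 2

def newtonian_ke(m, v):
    """Classical limit, obtained by dropping (v/c)^2 corrections."""
    return 0.5 * m * v ** 2

m = 1.0
c = 299_792_458.0
slow = 3000.0      # 3 km/s: v/c ~ 1e-5, classical formula is excellent
fast = 0.1 * c     # 0.1c: the (v/c)^2 correction (~0.75%) shows up

err_slow = abs(relativistic_ke(m, slow) - newtonian_ke(m, slow)) / relativistic_ke(m, slow)
err_fast = abs(relativistic_ke(m, fast) - newtonian_ke(m, fast)) / relativistic_ke(m, fast)
```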
Probability density Summary Probability_density_function The probability density function is nonnegative everywhere, and the area under the entire curve is equal to 1. The terms probability distribution function and probability function have also sometimes been used to denote the probability density function. However, this use is not standard among probabilists and statisticians.
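The normalization property ("the area under the entire curve is equal to 1") can be checked numerically for a standard example. The normal density and a simple trapezoidal integrator are used here purely as an illustration:

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Density of the normal distribution, a standard example of a
    probability density function."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def trapezoid_area(f, a, b, n=100_000):
    """Approximate the area under f on [a, b] with the trapezoidal rule."""
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b))
    total += sum(f(a + i * h) for i in range(1, n))
    return total * h

# The density is nonnegative and its total area is (numerically) 1;
# mass beyond +/-10 standard deviations is negligible.
area = trapezoid_area(normal_pdf, -10, 10)
```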
Artificial Neural Networks History Parameter_(machine_learning) > History He called it the neocognitron. In 1969, he also introduced the ReLU (rectified linear unit) activation function. The rectifier has become the most popular activation function for CNNs and deep neural networks in general.
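The rectifier mentioned above is a one-line function; this small sketch simply shows its defining behavior (pass positive inputs through, clip negative inputs to zero):

```python
def relu(x):
    """Rectified linear unit: max(0, x)."""
    return max(0.0, x)

outputs = [relu(x) for x in (-2.0, -0.5, 0.0, 1.5)]
```

Its simplicity and non-saturating positive branch are among the reasons it became the default activation in deep networks.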
Bitstate hashing Use Bitstate_hashing > Use Bitstate hashing is utilized in the SPIN model checker for deciding whether a state was already visited by a nested depth-first search algorithm. This purportedly led to memory savings of 98% when one hash function was used (175 MB to 3 MB) and 92% when two hash functions were used (175 MB to 13 MB). The state coverage dropped to 97% in the former case.
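The visited-state test can be sketched as a Bloom-filter-style bit array. This is an illustrative implementation, not SPIN's actual code; the table size, the use of salted SHA-256 as the k hash functions, and the state strings are all choices made for the example:

```python
import hashlib

class BitstateTable:
    """Sketch of bitstate hashing: each visited state sets k bits in
    a fixed-size bit array. Membership tests may report false
    positives (hash collisions) but never false negatives, which is
    why state coverage can drop below 100%."""

    def __init__(self, bits=1 << 20, k=2):
        self.size = bits
        self.k = k
        self.array = bytearray(bits // 8)

    def _positions(self, state):
        # Derive k bit positions from salted hashes of the state
        for i in range(self.k):
            digest = hashlib.sha256(f"{i}:{state}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.size

    def add(self, state):
        for p in self._positions(state):
            self.array[p // 8] |= 1 << (p % 8)

    def probably_visited(self, state):
        return all(self.array[p // 8] & (1 << (p % 8))
                   for p in self._positions(state))

table = BitstateTable()
table.add("s0->s1->s4")
seen = table.probably_visited("s0->s1->s4")   # True: all k bits set
fresh = table.probably_visited("s0->s2")      # False barring a collision
```

The memory figures in the text reflect exactly this trade: a fixed bit array replaces full state storage, at the cost of occasionally mistaking an unvisited state for a visited one.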
Max Born General references Max_Born > General references Bernstein, Jeremy (2005). "Max Born and the Quantum Theory". American Journal of Physics. 73 (11): 999–1008.
Biotin Cofactor biochemistry Biotin > Cofactor biochemistry PCC catalyzes a step in the metabolism of propionyl-CoA. Metabolic degradation of the biotinylated carboxylases leads to the formation of biocytin. This compound is further degraded by biotinidase to release biotin, which is then reutilized by holocarboxylase synthetase.Biotinylation of histone proteins in nuclear chromatin is a posttranslational modification that plays a role in chromatin stability and gene expression.
Co-occurrence matrix Summary Co-occurrence_matrix A co-occurrence matrix or co-occurrence distribution (also referred to as: gray-level co-occurrence matrices GLCMs) is a matrix that is defined over an image to be the distribution of co-occurring pixel values (grayscale values, or colors) at a given offset. It is used as an approach to texture analysis with various applications especially in medical image analysis.
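The definition ("distribution of co-occurring pixel values at a given offset") translates directly into code. This is a minimal pure-Python sketch; the 4x4 test image and the horizontal offset are illustrative choices:

```python
def glcm(image, offset=(0, 1), levels=4):
    """Gray-level co-occurrence matrix: counts how often the pixel
    value pair (i, j) occurs at the given (row, col) offset."""
    dr, dc = offset
    rows, cols = len(image), len(image[0])
    matrix = [[0] * levels for _ in range(levels)]
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dr, c + dc
            if 0 <= r2 < rows and 0 <= c2 < cols:
                matrix[image[r][c]][image[r2][c2]] += 1
    return matrix

img = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 2, 2, 2],
    [2, 2, 3, 3],
]
M = glcm(img)   # co-occurrences of horizontal neighbors (1 pixel right)
```

Texture features (contrast, homogeneity, entropy and so on) are then computed as statistics of this matrix rather than of the raw pixels.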
Cryptographic key types Key types Cryptographic_key_types > Key types

Symmetric key wrapping key
Symmetric key wrapping keys are used to encrypt other keys using symmetric key algorithms. Key wrapping keys are also known as key encrypting keys.

Symmetric and asymmetric random number generation keys
These are keys used to generate random numbers.
Coleman–Weinberg potential Summary Coleman–Weinberg_potential The same can happen in other gauge theories. In the broken phase, the fluctuations of the scalar field ϕ {\displaystyle \phi } will manifest themselves as a naturally light Higgs boson, in fact too light to explain the electroweak symmetry breaking in the minimal model: much lighter than the vector bosons. There are non-minimal models that give more realistic scenarios.
Seismic architecture Failure modes Earthquake_engineering_research > Seismic design > Failure modes Separation between the framing and the walls can jeopardize the vertical support of roof and floor systems. Soft story effect: the absence of adequate stiffness at the ground level can cause this failure mode.
TCP SYN Hardware implementations TCP_checksum_offload > Hardware implementations One way to overcome the processing power requirements of TCP is to build hardware implementations of it, widely known as TCP offload engines (TOE). The main problem of TOEs is that they are hard to integrate into computing systems, requiring extensive changes in the operating system of the computer or device. One company to develop such a device was Alacritech.
Coefficient of viscosity Eddy viscosity Viscous_friction > Molecular origins > Eddy viscosity In the study of turbulence in fluids, a common practical strategy is to ignore the small-scale vortices (or eddies) in the motion and to calculate a large-scale motion with an effective viscosity, called the "eddy viscosity", which characterizes the transport and dissipation of energy in the smaller-scale flow (see large eddy simulation). In contrast to the viscosity of the fluid itself, which must be positive by the second law of thermodynamics, the eddy viscosity can be negative.
Amiga Fast File System Summary Amiga_Fast_File_System The Amiga Fast File System (abbreviated AFFS, or more commonly historically as FFS) is a file system used on the Amiga personal computer. The previous Amiga filesystem was never given a specific name and known originally simply as "DOS" or AmigaDOS. Upon the release of FFS, the original filesystem became known as Amiga Old File System (OFS).
WSA Process The process WSA_Process > The process The main reactions in the WSA process:

Combustion: 2 H2S + 3 O2 ⇌ 2 H2O + 2 SO2 (−1036 kJ/mol)
Oxidation: 2 SO2 + O2 ⇌ 2 SO3 (−198 kJ/mol)
Hydration: SO3 + H2O ⇌ H2SO4 (g) (−101 kJ/mol)
Condensation: H2SO4 (g) ⇌ H2SO4 (l) (−90 kJ/mol)

The energy released by the above-mentioned reactions is used for steam production. Approximately 2–3 tons of high-pressure steam are produced per ton of acid.
QTY Code The QTY code QTY_Code > The QTY code Thus, it can be used broadly. The QTY Code has implications for designing additional GPCRs and other membrane proteins, including cytokine receptors that are directly involved in cytokine storm syndrome. The QTY Code has also been applied to water-soluble variants of cytokine receptors with the aim of combatting the cytokine storm syndrome (also called cytokine release syndrome) suffered by cancer patients receiving CAR-T therapy. This therapeutic application may be equally applicable to severely infected COVID-19 patients, for whom cytokine storms often lead to death. In 2021, structure predictions made by the AlphaFold 2 program supported the validity of the QTY code.
Transition metal complex Stability constant Coordination_complex > Stability constant Formation constants vary widely. Large values indicate that the metal has a high affinity for the ligand, provided the system is at equilibrium. Sometimes the stability constant is expressed in a different form, known as the constant of destability. This constant is the inverse of the constant of formation and is denoted Kd = 1/Kf.
Regulatory macrophages Mreg origin and induction Regulatory_macrophages > Mreg origin and induction A similar effect is provoked by the interaction of macrophages with B1 B cells. Mregs can even arise following stress responses: activation of the hypothalamic-pituitary-adrenal axis leads to production of glucocorticoids, which cause decreased production of IL-12 by macrophages. Many cell types, including monocytes, M1 and M2 macrophages, can differentiate into Mregs in a specific microenvironment.
Lipotoxicity Kidneys Lipotoxicity > Effects in different organs > Kidneys Renal lipotoxicity occurs when excess long-chain nonesterified fatty acids are stored in the kidney and proximal tubule cells. It is believed that these fatty acids are delivered to the kidneys via serum albumin. This condition leads to tubulointerstitial inflammation and fibrosis in mild cases, and to kidney failure and death in severe cases. The current accepted treatments for lipotoxicity in renal cells are fibrate therapy and intensive insulin therapy.
Hearing conservation program Noise reduction ratings Hearing_conservation_program > Noise reduction ratings The United States Environmental Protection Agency (EPA) requires that all hearing protection devices be labeled with their associated noise reduction rating (NRR). The NRR provides the estimated attenuation of the hearing protection device. The NRR obtained in the lab is often higher than the attenuation provided in the field. To determine the amount of noise reduction afforded by a hearing protection device for the A weighted scale, OSHA recommends that 7 dB be subtracted from the NRR.
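The OSHA adjustment described above is simple arithmetic: subtract 7 dB from the labeled NRR before applying it to an A-weighted noise measurement. The specific noise level and NRR below are hypothetical example values:

```python
def protected_exposure_dba(noise_dba, nrr):
    """Estimated A-weighted exposure under a hearing protector,
    using OSHA's guidance of subtracting 7 dB from the labeled NRR
    when the noise measurement is A-weighted."""
    return noise_dba - (nrr - 7)

# Hypothetical example: 95 dBA noise, earmuffs labeled NRR 29
exposure = protected_exposure_dba(95, 29)   # 95 - (29 - 7) = 73 dBA
```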
Lung consolidation Summary Pulmonary_consolidation Consolidation occurs through accumulation of inflammatory cellular exudate in the alveoli and adjoining ducts. The liquid can be pulmonary edema, inflammatory exudate, pus, inhaled water, or blood (from bronchial tree or hemorrhage from a pulmonary artery). Consolidation must be present to diagnose pneumonia: the signs of lobar pneumonia are characteristic and clinically referred to as consolidation.
Darwin's Dangerous Idea Part II: Darwinian Thinking in Biology Darwin's_Dangerous_Idea > Synopsis > Part II: Darwinian Thinking in Biology Dennett thinks adaptationism is, in fact, the best way of uncovering constraints. The tenth chapter, entitled "Bully for Brontosaurus", is an extended critique of Stephen Jay Gould, who Dennett feels has created a distorted view of evolution with his popular writings; his "self-styled revolutions" against adaptationism, gradualism and other orthodox Darwinism all being false alarms. The final chapter of part II dismisses directed mutation, the inheritance of acquired traits and Teilhard's "Omega Point", and insists that other controversies and hypotheses (like the unit of selection and Panspermia) have no dire consequences for orthodox Darwinism.
Rational consensus Spokescouncil Rational_consensus > Process models > Spokescouncil In the spokescouncil model, affinity groups make joint decisions by each designating a speaker and sitting behind that circle of spokespeople, akin to the spokes of a wheel. While speaking rights might be limited to each group's designee, the meeting may allot breakout time for the constituent groups to discuss an issue and return to the circle via their spokesperson. In the case of an activist spokescouncil preparing for the A16 Washington D.C.
Engineering control Falls Engineering_control > Physical hazards > Falls Fall protection is the use of controls designed to protect personnel from falling or in the event they do fall, to stop them without causing severe injury. Typically, fall protection is implemented when working at height, but may be relevant when working near any edge, such as near a pit or hole, or performing work on a steep surface. According to the US Department of Labor, falls account for 8% of all work-related trauma injuries leading to death.Fall guarding is the use of guard rails or other barricades to prevent a person from falling.
Industrial catalysts Carbon Monoxide Industrial_catalysts > Water gas shift reaction > Carbon Monoxide CO is a common molecule to use in a catalytic reaction, and when it interacts with a metal surface, it is actually the molecular orbitals of CO that interact with the d-band of the metal surface. In a molecular orbital (MO) diagram, CO can act as a σ-donor via the lone pair of electrons on C, and as a π-acceptor ligand in transition metal complexes. When a CO molecule is adsorbed on a metal surface, the d-band of the metal interacts with the molecular orbitals of CO.
Timeline of mathematics 20th century Timeline_of_mathematics > Symbolic stage > Contemporary > 20th century 1936 – Alonzo Church and Alan Turing create, respectively, the λ-calculus and the Turing machine, formalizing the notion of computation and computability. 1938 – Tadeusz Banachiewicz introduces LU decomposition.
Rete algorithm Production execution Rete_algorithm > Description > Production execution Some engines support advanced refraction strategies in which certain production instances executed in a previous cycle are not re-executed in the new cycle, even though they may still exist on the agenda. It is possible for the engine to enter into never-ending loops in which the agenda never reaches the empty state. For this reason, most engines support explicit "halt" verbs that can be invoked from production action lists.
Scientific hypothesis Honours Scientific_hypothesis > Honours Mount Hypothesis in Antarctica is named in appreciation of the role of hypothesis in scientific research.
Variable renewable energy Background and terminology Variable_renewable_energy > Background and terminology Most of these terms also apply to traditional power plants. Intermittency or variability is the extent to which a power source fluctuates. This has two aspects: a predictable variability (such as the day-night cycle) and an unpredictable part (imperfect local weather forecasting).
0.999... Algebraic arguments 0.999... > Algebraic arguments Students who did not accept the first argument sometimes accept the second argument, but, in Byers's opinion, still have not resolved the ambiguity, and therefore do not understand the representation for infinite decimals. Peressini & Peressini (2007), presenting the same argument, also state that it does not explain the equality, indicating that such an explanation would likely involve concepts of infinity and completeness. Baldwin & Norton (2012), citing Katz & Katz (2010a), also conclude that the treatment of the identity based on such arguments as these, without the formal concept of a limit, is premature. The same argument is also given by Richman (1999), who notes that skeptics may question whether x is cancellable – that is, whether it makes sense to subtract x from both sides.
Van der Waals surface area Van der Waals volume and Van der Waals surface area Van_der_Waals_surface_area > Van der Waals volume and Van der Waals surface area Van der Waals radii and volumes may be determined from the mechanical properties of gases (the original method, determining the Van der Waals constant), from the critical point (e.g., of a fluid), from crystallographic measurements of the spacing between pairs of unbonded atoms in crystals, or from measurements of electrical or optical properties (i.e., polarizability or molar refractivity). In all cases, measurements are made on macroscopic samples and the results are expressed as molar quantities. The Van der Waals volume of a single atom or molecule is obtained by dividing the macroscopically determined volume by the Avogadro constant.
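The molar-to-molecular conversion is a single division by the Avogadro constant. The molar volume used below is an illustrative round number, not a measured value for any particular substance:

```python
AVOGADRO = 6.02214076e23   # mol^-1 (exact value in the 2019 SI)

def volume_per_molecule(molar_volume_cm3):
    """Convert a macroscopically determined molar volume (cm^3/mol)
    to the volume of a single molecule (cm^3)."""
    return molar_volume_cm3 / AVOGADRO

# Illustrative molar Van der Waals volume of ~18 cm^3/mol
v_single = volume_per_molecule(18.0)   # ~3e-23 cm^3 per molecule
```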
Constant of aberration Stokes' aether drag Aberration_of_starlight > Historical theories of aberration > Aether drag models > Stokes' aether drag However, the fact that light is polarized (discovered by Fresnel himself) led scientists such as Cauchy and Green to believe that the aether was a totally immobile elastic solid as opposed to Fresnel's fluid aether. There was thus renewed need for an explanation of aberration consistent both with Fresnel's predictions (and Arago's observations) as well as polarization. In 1845, Stokes proposed a 'putty-like' aether which acts as a liquid on large scales but as a solid on small scales, thus supporting both the transverse vibrations required for polarized light and the aether flow required to explain aberration. Making only the assumptions that the fluid is irrotational and that the boundary conditions of the flow are such that the aether has zero velocity far from the Earth, but moves at the Earth's velocity at its surface and within it, he was able to completely account for aberration.
Electronic envelope Summary Email_encryption Email encryption is encryption of email messages to protect the content from being read by entities other than the intended recipients. Email encryption may also include authentication. Email is prone to the disclosure of information.
Hypohalous acid Summary Hypohalous_acid A hypohalous acid is an oxyacid consisting of a hydroxyl group single-bonded to any halogen. Examples include hypofluorous acid, hypochlorous acid, hypobromous acid, and hypoiodous acid. The conjugate base is a hypohalite. They can be formed by reacting the corresponding diatomic halogen molecule (F2, Cl2, Br2, I2) with water in the reaction: X2 + H2O ⇌ HXO + HX. This also results in the corresponding hydrogen halide, which is also acidic.
Misconceptions about HIV/AIDS Cure Misconceptions_about_HIV/AIDS > Treatment > Cure However, these advances do not constitute a cure, since current treatment regimens cannot eradicate latent HIV from the body. High levels of HIV-1 (often HAART-resistant) develop if treatment is stopped, if compliance with treatment is inconsistent, or if the virus spontaneously develops resistance to an individual's regimen. Antiretroviral treatment known as post-exposure prophylaxis reduces the chance of acquiring an HIV infection when administered within 72 hours of exposure to HIV. However, an overwhelming body of clinical evidence has demonstrated the U=U (undetectable = untransmittable) rule: if someone's viral load is undetectable (<200 viral copies per mL), they cannot transmit the virus. Essentially, this means that if a person living with HIV is well controlled on medications, with a viral load of less than 200, they cannot transmit HIV to their partners via sexual contact. The landmark study that first established this was the HPTN052 study, which looked at over 2000 couples over 10 years, where one partner was HIV positive and the other partner was HIV negative.
Memory consistency Example Memory_consistency > Example Assume that the following case occurs:

The row X is replicated on nodes M and N
The client A writes row X to node M
After a period of time t, client B reads row X from node N

The consistency model determines whether client B will definitely see the write performed by client A, will definitely not, or cannot depend on seeing the write.
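The scenario can be sketched as a toy two-replica store. This is purely illustrative (the class and method names are invented): under an eventually consistent model, client B's read from node N before replication completes does not see client A's write, while a read after propagation does:

```python
class Replica:
    """A node holding a copy of row X."""
    def __init__(self):
        self.row_x = None

class EventuallyConsistentStore:
    """Toy sketch: a write lands on one node immediately and reaches
    the other replica only when propagate() runs, so a read from the
    other node in between may not see the write."""
    def __init__(self):
        self.m = Replica()
        self.n = Replica()

    def write_to_m(self, value):
        self.m.row_x = value          # client A writes row X to node M

    def propagate(self):
        self.n.row_x = self.m.row_x   # replication after time t

    def read_from_n(self):
        return self.n.row_x           # client B reads row X from node N

store = EventuallyConsistentStore()
store.write_to_m("v1")
stale = store.read_from_n()     # None: replication has not run yet
store.propagate()
fresh = store.read_from_n()     # "v1": the write is now visible on N
```

A strongly consistent model would instead guarantee that the read following the write returns "v1" regardless of which node serves it.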