Enhanced TV Binary Interchange Format
Actions
The following categories of actions are defined by EBIF: Flow of Control Actions, Predicate Actions, Variable Store Actions, Arithmetic Actions, Boolean Logic Actions, Mathematic Actions, String Actions, Array Actions, Application and Page Actions, Widget Actions, Table Actions, and Miscellaneous Actions.

Memory Model

The action memory model is based on a variable store and does not make use of registers or a stack. With the exception of one predefined, internal result value variable, all variables are preallocated (and typed) at compilation time. These variables are represented in the form of a table referred to as an augmented reference table, whose content is initialized at compilation time, then stored and mutated at runtime by an ETV User Agent.
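The variable store can be pictured as a flat table of typed slots. The following Python sketch is illustrative only; the class and field names are invented, not taken from the EBIF specification. It models a table whose slots are all allocated and typed up front, plus the single predefined result variable.

```python
from dataclasses import dataclass
from typing import Any

@dataclass
class Slot:
    var_type: type  # type fixed when the application is compiled
    value: Any      # initialized at compile time, mutated at runtime

class VariableStore:
    """Register- and stack-free memory model: a fixed table of typed slots."""

    def __init__(self, compiled_table):
        # The augmented reference table arrives fully allocated; no slots
        # are created or destroyed while the user agent runs.
        self.table = [Slot(t, v) for t, v in compiled_table]
        self.result = None  # the single predefined, internal result variable

    def load(self, index):
        return self.table[index].value

    def store(self, index, value):
        slot = self.table[index]
        if not isinstance(value, slot.var_type):
            raise TypeError(f"slot {index} expects {slot.var_type.__name__}")
        slot.value = value

store = VariableStore([(int, 0), (str, "hello")])
store.store(0, 42)
store.result = store.load(0) + 1  # result variable is the one untyped exception
```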
Enhanced TV Binary Interchange Format
Actions
Execution Model

The action execution model is based on the decoding and processing of action sequences that serve as event handlers. Execution of action sequences is serialized through the sequential dispatching of events to event handlers: an action sequence functioning as an event handler runs to completion before any other applicable event handler (for that event) executes, and before any other enqueued event is processed.
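A minimal sketch of this run-to-completion dispatching, again with invented names rather than anything from the EBIF specification:

```python
from collections import deque

class Dispatcher:
    """Run-to-completion dispatch: each action sequence finishes before any
    other handler for the same event, or any later queued event, runs."""

    def __init__(self):
        self.handlers = {}    # event name -> ordered list of handlers
        self.queue = deque()  # pending events, processed strictly in order

    def on(self, event, handler):
        self.handlers.setdefault(event, []).append(handler)

    def post(self, event):
        self.queue.append(event)

    def run(self):
        while self.queue:
            event = self.queue.popleft()
            for handler in self.handlers.get(event, []):
                handler()  # executes fully before the next handler is invoked

d = Dispatcher()
d.on("key", lambda: print("first handler"))
d.on("key", lambda: print("second handler"))
d.post("key")
d.run()
```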
Perennial crop
Perennial crop
Perennial crops are crops that, unlike annual crops, do not need to be replanted each year; after harvest, they grow back on their own. Many fruit and nut crops are naturally perennial, but there is also a growing movement to create perennial alternatives to annual crops. From the 1920s to the 1950s, researchers in the former Soviet Union attempted to perennialize annual wheats by crossing them with perennial relatives such as intermediate wheatgrass. Interest waned when the crosses repeatedly resulted in sterile offspring and significantly decreased seed yield. The effort to perennialize grain was next taken up in 1986, when the Montana Agricultural Experiment Station developed a wheat hybrid that the Rodale Institute field tested. More recently, The Land Institute has bred a perennial wheat crop known as Kernza. By eliminating or greatly reducing the need for tillage, perennial cropping can reduce topsoil losses due to erosion, increase biological carbon sequestration, and greatly reduce waterway pollution from agricultural runoff due to lower nitrogen input.
Perennial crop
Benefits
Erosion control: Because plant materials (stems, crowns, etc.) can remain in place year-round, topsoil erosion due to wind and rainfall/irrigation is reduced.
Water-use efficiency: Because these crops tend to be deeper and more fibrously rooted than their annual counterparts, they hold onto soil moisture more efficiently while filtering pollutants (e.g. excess nitrogen) traveling to groundwater sources.
Nutrient cycling efficiency: Because perennials take up nutrients more efficiently as a result of their extensive root systems, fewer supplemental nutrients are needed, lowering production costs while reducing possible excess sources of fertilizer runoff.
Light interception efficiency: Earlier canopy development and longer green leaf duration increase the seasonal light interception efficiency of perennials, an important factor in plant productivity.
Carbon sequestration: Because perennial grasses use a greater fraction of carbon to produce root systems, more carbon is integrated into soil organic matter, contributing to increases in soil organic carbon stocks.
Perennial species have been shown to provide an opportunity for mitigating or reducing the negative effects of climate change while sustaining agricultural productivity. Perennial plant communities may also enhance ecosystem resilience, stability, and the ability to adapt to environmental fluctuations, owing to their high levels of biodiversity.
Perennial crop
Examples
Existing crops: fruit trees, oil palm, edible berries, asparagus, rhubarb, chives, mint, oregano, and kale.

Under development:
Miscanthus giganteus - a perennial crop with high yields and high GHG mitigation potential.
Perennial sunflower - a perennial oil and seed crop developed through backcrossing genes with wild sunflower.
Perennial grain - more extensive root systems allow for more efficient water and nutrient uptake, while reducing erosion due to rain and wind year-round.
Perennial rice - currently in development using methods similar to those used to perennialize the sunflower; perennial rice promises to reduce deforestation through increased production efficiency, keeping cleared land out of the fallow stage for long periods of time.
SYSGO
SYSGO
SYSGO GmbH is a German information technology company that supplies Linux-based operating systems and services for embedded systems with high safety- and security-related requirements. For security-critical applications, the company offers the hypervisor and RTOS PikeOS, an operating system for multicore processors and the foundation for intelligent devices in the Internet of Things (IoT). As an operating system provider, SYSGO supports companies with the formal certification of software to international safety and security standards in markets such as aerospace and defence, industrial automation, automotive, railway, medical, and network infrastructure. SYSGO participates in a variety of international research projects and standardisation initiatives in the area of safety and security.
SYSGO
History
SYSGO was founded in 1991. On the initiative of company founder Knut Degen, the company specialized in the use of Linux-based operating systems in embedded applications. In the 1990s, SYSGO worked mainly with LynxOS. In 1999, the company launched its first in-house product, a development environment for Linux-based embedded applications named ELinOS. SYSGO introduced the first version of its PikeOS real-time operating system in 2005. With hypervisor functionality integrated into its basic structure, this operating system allows multiple embedded applications with different functional safety requirements to be operated on the same processor. The current version of PikeOS can run safety-critical applications in aerospace, automotive, rail and other industrial domains. 2009 saw the market launch of a software-only implementation of an AFDX stack (Avionics Full DupleX Switched Ethernet) for Safety-Critical Ethernet in accordance with ARINC-664 Part 7, which was certified to DO-178B. In 2013, SYSGO also achieved SIL 4 certification on multicore processors for EN 50128, a European standard for safety-relevant software used in railway applications. The company's first subsidiary was established in Ulm in 1997, followed by Prague (2004), Paris (2005) and Rostock (2008). In 2012, SYSGO was taken over by the Thales Group of France. In 2019, SYSGO built its new headquarters in Klein-Winternheim, near Mainz, and moved in by April 2020.
SYSGO
Products and services
SYSGO's best-known product is PikeOS, a real-time operating system with a separation-kernel-based hypervisor that provides multiple partitions for a variety of guest operating systems and assigns them time schedules.
SYSGO
Products and services
Other products include: ELinOS, a Linux operating system for embedded applications; Safety-Critical Ethernet/AFDX, a software implementation of ARINC-664 Part 7; and various components required for certification. The PikeOS hypervisor forms a foundation for critical systems in which both safety and security have to be ensured. The company also offers various certification kits, which include, for example, support documentation for development and testing and, if necessary, additional safety and security information to allow the development of standards-compliant systems.
SYSGO
Research
SYSGO is the technical lead for the EU research project certMILS. The goal of certMILS is primarily to make a certified European MILS platform available, and thus simplify the certification of composite IT systems. The project is supported by the EU as part of the Horizon 2020 programme.
SYSGO
Customers and partner network
Customers include companies working, inter alia, on solutions for the Internet of Things, especially suppliers and manufacturers in the aerospace and defence, automotive, railway, and industrial sectors with high safety and security requirements for their applications.
Agraphia
Agraphia
Agraphia is an acquired neurological disorder causing a loss in the ability to communicate through writing, either due to some form of motor dysfunction or an inability to spell. The loss of writing ability may present with other language or neurological disorders; disorders appearing commonly with agraphia are alexia, aphasia, dysarthria, agnosia, acalculia and apraxia. The study of individuals with agraphia may provide more information about the pathways involved in writing, both language related and motoric. Agraphia cannot be directly treated, but individuals can learn techniques to help regain and rehabilitate some of their previous writing abilities. These techniques differ depending on the type of agraphia.
Agraphia
Agraphia
Agraphia can be broadly divided into central and peripheral categories. Central agraphias typically involve language areas of the brain, causing difficulty spelling or with spontaneous communication, and are often accompanied by other language disorders. Peripheral agraphias usually target motor and visuospatial skills in addition to language and tend to involve motoric areas of the brain, causing difficulty in the movements associated with writing. Central agraphia may also be called aphasic agraphia, as it involves areas of the brain whose major functions are connected to language and writing; peripheral agraphia may also be called nonaphasic agraphia, as it involves areas of the brain whose functions are not directly connected to language and writing (typically motor areas). The history of agraphia dates to the mid-sixteenth century, but it was not until the second half of the nineteenth century that it sparked significant clinical interest. Research in the twentieth century focused primarily on aphasiology in patients with lesions from strokes.
Agraphia
Characteristics
Agraphia, or impairment in producing written language, can occur in many ways and many forms because writing involves many cognitive processes (language processing, spelling, visual perception, visuospatial orientation for graphic symbols, motor planning, and motor control of handwriting). Agraphia has two main subgroupings: central ("aphasic") agraphia and peripheral ("nonaphasic") agraphia. Central agraphias include lexical, phonological, deep, and semantic agraphia. Peripheral agraphias include allographic, apraxic, motor execution, hemianoptic, and afferent agraphia.
Agraphia
Characteristics
Central

Central agraphia occurs when there are both impairments in spoken language and impairments to the various motor and visualization skills involved in writing. Individuals who have agraphia with fluent aphasia write a normal quantity of well-formed letters but lack the ability to write meaningful words. Receptive aphasia is an example of fluent aphasia. Those who have agraphia with nonfluent aphasia can write brief sentences, but their writing is difficult to read; it requires great physical effort but lacks proper syntax and often has poor spelling. Expressive aphasia is an example of nonfluent aphasia. Individuals who have alexia with agraphia have difficulty with both the production and comprehension of written language. This form of agraphia does not impair spoken language.
Agraphia
Characteristics
Deep agraphia affects an individual's phonological ability and orthographic memory. Deep agraphia is often the result of a lesion involving the left parietal region (supramarginal gyrus or insula). Individuals can neither remember how words look when spelled correctly, nor sound them out to determine spelling. Individuals typically rely on their damaged orthographic memory to spell; this results in frequent errors, usually semantic in nature. Individuals have more difficulty with abstract concepts and uncommon words. Reading and spoken language are often impaired as well.
Agraphia
Characteristics
Gerstmann syndrome agraphia is the impairment of written language production associated with the following symptoms: difficulty discriminating between one's own fingers, difficulty distinguishing left from right, and difficulty performing calculations. All four of these symptoms result from pathway lesions. Gerstmann's syndrome may additionally present with alexia and mild aphasia. Global agraphia also impairs an individual's orthographic memory, though to a greater extent than deep agraphia. In global agraphia, spelling knowledge is lost to such a degree that the individual can write only a very few meaningful words, or cannot write any words at all. Reading and spoken language are also markedly impaired.
Agraphia
Characteristics
Lexical and structural agraphia are caused by damage to the orthographic memory; these individuals cannot visualize the spelling of a word, though they retain the ability to sound it out. This impaired spelling memory can imply the loss or degradation of the knowledge itself or just an inability to efficiently access it. There is a regularity effect associated with lexical agraphia: individuals are less likely to correctly spell words without regular, predictable spellings. Additionally, spelling ability tends to be less impaired for common words. Individuals also have difficulty with homophones. Language competence in terms of grammar and sentence writing tends to be preserved.
Agraphia
Characteristics
Phonological agraphia is the opposite of lexical agraphia: the ability to sound out words is impaired, but the orthographic memory of words may be intact. It is associated with a lexicality effect, a difference in the ability to spell words versus nonwords; individuals with this form of agraphia depend on their orthographic memory. Additionally, it is often harder for these individuals to access more abstract words without strong semantic representations (i.e., it is more difficult for them to spell prepositions than concrete nouns).
Agraphia
Characteristics
Pure agraphia is the impairment in written language production without any other language or cognitive disorder. Agraphia can occur separately or co-occur with other disorders, and can be caused by damage to the angular gyrus.

Peripheral

Peripheral agraphia occurs when there is damage to the various motor and visualization skills involved in writing.
Agraphia
Characteristics
Apraxic agraphia is the impairment in written language production associated with disruption of the motor system. It results in distorted, slow, effortful, incomplete, and/or imprecise letter formation. Though written letters are often so poorly formed that they are almost illegible, the ability to spell aloud is often retained. This form of agraphia is caused specifically by a loss of specialized motor plans for the formation of letters and not by any dysfunction affecting the writing hand. Apraxic agraphia may present with or without ideomotor apraxia. Paralysis, chorea, Parkinson's disease (micrographia), and dystonia (writer's cramp) are motor disorders commonly associated with agraphia.
Agraphia
Characteristics
Hysterical agraphia is the impairment in written language production caused by a conversion disorder. Reiterative agraphia is found in individuals who repeat letters, words, or phrases in written language production an abnormal number of times. Perseveration, paragraphia, and echographia are examples of reiterative agraphia.
Agraphia
Characteristics
Visuospatial agraphia is the impairment in written language production defined by a tendency to neglect one portion (often an entire side) of the writing page, slanting lines upward or downward, and abnormal spacing between letters, syllables, and words. The orientation and correct sequencing of the writing will also be impaired. Visuospatial agraphia is frequently associated with left hemispatial neglect, difficulty in building or assembling objects, and other spatial difficulties.
Agraphia
Causes
Agraphia has a multitude of causes, including strokes, lesions, traumatic brain injury, and dementia. Twelve regions of the brain are associated with handwriting. The four distinct functional areas are the left superior frontal area (composed of the middle frontal gyrus and the superior frontal sulcus); the left superior parietal area (composed of the inferior parietal lobule, the superior parietal lobule, and the intraparietal sulcus); the primary motor cortex; and the somatosensory cortex. The eight other areas are considered associative areas: the right anterior cerebellum, the left posterior nucleus of the thalamus, the left inferior frontal gyrus, the right posterior cerebellum, the right superior frontal cortex, the right inferior parietal lobule, the left fusiform gyrus, and the left putamen. The specific type of agraphia resulting from brain damage depends on which area of the brain was damaged.
Agraphia
Causes
Phonological agraphia is linked to damage in areas of the brain involved in phonological processing skills (sounding out words), specifically the language areas around the sylvian fissure, such as Broca's area, Wernicke's area, and the supramarginal gyrus. Lexical agraphia is associated with damage to the left angular gyrus and/or posterior temporal cortex; the damage is typically posterior and inferior to the perisylvian language areas. Deep agraphia involves damage to the same areas of the brain as lexical agraphia plus some damage to the perisylvian language areas as well. More extensive left hemisphere damage can lead to global agraphia. Gerstmann's syndrome is caused by a lesion of the dominant (usually the left) parietal lobe, usually an angular gyrus lesion. Apraxic agraphia with ideomotor apraxia is typically caused by damage to the superior parietal lobe (where graphomotor plans are stored) or the premotor cortex (where the plans are converted into motor commands). Additionally, some individuals with cerebellar lesions (more typically associated with non-apraxic motor dysfunction) develop apraxic agraphia. Apraxic agraphia without ideomotor apraxia may be caused by damage to either of the parietal lobes, the dominant frontal lobe, or the dominant thalamus. Visuospatial agraphia typically has a right hemisphere pathology. Damage to the right frontal area of the brain may cause more motor defects, whereas damage to the posterior part of the right hemisphere leads predominantly to spatial defects in writing.
Agraphia
Causes
Alzheimer's disease

Agraphia is often seen in association with Alzheimer's disease (AD). Writing disorders can be an early manifestation of AD. In individuals with AD, the first sign pertaining to writing skills is the selective syntactic simplification of their writing. Individuals write with less description, detail, and complexity, and other markers, such as grammatical errors, may emerge. Different agraphias may develop as AD progresses. In the beginning stages of AD, individuals show signs of allographic agraphia and apraxic agraphia. Allographic agraphia appears in AD individuals as the mixing of lower- and upper-case letters in words; apraxic agraphia appears as poorly constructed or illegible letters and omission or over-repetition of letter strokes. As AD progresses, so does the severity of the agraphia; individuals may develop spatial agraphia, the inability to write in a straight horizontal line, often with unnecessary gaps between letters and words.

A connection between AD and agraphia is the role of memory in normal writing ability. Normal spellers have access to a lexical spelling system that operates on whole words; when functioning properly, it allows the spelling of a complete word to be recalled as a unit, not as individual letters or sounds. This system uses an internal memory store, the graphemic output lexicon, where the spellings of hundreds of words are kept; it is aptly named in relation to the graphemic buffer, the short-term memory loop for many of the functions involved in handwriting. When the lexical system cannot be used, as with unfamiliar words, non-words, or words whose spelling we do not recognize, some people are able to use a phonological process called the sub-lexical spelling system to sound out a word and spell it. In AD individuals, the memory stores used for everyday handwriting are lost as the disease progresses.
Agraphia
Management
Agraphia cannot be directly treated, but individuals can be rehabilitated to regain some of their previous writing abilities. For the management of phonological agraphia, individuals are trained to memorize key words, such as a familiar name or object, that can then help them form the grapheme for that phoneme. Management of allographic agraphia can be as simple as providing alphabet cards so the individual can write legibly by copying the correct letter shapes. There are few rehabilitation methods for apraxic agraphia; if the individual has considerably better hand control and movement with typing than with handwriting, they can use technological devices. Texting and typing do not require the same technical movements that handwriting does; these methods require only the spatial location of the fingers to type. If copying skills are preserved in an individual with apraxic agraphia, repeated copying may help shift from the highly intentional and monitored hand movements indicative of apraxic agraphia to more automated control. Micrographia, a condition in which handwriting becomes illegibly small, can occur with the development of other disorders, such as Parkinson's disease. For some individuals, a simple command to write bigger eliminates the issue.
Agraphia
Management
Anagram and Copy Treatment (ACT) uses the arrangement of component letters of target words and then repeated copying of the target word. This is similar to CART (described below); the main difference is that the target words for ACT are specific to the individual. Target words that are important in the life of the individual are emphasized because people with deep or global agraphia do not typically retain the same memory for words as other people with agraphia may. Writing can be even more important to these people, as it can cue spoken language. ACT helps by facilitating the relearning of a set of personally relevant written words for use in communication.
Agraphia
Management
The Copy and Recall Treatment (CART) method helps to reestablish the ability to spell specific words, learned through repeated copying and recall of target words. CART is more likely to be successful in treating lexical agraphia when a few words are trained to mastery than when a large group of unrelated words is trained. Words chosen can be individualized to the patient, which makes treatment more personalized.
Agraphia
Management
Graphemic buffer treatment uses the training of specific words to improve spelling. Cueing hierarchies and the copy-and-recall method are used to work the words into the short-term memory loop, or graphemic buffer. Segmenting longer words into shorter syllables helps bring words into short-term memory. A problem-solving approach is used as a self-correcting method for phonological errors: the individual sounds out the word and attempts to spell it, typically using an electronic dictionary-type device that indicates correct spelling. This method takes advantage of preserved sound-to-letter correspondences when they are intact, and may improve access to spelling memory, strengthen orthographic representations, or both.
Agraphia
History
In 1553, Thomas Wilson's book Arte of Rhetorique contained the earliest known description of what would now be called acquired agraphia. In the second half of the nineteenth century, the loss of the ability to produce written language received clinical attention, when ideas about localization in the brain influenced studies of the dissociation between written and spoken language, as well as between reading and writing. Paul Broca's work on aphasia during this time inspired researchers across Europe and North America to begin conducting studies on the correlation between lesions and loss of function in various cortical areas.

During the 1850s, clinicians such as Armand Trousseau and John Hughlings Jackson held the prevailing view that the same linguistic deficiency underlay impairments of writing, speech, and reading. In 1856, Louis-Victor Marcé argued that written and spoken language were independent of each other; he observed that in many patients with language disorders, both speech and writing were impaired. The recovery of written and spoken language was not always parallel, suggesting that these two modes of expression were independent. He believed the ability to write involved not only motor control but also the memory of the signs and their meaning.

In 1867, William Ogle, who coined the term agraphia, made several key observations about the patterns of dissociation found in written and spoken language. He demonstrated that some patients with writing impairments were able to copy written letters but struggled to arrange the letters to form words. Ogle knew that aphasia and agraphia often occurred together, but he confirmed that the impairments of the two types of language (spoken and written) can vary in type and severity. Although Ogle's review helped make important advances toward understanding writing disorders, a documented case of pure agraphia was still missing.

In 1884, over two decades after research on acquired language disorders began, Albert Pitres made an important contribution when he published a clinical report of pure agraphia. According to Pitres, Marcé and Ogle were the first to emphasize the dissociation between speech and writing. His work was also strongly influenced by Théodule-Armand Ribot's modular approach to memory. Pitres's clinical case study of 1884 argues for the localization of writing in the brain. Pitres's reading and writing model consisted of three main components: visual (the memory for letters and how letters are put together to form syllables and words), auditory (the memory for the sounds of each letter), and motor (motor-graphic memory of the letters). He proposed the following classifications of agraphia:

Agraphia by word blindness: inability to copy a model, but the individual can write spontaneously and in response to dictation.
Agraphia
History
Agraphia by word deafness: inability to write to dictation, but the individual can copy a model and write spontaneously.
Motor agraphia: no ability to write, but the individual can spell.

Pitres held that in aphasia the intellect is not systematically impaired. Research in the twentieth century focused primarily on aphasiology in patients with lesions from cerebrovascular accidents. From these studies, researchers gained significant insight into the complex cognitive process of producing written language.
Long posterior ciliary arteries
Long posterior ciliary arteries
The long posterior ciliary arteries are arteries of the orbit; there are two on each side of the body. They are branches of the ophthalmic artery. They pass forward within the eye to reach the ciliary body, where they ramify and anastomose with the anterior ciliary arteries, thus forming the major arterial circle of the iris. The long posterior ciliary arteries contribute arterial supply to the choroid, ciliary body, and iris.
Long posterior ciliary arteries
Anatomy
There are two long posterior ciliary arteries on each side. They are branches of the ophthalmic artery.
Long posterior ciliary arteries
Anatomy
Course and relations

The long posterior ciliary arteries first run near the optic nerve before piercing the posterior sclera near the nerve. They pass anteriorly, one along each side of the eyeball, between the sclera and choroid to reach the ciliary muscle, where each divides into two branches that go on to form the major arterial circle of the iris.
Long posterior ciliary arteries
Anatomy
Anastomoses

Non-terminal branches of the long posterior ciliary arteries anastomose with branches of the short posterior ciliary arteries. Upon reaching the ciliary body, the long posterior ciliary arteries ramify superiorly and inferiorly, the branches forming anastomoses with each other and with those of the anterior ciliary arteries to form the major arterial circle of the iris.

Distribution

The long posterior ciliary arteries supply the choroid, ciliary body, and iris. Non-terminal branches are distributed to the ciliary muscle/ciliary body and anterior choroid. Terminal branches are distributed to the iris and ciliary body via the major arterial circle of the iris.
Dynamic/Dialup Users List
Dynamic/Dialup Users List
A Dial-up/Dynamic Users List (DUL) is a type of DNSBL which contains the IP addresses an ISP assigns to its customers on a temporary basis, often using DHCP or similar protocols. Dynamically assigned IP addresses are contrasted with static IP addresses, which do not change once they have been allocated by the service provider.
Dynamic/Dialup Users List
Dynamic/Dialup Users List
DULs serve several purposes. Their primary function is to assist an ISP in enforcing its Acceptable Use Policy, many of which prohibit customers from setting up an email server; customers are expected to use the email facilities of the service provider. This use of a DUL is especially helpful in curtailing abuse when a customer's computer has been converted into a zombie computer and is distributing email without the knowledge of the computer's owner. A second major use involves receivers who do not wish to accept email from computers with dynamically assigned IP addresses; they use DULs to enforce this policy. Receivers adopt such policies because computers at dynamically assigned IP addresses are so often a source of spam.
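The lookup mechanics are those of any DNSBL: the client reverses the IPv4 octets, appends the list's DNS zone, and checks whether the resulting name resolves. A minimal Python sketch follows; the zone dul.example.org is a placeholder, not a real list.

```python
import socket

def listed_on_dul(ip: str, zone: str = "dul.example.org") -> bool:
    """Standard DNSBL check: reverse the IPv4 octets, append the list's
    zone, and look the name up; an A record means the address is listed."""
    reversed_ip = ".".join(reversed(ip.split(".")))
    query = f"{reversed_ip}.{zone}"
    try:
        socket.gethostbyname(query)
        return True          # name resolved: the IP is on the list
    except socket.gaierror:
        return False         # NXDOMAIN: not listed

# A mail server might, for example, defer or reject mail from 192.0.2.1
# when listed_on_dul("192.0.2.1") returns True.
```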
Dynamic/Dialup Users List
Dynamic/Dialup Users List
The first DUL was created by Gordon Fecyk in 1998. It quickly became quite popular because it addressed a specific tactic popular with spammers at the time. The DUL subsequently was absorbed by Mail Abuse Prevention System (MAPS) in 1999. When MAPS was no longer a free service, other DNSBLs such as Dynablock, Not Just Another Bogus List (NJABL), and Spam and Open Relay Blocking System (SORBS) began providing lists of dynamically assigned IP addresses.
High Technology Theft Apprehension and Prosecution Program
High Technology Theft Apprehension and Prosecution Program
The High Technology Theft Apprehension and Prosecution Program (HTTAP Program) is a program within the California Emergency Management Agency (CalEMA) concerned with high technology crime including white-collar crime, cracking, computerized money laundering, theft of services, copyright infringement of software, remarking and counterfeiting of computer hardware and software, and industrial espionage.
High Technology Theft Apprehension and Prosecution Program
High Technology Crime Advisory Committee
The High Technology Crime Advisory Committee was "established for the purpose of formulating a comprehensive written strategy for addressing high technology crime throughout the state" and is composed of the following individuals appointed by the CalEMA Secretary:

a designee of the California District Attorneys Association
a designee of the California State Sheriffs Association
a designee of the California Police Chiefs Association
a designee of the California Attorney General
a designee of the California Highway Patrol
a designee of the High Technology Crime Investigation Association
a designee of the California Emergency Management Agency
a designee of the American Electronics Association to represent California computer system manufacturers
a designee of the American Electronics Association to represent California computer software producers
a designee of CTIA - The Wireless Association
a representative of the California Internet industry
a designee of the Semiconductor Equipment and Materials International
a designee of the California Cable & Telecommunications Association
a designee of the Motion Picture Association of America
a designee of the California Communications Associations (CalCom)
a representative of the California banking industry
a representative of the California Office of Information Security and Privacy Protection
a representative of the California Department of Finance
a representative of the California State Chief Information Officer
a representative of the Recording Industry Association of America
a representative of the Consumers Union
High Technology Theft Apprehension and Prosecution Program
Task Forces
The program is implemented by funding and supporting independent regional task forces:

the Computer and Technology Crime High-Tech Response Team (CATCH) of the San Diego County District Attorney's Office
the Northern California Computer Crimes Task Force (NC3TF) of the Marin County District Attorney's Office
the Rapid Enforcement Allied Computer Team (REACT) of the Santa Clara County District Attorney's Office
the Southern California High Tech Task Force (SCHTTF) of the Los Angeles County Sheriff's Department
the Sacramento Valley Hi-Tech Crimes Task Force (SVHTCTF) of the Sacramento County Sheriff's Department
Amylase
Amylase
An amylase is an enzyme that catalyses the hydrolysis of starch (Latin amylum) into sugars. Amylase is present in the saliva of humans and some other mammals, where it begins the chemical process of digestion. Foods that contain large amounts of starch but little sugar, such as rice and potatoes, may acquire a slightly sweet taste as they are chewed because amylase degrades some of their starch into sugar. The pancreas and salivary gland make amylase (alpha amylase) to hydrolyse dietary starch into disaccharides and trisaccharides which are converted by other enzymes to glucose to supply the body with energy. Plants and some bacteria also produce amylase. Specific amylase proteins are designated by different Greek letters. All amylases are glycoside hydrolases and act on α-1,4-glycosidic bonds.
Amylase
Classification
α-Amylase

The α-amylases (EC 3.2.1.1) (CAS 9014-71-5) (alternative names: 1,4-α-D-glucan glucanohydrolase; glycogenase) are calcium metalloenzymes. By acting at random locations along the starch chain, α-amylase breaks down long-chain saccharides, ultimately yielding either maltotriose and maltose from amylose, or maltose, glucose and "limit dextrin" from amylopectin. They belong to glycoside hydrolase family 13 (https://www.cazypedia.org/index.php/Glycoside_Hydrolase_Family_13). Because it can act anywhere on the substrate, α-amylase tends to be faster-acting than β-amylase. In animals, it is a major digestive enzyme, and its optimum pH is 6.7–7.0. In human physiology, both the salivary and pancreatic amylases are α-amylases. The α-amylase form is also found in plants, fungi (ascomycetes and basidiomycetes) and bacteria (Bacillus).
Amylase
Classification
β-Amylase

Another form of amylase, β-amylase (EC 3.2.1.2) (alternative names: 1,4-α-D-glucan maltohydrolase; glycogenase; saccharogen amylase), is also synthesized by bacteria, fungi, and plants. Working from the non-reducing end, β-amylase catalyzes the hydrolysis of the second α-1,4 glycosidic bond, cleaving off two glucose units (maltose) at a time. During the ripening of fruit, β-amylase breaks starch into maltose, resulting in the sweet flavor of ripe fruit. β-Amylases belong to glycoside hydrolase family 14.
Amylase
Classification
Both α-amylase and β-amylase are present in seeds; β-amylase is present in an inactive form prior to germination, whereas α-amylase and proteases appear once germination has begun. Many microbes also produce amylase to degrade extracellular starches. Animal tissues do not contain β-amylase, although it may be present in microorganisms contained within the digestive tract. The optimum pH for β-amylase is 4.0–5.0.
Amylase
Classification
γ-Amylase

γ-Amylase (EC 3.2.1.3) (alternative names: glucan 1,4-α-glucosidase; amyloglucosidase; exo-1,4-α-glucosidase; glucoamylase; lysosomal α-glucosidase; 1,4-α-D-glucan glucohydrolase) cleaves α(1–6) glycosidic linkages, as well as the last α-1,4 glycosidic bond at the non-reducing end of amylose and amylopectin, yielding glucose. γ-Amylase has the most acidic optimum pH of all amylases because it is most active around pH 3. γ-Amylases belong to a variety of different GH families, such as glycoside hydrolase family 15 in fungi, glycoside hydrolase family 31 of human MGAM, and glycoside hydrolase family 97 of bacterial forms.
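The three cleavage modes can be contrasted with a deliberately crude toy simulation that treats an unbranched starch chain as a count of glucose units. It ignores branch points and real kinetics, and the fraction of bonds cut by the endo-acting enzyme is an arbitrary choice for illustration only.

```python
import random

def beta_amylase(chain: int):
    """Exo-acting: release maltose (2 units) from the non-reducing end;
    in this toy model a 3-unit remainder is left as maltotriose."""
    products = []
    while chain > 3:
        products.append("maltose")
        chain -= 2
    products.append({2: "maltose", 3: "maltotriose"}.get(chain, "glucose"))
    return products

def gamma_amylase(chain: int):
    """Exo-acting: release single glucose units from the non-reducing end."""
    return ["glucose"] * chain

def alpha_amylase(chain: int):
    """Endo-acting: hydrolyse alpha-1,4 bonds at random interior positions,
    yielding a mixture of short fragments (lengths in glucose units)."""
    bonds = list(range(1, chain))
    random.shuffle(bonds)
    cuts = sorted(bonds[: chain // 3])  # cut roughly a third of the bonds
    fragments, last = [], 0
    for cut in cuts + [chain]:
        fragments.append(cut - last)
        last = cut
    return fragments

print(beta_amylase(7))    # ['maltose', 'maltose', 'maltotriose']
print(gamma_amylase(4))   # ['glucose', 'glucose', 'glucose', 'glucose']
print(alpha_amylase(20))  # e.g. [3, 5, 2, ...]
```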
Amylase
Uses
Fermentation

α- and β-amylases are important in brewing beer and liquor made from sugars derived from starch. In fermentation, yeast ingests sugars and excretes ethanol. In beer and some liquors, the sugars present at the beginning of fermentation have been produced by "mashing" grains or other starch sources (such as potatoes). In traditional beer brewing, malted barley is mixed with hot water to create a "mash", which is held at a given temperature to allow the amylases in the malted grain to convert the barley's starch into sugars. Different temperatures optimize the activity of alpha or beta amylase, resulting in different mixtures of fermentable and unfermentable sugars. In selecting mash temperature and grain-to-water ratio, a brewer can change the alcohol content, mouthfeel, aroma, and flavor of the finished beer.
Amylase
Uses
In some historic methods of producing alcoholic beverages, the conversion of starch to sugar starts with the brewer chewing grain to mix it with saliva. The practice continues in home production of some traditional drinks, such as chhaang in the Himalayas, chicha in the Andes and kasiri in Brazil and Suriname.
Amylase
Uses
Flour additive

Amylases are used in breadmaking to break down complex sugars, such as starch (found in flour), into simple sugars. Yeast then feeds on these simple sugars and converts them into the waste products of ethanol and carbon dioxide. This imparts flavour and causes the bread to rise. While amylases are found naturally in yeast cells, it takes time for the yeast to produce enough of these enzymes to break down significant quantities of starch in the bread. This is the reason for long fermented doughs such as sourdough. Modern breadmaking techniques have included amylases (often in the form of malted barley) in bread improver, thereby making the process faster and more practical for commercial use. α-Amylase is often listed as an ingredient in commercially packaged milled flour. Bakers with long exposure to amylase-enriched flour are at risk of developing dermatitis or asthma.
Amylase
Uses
Molecular biology

In molecular biology, the presence of amylase can serve as an additional method of selecting for successful integration of a reporter construct, in addition to antibiotic resistance. As reporter genes are flanked by homologous regions of the structural gene for amylase, successful integration disrupts the amylase gene and prevents starch degradation, which is easily detectable through iodine staining.
Amylase
Uses
Medical uses

Amylase also has medical applications in pancreatic enzyme replacement therapy (PERT). It is one of the components in Sollpura (liprotamase), helping in the breakdown of saccharides into simple sugars.

Other uses

An inhibitor of alpha-amylase, called phaseolamin, has been tested as a potential diet aid. When used as a food additive, amylase has E number E1100, and may be derived from pig pancreas or mold fungi. Bacillary amylase is also used in clothing and dishwasher detergents to dissolve starches from fabrics and dishes. Factory workers who work with amylase for any of the above uses are at increased risk of occupational asthma. Five to nine percent of bakers have a positive skin test, and a fourth to a third of bakers with breathing problems are hypersensitive to amylase.
Amylase
Hyperamylasemia
Blood serum amylase may be measured for purposes of medical diagnosis. A higher than normal concentration may reflect any of several medical conditions, including acute inflammation of the pancreas (which may be measured concurrently with the more specific lipase), perforated peptic ulcer, torsion of an ovarian cyst, strangulation, ileus, mesenteric ischemia, macroamylasemia and mumps. Amylase may be measured in other body fluids, including urine and peritoneal fluid.
Amylase
Hyperamylasemia
A January 2007 study from Washington University in St. Louis suggests that saliva tests of the enzyme could be used to indicate sleep deficits, as the enzyme increases its activity in correlation with the length of time a subject has been deprived of sleep.
Amylase
History
In 1831, Erhard Friedrich Leuchs (1800–1837) described the hydrolysis of starch by saliva, due to the presence of an enzyme in saliva, "ptyalin", an amylase. It was named after the Ancient Greek word for saliva: πτύαλον (ptyalon). The modern history of enzymes began in 1833, when the French chemists Anselme Payen and Jean-François Persoz isolated an amylase complex from germinating barley and named it "diastase". It is from this term that all subsequent enzyme names tend to end in the suffix -ase. In 1862, Alexander Jakulowitsch Danilewsky (1838–1923) separated pancreatic amylase from trypsin.
Amylase
Evolution
Salivary amylase

Saccharides are a food source rich in energy. Large polymers such as starch are partially hydrolyzed in the mouth by the enzyme amylase before being cleaved further into sugars. Many mammals have seen great expansions in the copy number of the amylase gene. These duplications allowed the pancreatic amylase AMY2 to re-target to the salivary glands, allowing animals to detect starch by taste and to digest starch more efficiently and in higher quantities. This has happened independently in mice, rats, dogs, pigs and, most importantly, humans after the agricultural revolution. Following the agricultural revolution 12,000 years ago, the human diet began to shift from hunting and gathering toward domesticated plants and animals, and starch became a staple of the human diet.
Amylase
Evolution
Despite the obvious benefits, early humans did not possess salivary amylase, a trend also seen in evolutionary relatives of humans, such as chimpanzees and bonobos, who possess either one or no copies of the gene responsible for producing salivary amylase. As in other mammals, the pancreatic alpha-amylase AMY2 was duplicated multiple times. One event allowed it to evolve salivary specificity, leading to the production of amylase in the saliva (named AMY1 in humans). The 1p21.1 region of human chromosome 1 contains many copies of these genes, variously named AMY1A, AMY1B, AMY1C, AMY2A, AMY2B, and so on.

However, not all humans possess the same number of copies of the AMY1 gene. Populations known to rely more on saccharides have a higher number of AMY1 copies than human populations that, by comparison, consume little starch. The number of AMY1 gene copies in humans ranges from six copies in agricultural groups such as European-Americans and Japanese (two high-starch populations) to only two to three copies in hunter-gatherer societies such as the Biaka, Datog, and Yakuts. The correlation between starch consumption and the number of AMY1 copies in a population suggests that a higher AMY1 copy number in high-starch populations has been selected for by natural selection and constitutes the favorable phenotype for those individuals. Therefore, it is most likely that possessing more copies of AMY1 in a high-starch population increases fitness and produces healthier, fitter offspring. This is especially apparent when comparing geographically close populations with different eating habits that possess different numbers of copies of the AMY1 gene. Such is the case for some Asian populations that have been shown to possess few AMY1 copies relative to some agricultural populations in Asia. This offers strong evidence that natural selection has acted on this gene, as opposed to the possibility that the gene has spread through genetic drift.

Variation in amylase copy number in dogs mirrors that in human populations, suggesting they acquired the extra copies as they followed humans around. Unlike humans, whose amylase levels depend on starch content in the diet, wild animals eating a broad range of foods tend to have more copies of amylase. This may have to do mainly with detection of starch as opposed to digestion.
Containerization (computing)
Containerization (computing)
In software engineering, containerization is operating system-level virtualization or application-level virtualization over multiple network resources so that software applications can run in isolated user spaces called containers in any cloud or non-cloud environment, regardless of type or vendor.
Containerization (computing)
Usage
Containers are essentially fully functional and portable computing environments (cloud or non-cloud) surrounding the application and keeping it independent of other environments running in parallel. Individually, each container simulates a different software application and runs isolated processes by bundling related configuration files, libraries, and dependencies; collectively, multiple containers share a common operating system (OS) kernel. Containerization technology has been widely adopted by cloud computing platforms such as Amazon Web Services, Microsoft Azure, Google Cloud Platform, and IBM Cloud. Containerization has also been pursued by the U.S. Department of Defense as a way of more rapidly developing and fielding software updates, with first application in its F-22 air superiority fighter.
Containerization (computing)
Types of containers
Containers fall into two broad types: OS containers and application containers.

Security issues

Because of the shared OS kernel, security threats can affect the whole containerized system. In containerized environments, security scanners generally protect the OS but not the application containers, which adds unwanted vulnerability.
Containerization (computing)
Container management, orchestration, clustering
Container orchestration or container management is mostly used in the context of application containers. Implementations providing such orchestration include Kubernetes and Docker Swarm.
Containerization (computing)
Container cluster management
Container clusters need to be managed. This includes functionality to create a cluster, to upgrade or repair the software, to balance the load between existing instances, to scale by starting or stopping instances to adapt to the number of users, to log activities, and to monitor produced logs or the application itself by querying sensors. Open-source implementations of such software include OKD and Rancher. Quite a number of companies provide container cluster management as a managed service, such as Alibaba, Amazon, Google, and Microsoft.
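As an illustration of the scaling part of that job, here is a toy reconciliation rule in Python; the users-per-instance ratio and the floor and ceiling are invented parameters for the sketch, not values from any particular cluster manager:

```python
import math

def desired_instances(active_users: int, users_per_instance: int = 100,
                      minimum: int = 2, maximum: int = 20) -> int:
    """Toy scaling rule: one instance per block of users, clamped to a
    floor (availability) and a ceiling (cost)."""
    wanted = math.ceil(active_users / users_per_instance)
    return max(minimum, min(maximum, wanted))

def reconcile(current: int, users: int) -> str:
    """Compare the running instance count with the target and report
    the action a cluster manager would take."""
    target = desired_instances(users)
    if target > current:
        return f"start {target - current} instance(s)"
    if target < current:
        return f"stop {current - target} instance(s)"
    return "no change"

print(reconcile(current=3, users=750))  # start 5 instance(s)
```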
KID
KID
KID (an acronym standing for Kindle Imagine Develop) was a Japan-based company specializing in porting and developing bishōjo games.
KID
History
KID was founded in 1988, with capital of 160 million yen. In the early 1990s, it served primarily as a contract developer. Notable titles from this era include Burai Fighter, Low G Man, G.I. Joe, Isolated Warrior and Recca. In 1997, it began porting PC games to games consoles. In 1999, it released an original title called Memories Off on PlayStation, which later became its first well-known series. In 2000, it released the original title Never 7: The End of Infinity, the first in the Infinity series. KID also created the popular underground PlayStation game Board Game Top Shop. In 2005, KID became a sponsor of the Japanese drama series Densha Otoko.
KID
History
The company declared bankruptcy in 2006. However, in February 2007 it was announced that KID's intellectual properties had been acquired by the CyberFront Corporation, which would continue all unfinished projects until its closure in December 2013. Kaga Create then bought the CyberFront Corporation, acquiring the rights to KID's works. After Kaga Create closed down, 5pb. bought CyberFront's assets, which included all of KID's works.
KID
Works
Infinity series

Infinity Cure
Never 7: The End of Infinity
Ever 17: The Out of Infinity
Remember 11: The Age of Infinity
12Riven: The Psi-Climinal of Integral

Memories Off series

Memories Off
Memories Off 2nd
You that became a Memory ~Memories Off~
Memories Off ~And then~
Memories Off ~And Then Again~
Memories Off 5: Togireta Film
Memories Off #5 encore
Your Memories Off: Girl's Style

Other

Blocken (Arcade)
Armored Police Metal Jack (Game Boy)
Kingyo Chūihō! 2 Gyopichan o Sagase! (Game Boy)
Battle Grand Prix (SNES)
Jumpin' Derby (Super Famicom)
Super Bowling (SNES)
Super Jinsei Game (series) (2 & 3) (Super Famicom)
Chibi Maruko-chan: Okozukai Daisakusen (Game Boy, 1990)
Chibi Maruko-Chan 2: Deluxe Maruko World (Game Boy, 1991)
Chibi Maruko-chan 3: Mezase! Game Taishou no Maki (Game Boy, 1992)
Chibi Maruko-chan 4: Korega Nihon Dayo Ouji Sama (Game Boy, 1992)
Chibi Maruko-Chan: Maruko Deluxe Gekijou (Game Boy, 1995)
Burai Fighter
Low G Man: The Low Gravity Man
Bananan Ouji no Daibouken
Kick Master
G.I. Joe
G.I. Joe: The Atlantis Factor
Rock 'n' Ball
Sumo Fighter: Tōkaidō Basho
UFO Kamen Yakisoban
Sutobasu Yarō Shō: 3 on 3 Basketball
Mini 4WD Shining Scorpion Let's & Go!!
Pepsiman
Doki! Doki! Yūenchi: Crazy Land Daisakusen (Famicom)
Ai Yori Aoshi (PS2 and PC adaptation)
Ryu-Koku (final game released before the bankruptcy)
Separate Hearts
Ski Air Mix
Recca (Famicom shooter created for the "Summer Carnival '92" gaming tournament)
We Are*
Close to: Inori no Oka
Yume no Tsubasa
Max Warrior: Wakusei Kaigenrei
Kaitou Apricot (PlayStation)
Kiss yori... (Sega Saturn and WonderSwan)
6 Inch my Darling (Sega Saturn)
Dokomademo Aoku... (consumer port of TopCat's Hateshinaku Aoi, Kono Sora no Shita de...)
Kagayaku Kisetsu e (consumer port of Tactics' One: Kagayaku Kisetsu e)
She'sn Screen (consumer port of Ather's Campus ~Sakura no Mau Naka de~)
Emmyrea (consumer port of Penguin Soft's Nemureru Mori no Ohime-sama)
My Merry May
Iris
Flamberge no Seirei (consumer port of Nikukyuu's Mei King)
Prism Heart (Dreamcast)
Oujisama Lv1 (PlayStation)
Boku to Bokura no Natsu (Dreamcast)
Monochrome (PlayStation 2 and PSP)
Hōkago Ren'ai Club – Koi no Etude (Sega Saturn)
Subete ga F ni Naru (PlayStation)
Staircase voltammetry
Staircase voltammetry
Staircase voltammetry is a derivative of linear sweep voltammetry. In linear sweep voltammetry the current at a working electrode is measured while the potential between the working electrode and a reference electrode is swept linearly in time. Oxidation or reduction of species is registered as a peak or trough in the current signal at the potential at which the species begins to be oxidized or reduced. In staircase voltammetry, the potential sweep is a series of stair steps. The current is measured at the end of each potential change, right before the next, so that the contribution to the current signal from the capacitive charging current is reduced.
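A small sketch of the waveform and sampling schedule described above, with illustrative parameter values (5 mV steps held for 100 ms each); the function names and defaults are invented for the example:

```python
import numpy as np

def staircase(e_start, e_end, step_mv, step_time, dt=1e-3):
    """Build a staircase potential waveform and the indices at which the
    current is sampled: the last point of each step, just before the
    next potential change, so capacitive charging has largely decayed."""
    n_steps = int(round(abs(e_end - e_start) / (step_mv * 1e-3)))
    sign = 1 if e_end > e_start else -1
    pts_per_step = int(step_time / dt)
    levels = e_start + sign * step_mv * 1e-3 * np.arange(n_steps + 1)
    potential = np.repeat(levels, pts_per_step)        # hold each level
    sample_idx = np.arange(pts_per_step - 1, len(potential), pts_per_step)
    return potential, sample_idx

# sweep 0 V -> 0.5 V in 5 mV steps, each held 100 ms, digitized at 1 kHz
E, idx = staircase(0.0, 0.5, step_mv=5, step_time=0.1)
print(len(idx), "current samples, one at the end of each step")
```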
Hall-Riggs syndrome
Hall-Riggs syndrome
Hall-Riggs syndrome is a rare genetic disorder that causes neurological issues and birth defects. People with Hall-Riggs syndrome usually have skeletal dysplasia, facial deformities, and intellectual disabilities. Only 8 cases from 2 families worldwide have been described in the medical literature. It is an autosomal recessive genetic disorder, meaning both parents must carry the gene for their offspring to be affected.

Common characteristics of Hall-Riggs syndrome include:

spondyloepimetaphyseal dysplasia
short stature
shortened limbs, fingers, and toes
microcephaly
scoliosis
seizures
widened nasal bridge and mouth
other dysmorphic facial features
intellectual disabilities
recurrent vomiting episodes
Hall-Riggs syndrome
Cases
1975: Hall and Riggs describe 6 of 15 children born to consanguineous parents. The affected children had severe intellectual deficit, microcephaly, facial dysmorphisms consisting of nostril anteversion, a depressed nasal bridge and large lips, and progressive dysplasia of the skeletal system, including scoliosis, flattened femoral heads, shortened femoral necks, shortened proximal segments of the arms, growth delays, and epiphyseal flattening affecting the fingers and ankles. The children did not acquire speech even in adulthood. The parents of the 15 children were healthy, unaffected first cousins.
Hall-Riggs syndrome
Cases
2000: Silengo and Rigardetto describe two Italian siblings of opposite sex born to healthy, unaffected, non-consanguineous parents. The children had the same symptoms as the previously described family, alongside short stature and hypertelorism. Spondylometaphyseal dysplasia and mild epiphyseal changes were confirmed through radiographs. MRI findings included the presence of a cavum vergae and multiple cysts in the septum pellucidum. EEGs came back abnormal; high-resolution karyotypes came back normal. The brother had a history of seizures and psychomotor instability and agitation. Other symptoms included brachydactyly type D, dorsal kyphosis, platyspondyly, enamel hypoplasia, coarse and thick hair, and feeding difficulties.
Hanover bars
Hanover bars
Hanover bars, in one of the PAL television video formats, are an undesirable visual artifact in the reception of a television image. The name refers to the city of Hannover, in which the PAL system developer Telefunken Fernseh und Rundfunk GmbH was located.
Hanover bars
Hanover bars
The PAL system encodes color as YUV. The U (corresponding to B−Y) and V (corresponding to R−Y) signals carry the color information for a picture, with the phase of the V signal reversed (i.e. shifted by 180 degrees) on alternate lines (hence the name PAL, or Phase Alternating Line). This is done so that minor phase errors cancel out in the reception process. However, if gross errors occur, complementary errors from the V signal carry into the U signal, and visible stripes appear. Later PAL systems introduced alterations to ensure that Hanover bars do not occur, introducing a swinging burst for color synchronization. Other PAL systems may handle this problem differently.
Hanover bars
Suppression of Hanover bars
To suppress Hanover bars, PAL color decoders use a delay line that repeats the chroma information from each previous line and blends it with the current line. This causes phase errors to cancel out, at the cost of vertical color resolution, and in early designs, also a loss of color saturation proportional to the phase error.
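The cancellation can be checked numerically. In the toy model below, chroma is a complex number U + jV; a constant demodulation phase error rotates a normal line by +φ and, once the V switch is undone, the alternate line by −φ, so the delay-line average restores the original hue with the magnitude scaled by cos φ (the saturation loss noted above). The function name and signal model are illustrative, not from any decoder implementation.

```python
import cmath, math

def decode_pal(u, v, phase_err_deg):
    """Average the de-switched chroma of two successive PAL lines under a
    constant phase error; the result is the original chroma times cos(phi),
    i.e. correct hue, slightly reduced saturation."""
    phi = math.radians(phase_err_deg)
    c = complex(u, v)                        # transmitted chroma, U + jV
    line_a = c * cmath.exp(1j * phi)         # normal line as received
    # V-switched line: conj(c) transmitted, rotated by the same error,
    # then de-switched (conjugated) in the receiver -> c * e^(-j*phi)
    line_b = (c.conjugate() * cmath.exp(1j * phi)).conjugate()
    return (line_a + line_b) / 2             # = c * cos(phi)

out = decode_pal(0.3, 0.4, phase_err_deg=10)
print(out)  # hue of (0.3, 0.4) preserved, magnitude scaled by cos 10 degrees
```

Without the averaging, alternate lines would carry the +φ and −φ hue shifts directly, which is exactly the striping seen as Hanover bars.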
Mastering (audio)
Mastering (audio)
Mastering, a form of audio post-production, is the process of preparing and transferring recorded audio from a source containing the final mix to a data storage device (the master), the source from which all copies will be produced (via methods such as pressing, duplication or replication). In recent years digital masters have become the norm, although analog masters, such as audio tapes, are still used by the manufacturing industry, particularly by a few engineers who specialize in analog mastering. Mastering requires critical listening; however, software tools exist to facilitate the process. Results depend upon the intent of the engineer, the skills of the engineer, the accuracy of the speaker monitors, and the listening environment. Mastering engineers often apply equalization and dynamic range compression in order to optimize sound translation on all playback systems. It is standard practice to make a copy of a master recording, known as a safety copy, in case the master is lost, damaged or stolen.
Mastering (audio)
History
Pre-1940s

In the earliest days of the recording industry, all phases of the recording and mastering process were entirely achieved by mechanical processes. Performers sang and/or played into a large acoustic horn, and the master recording was created by the direct transfer of acoustic energy from the diaphragm of the recording horn to the mastering lathe, typically located in an adjoining room. The cutting head, driven by the energy transferred from the horn, inscribed a modulated groove into the surface of a rotating cylinder or disc. These masters were usually made from either a soft metal alloy or from wax; this gave rise to the colloquial term waxing, referring to the cutting of a record. After the introduction of the microphone and electronic amplifier in the mid-1920s, the mastering process became electro-mechanical, and electrically driven mastering lathes came into use for cutting master discs (the cylinder format by then having been superseded). Until the introduction of tape recording, master recordings were almost always cut direct-to-disc. Only a small minority of recordings were mastered using previously recorded material sourced from other discs.
Mastering (audio)
History
Emergence of magnetic tape: In the late 1940s, the recording industry was revolutionized by the introduction of magnetic tape. Magnetic tape was invented for recording sound by Fritz Pfleumer in 1928 in Germany, building on Valdemar Poulsen's 1898 invention of magnetic wire recording. The technology did not spread outside Europe until the end of World War II. Magnetic tape recording enabled master discs to be cut separately in time and space from the actual recording process.

Although tape and other technical advances dramatically improved the audio quality of commercial recordings in the post-war years, the basic constraints of the electro-mechanical mastering process remained, and the inherent physical limitations of the main commercial recording media—the 78 rpm disc and later the 7-inch 45 rpm single and 33-1/3 rpm LP record—meant that the audio quality, dynamic range, and running time of master discs were still limited compared to later media such as the compact disc.
Mastering (audio)
History
Electro-mechanical mastering process: From the 1950s until the advent of digital recording in the late 1970s, the mastering process typically went through several stages. Once the studio recording on multi-track tape was complete, a final mix was prepared and dubbed down to the master tape, usually either a single-track mono or two-track stereo tape. Prior to the cutting of the master disc, the master tape was often subjected to further electronic treatment by a specialist mastering engineer.
Mastering (audio)
History
After the advent of tape, engineers found that master recordings, especially for pop releases, could be prepared so that the resulting record would sound better. This was done by making fine adjustments to the amplitude of the signal in different frequency bands (equalization) prior to the cutting of the master disc.
Mastering (audio)
History
In large recording companies such as EMI, the mastering process was usually controlled by specialist staff technicians who were conservative in their work practices, and these companies were often reluctant to change their recording and production processes. EMI, for example, was slow to take up multi-track recording and did not install 8-track recorders in its Abbey Road Studios until the late 1960s, more than a decade after the first commercial 8-track recorders were installed by American independent studios.
Mastering (audio)
History
Digital technology: In the 1990s, electro-mechanical processes were largely superseded by digital technology, with digital recordings stored on hard disk drives or digital tape and mastered to CD. The digital audio workstation (DAW) became common in many mastering facilities, allowing the off-line manipulation of recorded audio via a graphical user interface (GUI). Although many digital processing tools are common during mastering, it is also very common to use analog media and processing equipment for the mastering stage. Just as in other areas of audio, the benefits and drawbacks of digital technology compared to analog technology are still a matter for debate. In the field of audio mastering, however, the debate is usually over the use of digital versus analog signal processing rather than the use of digital technology for the storage of audio.

Digital systems offer higher performance and allow mixing to be performed at lower maximum levels. When mixing to 24-bit with peaks between -3 and -10 dBFS, the mastering engineer has enough headroom to process and produce a final master. Mastering engineers recommend leaving enough headroom on the mix to avoid distortion. Reduction of dynamics by the mix or mastering engineer has contributed to a loudness war in commercial recordings.
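To make the headroom figures concrete, here is a minimal sketch (Python with NumPy; the function name and the example samples are illustrative assumptions) of how a peak level in dBFS is computed and checked against the -10 to -3 dBFS window mentioned above:

```python
import numpy as np

def peak_dbfs(samples: np.ndarray) -> float:
    """Peak level of normalized (-1.0..1.0) samples, in dBFS."""
    peak = np.max(np.abs(samples))
    return float(20.0 * np.log10(peak))  # 0 dBFS == digital full scale

# A hypothetical mix whose loudest sample sits at half of full scale:
mix = np.array([0.1, -0.5, 0.25, -0.05])
level = peak_dbfs(mix)
print(f"peak: {level:.1f} dBFS")        # about -6.0 dBFS

# Check it falls in the commonly recommended -10..-3 dBFS window:
print(-10.0 <= level <= -3.0)
```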
Mastering (audio)
Process
The source material, ideally at the original resolution, is processed using equalization, compression, limiting and other processes. Additional operations, such as editing, specifying the gaps between tracks, adjusting levels, fading in and out, noise reduction and other signal restoration and enhancement processes, can also be applied as part of the mastering stage. The source material is put in the proper order, commonly referred to as assembly (or track) sequencing. These operations prepare the music for either digital or analog (e.g. vinyl) replication.
Mastering (audio)
Process
If the material is destined for vinyl release, additional processing, such as dynamic range reduction or frequency-dependent stereo-to-mono fold-down and equalization, may be applied to compensate for the limitations of that medium. For compact disc release, start-of-track, end-of-track, and index points are defined for playback navigation, along with the International Standard Recording Code (ISRC) and other information necessary to replicate a CD. Vinyl LPs and cassettes have their own pre-duplication requirements for a finished master. The master is then rendered either to a physical medium, such as a CD-R or DVD-R, or to computer files, such as a Disc Description Protocol (DDP) file set or an ISO image. Whichever delivery method is chosen, the replicator factory transfers the audio to a glass master that generates the metal stampers used for replication.
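One common form of frequency-dependent stereo-to-mono fold-down is so-called elliptical EQ, which monos the low end so that out-of-phase bass cannot throw the cutting stylus. Below is a minimal mid/side sketch (Python with NumPy and SciPy; the function name and the 150 Hz corner are illustrative assumptions, not a standard):

```python
import numpy as np
from scipy.signal import butter, lfilter

def fold_bass_to_mono(left, right, sr=44100, corner_hz=150.0):
    """Fold low frequencies to mono by high-pass filtering the side channel.

    Large out-of-phase bass makes a vinyl cutting stylus move vertically,
    so vinyl mastering often monos the low end; the 150 Hz corner here is
    an illustrative choice.
    """
    mid = 0.5 * (left + right)             # mono-compatible content
    side = 0.5 * (left - right)            # stereo difference content
    b, a = butter(2, corner_hz / (sr / 2), btype="highpass")
    side = lfilter(b, a, side)             # strip stereo content below corner
    return mid + side, mid - side          # back to left/right

# Example: fully out-of-phase 60 Hz bass is strongly attenuated.
t = np.arange(4410) / 44100
left = np.sin(2 * np.pi * 60 * t)
right = -left
l2, r2 = fold_bass_to_mono(left, right)
print(np.max(np.abs(l2)))                  # well below the original amplitude
```

Working in mid/side lets the low-frequency stereo content be removed while leaving the mono-compatible part of the signal untouched.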
Mastering (audio)
Process
The process of audio mastering varies depending on the specific needs of the audio to be processed. Mastering engineers need to examine the types of input media, the expectations of the source producer or recipient, and the limitations of the end medium, and process the material accordingly. General rules of thumb can rarely be applied.
Mastering (audio)
Process
Steps of the process typically include the following: transferring the recorded audio tracks into the digital audio workstation (DAW); sequencing the separate songs or tracks as they will appear on the final release; adjusting the length of the silence between songs; processing or sweetening the audio to maximize the sound quality for the intended medium (e.g. applying specific EQ for vinyl); and transferring the audio to the final master format (CD-ROM, half-inch reel tape, PCM 1630 U-matic tape, etc.).

Examples of possible actions taken during mastering: editing minor flaws; applying noise reduction to eliminate clicks, dropouts, hum and hiss; adjusting stereo width; equalizing audio across tracks to optimize frequency distribution; adjusting volume; dynamic range compression or expansion; peak limiting; inserting ISRC codes and CD-Text; arranging tracks in their final sequential order; fading out the ending of each song; and dithering. A toy sketch of two of these steps, peak limiting and dither, follows.
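This is a deliberately simplified sketch (Python with NumPy; the ceiling value and function names are illustrative, and real mastering limiters use look-ahead gain reduction rather than hard clipping) of peak limiting followed by TPDF dither before truncation to 16 bits:

```python
import numpy as np

def limit_peaks(x: np.ndarray, ceiling: float = 0.9) -> np.ndarray:
    """Crude brick-wall limiting by hard clipping at the ceiling.

    Real mastering limiters apply look-ahead gain reduction instead of
    clipping; this is only a toy illustration of the final gain stage.
    """
    return np.clip(x, -ceiling, ceiling)

def tpdf_dither_to_16bit(x: np.ndarray, rng=None) -> np.ndarray:
    """Add triangular-PDF dither of about +-1 LSB, then quantize to 16 bits."""
    rng = rng or np.random.default_rng(0)
    lsb = 1.0 / 32768.0
    noise = (rng.random(x.shape) - rng.random(x.shape)) * lsb  # triangular PDF
    return np.round((x + noise) * 32767.0).astype(np.int16)

master = tpdf_dither_to_16bit(limit_peaks(np.array([0.2, -0.95, 0.5])))
print(master)   # 16-bit integer samples, as on a CD-style master
```

Dither is applied last because it randomizes the quantization error introduced by the reduction to 16 bits, turning correlated distortion into benign low-level noise.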
Vermilion border
Vermilion border
The vermilion border (sometimes spelled vermillion border), also called the margin or zone, is the normally sharp demarcation between the lip and the adjacent normal skin. It represents the change in the epidermis from highly keratinized external skin to less keratinized internal skin. It has no sebaceous glands, sweat glands, or facial hair.

It is a prominent feature of the face, creating a focus for cosmetics (it is where lipstick is sometimes applied), and is also a location for several skin diseases. Its functional properties, however, remain unknown.
Vermilion border
Structure
The lips are composed wholly of soft tissue. The skin of the face is thicker than the skin overlying the lips, where blood vessels are closer to the surface. As a consequence, the margin of the lips shows a transition between the thicker and thinner skin, represented by the vermilion border. It therefore has the appearance of a sharp line between the coloured edge of the lip and the adjoining skin. It has been described both as a pale, rolled white border and as a red line. This fine line of pale skin accentuates the colour difference between the vermilion and normal skin. Along the upper lip, two adjacent elevations of the vermilion border form the Cupid's bow.
Vermilion border
Structure
Microanatomy: The vermilion border represents the change in the epidermis from highly keratinized external skin to less keratinized internal skin. It has no sebaceous glands, sweat glands, or facial hair. There are two reasons the border appears red in some people: the epithelium is thin, so the blood vessels lie closer to the surface; and the epithelium contains eleidin, which is transparent, so the blood vessels near the surface of the papillary layer reveal the color of the underlying red blood cells. At the angles of the mouth there are sebaceous glands without hair follicles, called Fordyce spots.
Vermilion border
Clinical significance
The vermilion border is important in dentistry and oral pathology as a marker to detect disease, such as in actinic cheilitis.

Associated diseases: Perioral dermatitis is a rash, typically around the mouth, that spares the vermilion border. Cheilitis glandularis may present with a burning sensation over the vermilion border; this chronic progressive condition is associated with thinning of the skin of the lips and ulceration. Infections may involve the vermilion border: cold sores are one common infection, and impetigo is another. Skin cancer can also occur at the vermilion border. Fetal alcohol syndrome causes facial abnormalities that include a thin vermilion border with a smooth philtrum.

Cosmetic appearance: Sunlight exposure can blur the junction between the vermilion border and the skin. Applying lip balm and sunscreen moisturizes the border and protects it from sunlight.

Surgery: A vermilionectomy (sometimes spelled vermillionectomy) is the surgical removal of the vermilion border, sometimes performed to treat carcinoma of the lip. Close attention is given when repairing any injury to the vermilion border; even 1 mm of vermilion misalignment can be noticeable.
LSM5
LSM5
U6 snRNA-associated Sm-like protein LSm5 is a protein that in humans is encoded by the LSM5 gene.

Sm-like proteins were identified in a variety of organisms based on sequence homology with the Sm protein family (see SNRPD2; MIM 601061). Sm-like proteins contain the Sm sequence motif, which consists of 2 regions separated by a linker of variable length that folds as a loop. The Sm-like proteins are thought to form a stable heteromer present in tri-snRNP particles, which are important for pre-mRNA splicing. [supplied by OMIM]
Cutaneous B-cell lymphoma
Cutaneous B-cell lymphoma
Cutaneous B-cell lymphomas constitute a group of diseases that occur less commonly than cutaneous T-cell lymphoma and are characterized histologically by B-cells that appear similar to those normally found in the germinal centers of lymph nodes. Conditions included in this group are: primary cutaneous diffuse large B-cell lymphoma, leg type; primary cutaneous follicular lymphoma; primary cutaneous marginal zone lymphoma; intravascular large B-cell lymphoma; plasmacytoma; and plasmacytosis.
Content engineering
Content engineering
Content engineering is an engineering specialty dealing with the complexities of using content in computer-facilitated environments.
Content engineering
Content engineering
Content authoring and production, content management, content modeling, content conversion, and content use and repurposing are all areas involving this practice. It is not a specialty with wide industry recognition and is often performed on an ad hoc basis by members of software development, content production, or marketing staff, but it is beginning to be recognized as a necessary function in any complex content-centric project that involves both content production and software system development, particularly projects built on content management systems (CMS) or digital experience platforms (DXP).
Content engineering
Content engineering
Content engineering tends to bridge the gap between groups involved in the production of content (publishing and editorial staff, marketing, sales, human resources) and more technologically oriented departments such as software development, or IT that put this content to use in web or other software-based environments, and requires an understanding of the issues and processes of both sides. Typically, content engineering involves extensive use of embedded XML technologies, XML being the most widespread language for representing structured content. Content management systems are a key technology often used in the practice of content engineering.
Content engineering
Definition
Content engineering is the practice of organizing the shape and structure of content by deploying content and metadata models in authoring and publishing processes, in a manner that meets the requirements of an organization’s content strategy and its implementation through technology such as CMS, XML, schema markup, artificial intelligence, APIs and others.
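To make "content and metadata models" concrete, here is a minimal sketch of what a single content type in such a model might look like when expressed in code (Python dataclasses; every field name is a hypothetical illustration, not a standard):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ArticleContent:
    """An illustrative structured content type.

    All field names are hypothetical; the pattern they show -- typed,
    reusable chunks plus explicit metadata -- is what a content model
    pins down.
    """
    title: str
    summary: str                          # reusable teaser chunk
    body_sections: List[str]              # ordered modular chunks, not one blob
    tags: List[str] = field(default_factory=list)  # metadata for findability
    audience: str = "general"             # drives personalization
    language: str = "en"                  # drives localization

item = ArticleContent(
    title="Example article",
    summary="A one-sentence teaser that other channels can reuse.",
    body_sections=["First section...", "Second section..."],
    tags=["example", "demo"],
)
```

The point of the pattern is that content is authored as typed, reusable chunks with explicit metadata rather than as a single undifferentiated blob, which is what allows a CMS to reuse it across channels.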
Content engineering
Purpose and goal
In very general terms, content engineering practices aim to maximize the return on investment (ROI) of content through content reuse and by improving the efficiency of content marketing, content operations, and content strategy.
Content engineering
Purpose and goal
Content engineering can help address content challenges that organizations typically face: siloed content supply chains; duplicate content in a myriad of formats; inefficient content authoring workflows; chunky, unstructured content; outdated technology, or technology in place that does not match needs; inability to reuse content across channels (multi-channel content); metadata and schema that are not used, and a lack of standards for metadata; poor findability of content for internal and external use; poor SEO performance; and inability to implement personalization.
Content engineering
The role of a content engineer
Content engineers bridge the divide between content strategists and producers on one side and the developers and content managers who publish and distribute content on the other. Rather than simply wedging themselves between these players, content engineers help define and facilitate the content structure throughout the entire content strategy, production, and distribution cycle. With equal parts business and technology savvy, the content engineer does not see content as a static, finished piece; rather, he or she looks at the value of the content and how it can best be adapted and personalized to serve customers and emerging content platforms, technologies, and opportunities.
Content engineering
The role of a content engineer
Create customer experience: Content marketing suffers from two fundamental limitations that constrain the true power and potential that a great content marketing plan can bring to a business' bottom line. The first is content relevance: how to make content more relevant and personalized to its audiences. The marketer and content strategist direct the customer experience itself, and the content engineer makes it happen with content structure, schema, metadata, microdata, taxonomy, and CMS topology.
Content engineering
The role of a content engineer
The second is content agility: marketers who are burdened with one-size-fits-all content remain stuck managing their content rather than their customers' experience. Content engineers give marketers the "super powers" to move content-powered experiences across interfaces and personalization variants.

Break down barriers: Empower content strategists: content engineers work with content strategists by helping them treat content not as a fixed message but as a modular construct that can be channeled and manipulated. Enable content producers: a content engineer works with a content producer by helping to find new sources of content and ways the content can be combined and presented. Guide and free developers: the content engineer helps translate marketing strategy into clear technical needs and functions that developers can build into content management systems. Enhance content management: develop content structures that make it easier for content writers and content managers to author through a single, very usable interface, even for complex content types that might contain dozens of elements. Engineer content for success: content engineers help all members of a marketing team work more smoothly, with the support and structures needed to get the most out of the content they produce.