<!-- Source file: src/docs/reference/view-page.md (repo: ehershey/httptoolkit.tech, license: MIT) -->

---
name: 'The View Page'
title: 'The View Page'
order: 1
---

On the View page you can examine collected HTTP and HTTPS traffic up close.
It shows the traffic collected from HTTP clients that you've intercepted, and/or past HTTP traffic that you import from a HAR file.
The page is in two parts:
* On the left, a list of HTTP events, with some controls to explore those.
* On the right, a pane that shows the details of the selected event, if any.
You can drag the central divider to resize either half as you'd like.
## The exchange list

The exchange list shows a list of the HTTP events, in the order they happened. For HTTP exchanges, that's the order the requests were initially sent.
There's a few possible types of event:
* An HTTP exchange. This consists of a request, which might still be in progress, might be at a breakpoint, might have completed with a response, or might have failed permanently.
* A failed TLS connection. This is an HTTPS request that has failed to set up the initial connection, perhaps due to the HTTP Toolkit certificate being rejected, or a connection being cancelled.
### HTTP Exchanges
Each exchange row shows:
* A colour marker, giving an at-a-glance summary of each request:
* <span style='color: #000; font-weight: bold'>Black: Incomplete/failed</span>
* <span style='color: #ce3939; font-weight: bold'>Red: Mutative</span> (e.g. POST/DELETE)
* <span style='color: #4caf7d; font-weight: bold'>Green: Image</span>
* <span style='color: #ff8ce8; font-weight: bold'>Orange: JavaScript</span>
* <span style='color: #e9f05b; font-weight: bold'>Yellow: CSS</span>
* <span style='color: #2fb4e0; font-weight: bold'>Light blue: HTML</span>
* <span style='color: #5a80cc; font-weight: bold'>Dark blue: Fonts</span>
* <span style='color: #6e40aa; font-weight: bold'>Purple: Data</span> (XML, protobuf, form data or JSON)
* <span style='color: #888; font-weight: bold'>Grey: Unknown</span>
(Don't worry, you don't need to memorize these! All details of each exchange are available separately; these just help with quickly skimming the list)
* The HTTP method used, with different colours for each verb.
* Either:
* The response status code, also colourized.
* 🚫 if the response did not complete.
* ⚠ if the exchange is currently paused at a breakpoint.
* A spinner, if the response is still in progress.
* The source of the request, as an icon (e.g. Firefox, Chrome, or a specific language) when it's known. You can hover over this icon for more details.
* The URL of the request, separated into the host and the path & query. Long URLs may be truncated here, but the full URL is available as a tooltip if you mouseover the row, or in the details pane if you select the row.
To see more details on any row, click it to select it, and it'll be shown in the details pane on the right. More details on that below.
### The list controls

At the bottom of the list, there's a few useful controls:
* A search bar. This filters the visible events. You can focus it at any time with Ctrl+F (or Cmd+F on Mac).
* You can type text here and it'll be matched directly against the text of every part of every exchange, except the body itself (for performance reasons).
* You can also enter a structured filter, e.g. `status=200`
* When you enter text that matches a structured filter, an autosuggest for the filters you can create will appear. Select one with up/down and press enter to create it. You can also press up/down with no text entered to browse the full list of filters available.
* The created filters are shown as floating tags in the filter input box.
* You can precisely filter for every aspect of a request, from path to the presence of individual headers to body size, including various operators for each filter, such as "starts with" or "greater than".
* As you enter text, an explanation of the filter that will be created is shown below the selected filter.
* [HTTP Toolkit Pro](/get-pro/) users can save sets of filters with their own custom name. To do this, create your filters, then type a string that doesn't match an existing filter, and select the 'Save filters' suggestion that's shown.
* The number of events shown, if your current filter is hiding any events.
* The total number of events in the list.
* A button to pause/resume interception.
* When interception is paused, traffic will continue to pass through the proxy as normal, but will not be collected or shown.
* When interception is resumed, traffic is once again collected.
* A button to export the currently visible exchanges as a [HAR file](https://en.wikipedia.org/wiki/HAR_(file_format)), usable with many other tools (_requires [HTTP Toolkit Pro](/get-pro/)_).
* A button to import a HAR file. These imported exchanges are appended to the list of events, and won't remove any events already present (_requires [HTTP Toolkit Pro](/get-pro/)_).
* A button to clear the event list, deleting all events from memory.
## The event details pane
The right pane is made up of a series of 'cards': collapsible sections, which provide different views of the details of the selected event. You can see an example of each card below.
For an HTTP exchange, there's a few cards that will be shown:
* The request details.
* The request body (if there is one).
* If a response has been received:
* The response details.
* The response body (if there is one).
* The performance details (_requires [HTTP Toolkit Pro](/get-pro/)_).
* The export options (_requires [HTTP Toolkit Pro](/get-pro/)_).
* Breakpoint cards, for breakpointed requests and responses.
It's also possible to expand the request and response body cards, so they fill the entire pane. In this case, only that card will be shown.
### The Request Card
For [HTTP Toolkit Pro](/get-pro/) users, the request card may show metadata & validation, for requests to recognized APIs. This is powered by the OpenAPI specifications of the [OpenAPI Directory](https://github.com/APIs-guru/openapi-directory). As an example:

Here, a request to the GitHub API is recognized as such, so includes:
- The service the request is going to, with links to its docs
- The name of the operation itself
- A list of the parameters being specified, each including:
- The name & value of the parameter
- Documentation for the specific parameter, with a description and the valid possible values
- Validation & warnings, for invalid or missing values, and deprecated parameters or operations
In addition to API metadata for Pro users, for all users this card shows the core request data: the HTTP method, URL & headers sent.
All standard methods and headers will be recognized, and clicking the plus symbol next to them will show a quick explanation of what they mean, with links to the Mozilla Developer Network's full documentation for that method or header.
The URL can also be expanded, to show it broken down for readability into the protocol, host, path, and each of the query string parameters included.
### Body Cards
Both requests and responses can have bodies, and when present, the corresponding request or response body card will be shown. These both work exactly the same, and both can appear at the same time, e.g. for a POST request with a 200 response.
This card consists of a viewer for the body content, plus a few controls. Let's start with the controls:
* An expand/shrink button: this expands the body content to fill the entire right pane or shrinks it again, back to normal, if it is currently expanded.
* A save button: this saves the shown decoded body content to a file, so it can be edited or opened directly with other tools (_requires [HTTP Toolkit Pro](/get-pro/)_)
* The number of bytes: this shows the number of bytes in the content itself. This is the content after decoding bodies, which may have been gzipped for example, but ignoring any content autoformatting in the editor shown.
* A dropdown to select the formatting for the body viewer. This is filtered to provide only meaningful options for each content type. There's a few options:
* Image - an image, shown as an actual image.
* Hex - the raw bytes of the body, in formatted hexadecimal.
* Text - the content of the body when decoded as UTF8 text.
* XML - the content of the body decoded as UTF8 text, autoformatted as XML, with syntax highlighting.
* JSON - the content of the body decoded as UTF8 text, autoformatted as JSON, with syntax highlighting.
* JavaScript - the content of the body decoded as UTF8 text, autoformatted as JavaScript, with syntax highlighting and intellisense.
* HTML - the content of the body decoded as UTF8 text, autoformatted as HTML, with syntax highlighting.
* CSS - the content of the body decoded as UTF8 text, autoformatted as CSS, with syntax highlighting.
* Markdown - the content of the body decoded as UTF8 text, with markdown syntax highlighting.
* YAML - the content of the body decoded as UTF8 text, with YAML syntax highlighting.
The viewer itself appears below, showing the content formatted according to the formatting dropdown.
The viewer is powered by a read-only [Monaco editor](https://github.com/microsoft/monaco-editor) (the internals of Visual Studio Code), and includes many of the features of Visual Studio Code. For example:
* Syntax highlighting, with errors for invalid syntax.
* Type inference & intellisense, for JavaScript.
* Collapsible blocks (use the +/- in the left margin).
* Plain text and regex search (press Ctrl/Cmd + F).
* Match highlighting: select any text to see all other occurrences highlighted in the text and the scrollbar.
* Inline color tags, for colors defined in CSS content.
### The Response Card

The response card shows the HTTP status code, status message, and response headers. For [HTTP Toolkit Pro](/get-pro/) users, the response status itself may come with further explanations powered by any detected OpenAPI specifications from the [OpenAPI Directory](https://github.com/APIs-guru/openapi-directory).
All standard status codes and headers on this card are automatically recognized, and clicking the plus symbol next to them will show a quick explanation of what they mean, and links to the Mozilla Developer Network's full documentation for that status or header.
### The Performance Card
_Only available with [HTTP Toolkit Pro](/get-pro/)_

The performance card shows the performance details for the given response. These include:
* The exact time taken for the full request and response
* For both the request & response bodies:
* The compression algorithm used, and its relative (percentage reduction) & absolute (bytes saved) performance
* The potential improvements with alternative compression algorithms:
* When improvements are available, the algorithm is shown as green, and a suggestion appears alongside. This suggestion will include a note if your current HTTP client does not advertise support for the best available compression algorithm
* A warning will be shown if the current compression algorithm is _increasing_ the size of the body
* For each algorithm, the specific compressed & uncompressed sizes can be seen by hovering over its result
* Exact results might vary slightly from those shown, depending on the settings used, but they should be representative
* The caching behaviour of the given request with explanations:
* This is separated into a few sections:
* Whether the response is cacheable at all
* Which future requests can reuse this response from the cache
* Which types of cache are allowed to store the response
* When the response will expire, and how and when it should be revalidated
* Each section can be expanded to see an explanation of the given behaviour
* Sections with potential improvements or issues will be shown with a lightbulb or warning icon, respectively
* The explanations here indicate the behaviour allowed by the caching standards. Individual cache behaviour will depend on these, but can vary further. For example, caches are not obliged to cache cacheable content, but should never cache non-cacheable content.
### The Export Card
_Only available with [HTTP Toolkit Pro](/get-pro/)_

The export card allows you to export a given request & response as a HAR file, or to export the request as ready-to-use code for a wide selection of languages.
You can export the exchange as a HAR file by clicking the 'Save as HAR' button in the top left. This exports this individual exchange into a standalone file. Note that it's also possible to export the full set of currently filtered exchanges from the exchange list on the left, using the export button in the list footer.
To export as a code snippet, first pick your language or tool of choice from the list. The list includes:
* C: Libcurl
* Clojure: clj-http
* C#: RestSharp
* Go: NewRequest
* Raw HTTP/1.1
* Java:
  * OkHttp
  * Unirest
* JavaScript:
  * jQuery
  * fetch
  * XMLHttpRequest
* Node.js:
  * HTTP
  * Request
  * Unirest
* Objective-C: NSURLSession
* OCaml: CoHTTP
* PHP:
  * cURL
  * HTTP v1
  * HTTP v2
* Powershell:
  * Invoke-WebRequest
  * Invoke-RestMethod
* Python:
  * http.client
  * Requests
* Ruby: net::http
* Shell:
  * cURL
  * HTTPie
  * Wget
* Swift: NSURLSession
A short description of the client and a link to its full docs will be shown in the body of the card, followed by the full code snippet (shown in a Monaco editor, with all the same features of the body editors described above).
You can copy the entire snippet using the 'Copy snippet' button, or copy individual segments of the snippet by hand.
Each snippet is intended to be representative ready-to-use code for the given target, but you should pay attention to application specific details, and be aware that you may need to customize security settings (e.g. certificate trust & validation) and add appropriate error handling for your specific environment.
### Breakpoint Cards
When viewing a request or response that is paused at a breakpoint, some other cards may not appear, or may be replaced by editable cards. The rules that trigger breakpoints can be configured on [the Mock page](/docs/reference/mock-page/).
Breakpointed requests can be edited before they are sent upstream to their target server, or you can respond to them directly without forwarding the request. Breakpointed responses can be edited before they are returned to the initiating HTTP client.
A breakpointed request looks like this:

From here, you can:
* Change the request method.
* Edit the request URL:
* You can change the request protocol, path or query.
* Alternatively, edit the hostname to redirect it to a different server entirely.
* Editing the hostname will automatically update the Host header to match.
* Edit the request headers.
* Edit the empty name/value row at the bottom to add new headers.
* Click the delete button shown on any existing row to remove existing headers.
* Edit the request body.
* You can select the syntax highlighting to use with the dropdown on the right.
* If there is a content-length header set that matches the body length, editing the body will automatically update the header to match.
* The body is shown in its decoded form. If a content-encoding header is set, the body will be re-encoded before the request is forwarded.
Press 'Resume' in the breakpoint header to forward your request on to the target server.
You can also press 'Response directly' to skip forwarding the request entirely, and instead manually provide your own response. This triggers an immediate response breakpoint, for an empty response.
A breakpointed response looks like this:

From here, you can:
* Edit the status code or status message that will be returned.
* Edit the response headers.
* Edit the empty name/value row at the bottom to add new headers.
* Click the delete button shown on any existing row to remove existing headers.
* Edit the response body.
* You can select the syntax highlighting to use with the dropdown on the right.
* If there is a content-length header set that matches the body length, editing the body will automatically update the header to match.
* The body is shown in its decoded form. If a content-encoding header is set, the body will be re-encoded before the request is forwarded.
When you're done, press resume to let the response continue back to the HTTP client.
There are a couple of other things about breakpointing worth noting:
* When hitting a response breakpoint, request data has already been received, and can be seen in the cards above the editable response.
* Whilst waiting at a breakpoint, your client is not receiving content, and may time out. If this happens, your breakpoint will close suddenly, and will appear like any other timed out or aborted request. To avoid this, increase the timeout in your client, or edit quickly!
* After your breakpointing is all done, the exchange content shown is always the data from the perspective of the client. That means:
* The edited request data will not be shown - you'll see the data that the client actually sent, not the data sent to the server.
* The original response data will not be shown - you'll see the data that the client actually received, not the data received from the server.
**Any questions? [Get in touch](/contact/)**
<!-- Source file: README.md (repo: HayatoDoi/com, license: MIT) -->

# com
interactive com port tool
<!-- Source file: _posts/writing/2015-03-16-knife-block-v0.2.0.md (repo: solarce/solarce.org, license: MIT) -->

---
layout: post
title: "knife-block v0.2.0 released"
excerpt: ""
categories: writing
tags: [chef, knife, knife-block]
comments: true
share: true
---
# knife-block v0.2.1 released.
_Important Update:_ v0.2.0 has been yanked because it wasn't actually working on Chef12/ChefDK.
[v0.2.1](https://github.com/knife-block/knife-block/releases/tag/v0.2.1) is out and on [rubygems.org](https://rubygems.org/gems/knife-block/versions/0.2.1)
This release includes:
- A fix so knife-block now works on Chef12
- Some updates to the documentation
- Some travis build improvements, including
- Multiple 2.x Ruby builds
- Building on 2.1 with chefdk
- A shiny logo
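
If you want to try the release, the basic workflow is unchanged. Here's a quick sketch (the `production` config name is just an example; use whichever `~/.chef/knife-*.rb` configurations you have, and install via `chef gem install` instead if you're on ChefDK):

```bash
# Install or upgrade the plugin
gem install knife-block

# List the knife configurations that knife-block knows about
knife block list

# Switch the active knife.rb to a different environment
knife block use production
```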
# The Future
Some future improvements I'm hoping to make in the coming months are:
- Get the code base happy with Rubocop
- Get CI builds going on OSX (w/ rvm and chefdk) and Windows builds
- Start working on any Windows related bugs that crop up
- Better integration with Berkshelf config stuff
<!-- Source file: generatedREADME.md (repo: DexterLGriffith/Professional-README-Generator, license: BSD-2-Clause) -->
# Professional README Generator
## Table of Contents
### [Description](#Description)
### [Tasks Completed](#Tasks Completed)
### [Installation](#Installation)
### [Links](#Links)
### [Credits](#Credits)
### [References](#References)
### [License](#License)
## Description
Build a professional README generator from scratch using Node.js and the skills learned from the previous weeks. I will build a Node.js application which prompts the user for information in the terminal (run via `node index.js`) and outputs that information into a professionally designed README.md file.
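As a rough sketch of that flow (the `inquirer` dependency and the exact prompts are assumptions for illustration, not a copy of this repository's index.js):

```js
// Minimal sketch of the prompt-then-generate flow (assumed structure, not the actual index.js)
const inquirer = require('inquirer');
const fs = require('fs');

const questions = [
  { type: 'input', name: 'title', message: 'Project title?' },
  { type: 'input', name: 'description', message: 'Project description?' },
  { type: 'list', name: 'license', message: 'License?', choices: ['MIT', 'BSD-2-Clause', 'GPL-3.0'] },
];

inquirer.prompt(questions).then((answers) => {
  // Build the README body from the answers and write it to disk
  const readme = `# ${answers.title}\n\n## Description\n${answers.description}\n\n## License\n${answers.license}\n`;
  fs.writeFileSync('generatedREADME.md', readme);
  console.log('generatedREADME.md created!');
});
```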
## Tasks Completed
1. Created the basic project files from GitHub with a README.md, a license, and an initial index.js file.
2. Created the initial shell for the project in index.js, and initialized and ran npm so the project can run in the terminal.
3. Created a .gitignore file, and wrote the question sets that prompt the user in the terminal when running the program. Finally, added a section to index.js that generates the README.md file from the answers the user gives at the prompts.
## Installation
Download the files from the repo and open a terminal in the project folder. Initialize npm to install the dependencies, then run `node index.js` and answer the questions.
## Links
https://github.com/DexterLGriffith/Professional-README-Generator
## Credits
Dexter Griffith
## References
1. https://smu.bootcampcontent.com/SMU-Coding-Bootcamp/smu-dal-fsf-pt-07-2021-u-c/-/tree/master/09-NodeJS/01-Activities/28-Stu_Mini-project
2. https://smu.bootcampcontent.com/SMU-Coding-Bootcamp/smu-dal-fsf-pt-07-2021-u-c/-/tree/master/09-NodeJS/01-Activities/18-Stu_Package-npm
## License
BSD 2-clause license
<!-- Source file: content/implement/error_handling.it.md (repo: JohnRoesler/goa.design, license: MIT) -->

+++
date = "2020-11-21:01:06-05:00"
title = "Gestione degli Errori"
weight = 4
[menu.main]
name = "Gestione degli Errori"
parent = "implement"
+++
## Overview

Goa makes it possible to precisely describe the potential errors returned by the
various service methods. This defines a clear contract between server and clients,
which is reflected in both the generated code and the generated documentation.
Goa takes a "batteries included" approach where errors can be defined
with minimal information, as little as just a name.
However, the DSL also makes it possible to define new error types whenever
the Goa defaults are not sufficient.

## Defining Errors

Errors are defined using the
[Error](https://pkg.go.dev/goa.design/goa/v3/dsl#Error) function:

```go
var _ = Service("calc", func() {
    Error("invalid_arguments")
})
```
Errors can also be defined with a scope specific to a single
method:

```go
var _ = Service("calc", func() {
    Method("divide", func() {
        Payload(func() {
            Field(1, "dividend", Int)
            Field(2, "divisor", Int)
            Required("dividend", "divisor")
        })
        Result(func() {
            Field(1, "quotient", Int)
            Field(2, "reminder", Int)
            Required("quotient", "reminder")
        })
        Error("div_by_zero") // Error specific to this method
    })
})
```
Both the `invalid_arguments` and the `div_by_zero` errors in this example use the
default error type,
[ErrorResult](https://pkg.go.dev/goa.design/goa/v3/expr#ErrorResult).
Custom types can also be used to define errors, as follows:

```go
var DivByZero = Type("DivByZero", func() {
    Description("DivByZero is the error returned when using 0 as a divisor.")
    Field(1, "message", String, "Dividing by 0 yields infinity.")
    Required("message")
})

var _ = Service("calc", func() {
    Method("divide", func() {
        Payload(func() {
            Field(1, "dividend", Int)
            Field(2, "divisor", Int)
            Required("dividend", "divisor")
        })
        Result(func() {
            Field(1, "quotient", Int)
            Field(2, "reminder", Int)
            Required("quotient", "reminder")
        })
        Error("div_by_zero", DivByZero, "Division by zero") // Use the DivByZero error type
    })
})
```
If a type is used to define several different errors it must define an attribute that
contains the name of the error, so that the generated code can infer
the corresponding design definition. The attribute must be identified using the
`struct:error:name`
[metadata](https://pkg.go.dev/goa.design/goa/v3/dsl#Meta), for example:

```go
var DivByZero = Type("DivByZero", func() {
    Description("DivByZero is the error returned when using 0 as a divisor.")
    Field(1, "message", String, "Dividing by 0 yields infinity.")
    Field(2, "name", String, "Name of the error", func() {
        // Tells Goa to use the `name` field to identify the error
        // definition.
        Meta("struct:error:name")
    })
    Required("message", "name")
})
```

The field must be initialized by the server code that returns the error.
The generated code uses it to match the error against its design definition
and return the corresponding status code.
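
As a minimal illustration, the server-side method could initialize that field like this (a sketch only; the field and variable names follow the example design above):

```go
func (s *calcsrvc) Divide(ctx context.Context, p *calc.DividePayload) (*calc.DivideResult, error) {
    if p.Divisor == 0 {
        // Set "name" to the error name used in the design so the generated
        // transport code can map it to the right HTTP/gRPC status code.
        return nil, &calc.DivByZero{
            Name:    "div_by_zero",
            Message: "cannot divide by zero",
        }
    }
    return &calc.DivideResult{Quotient: p.Dividend / p.Divisor, Reminder: p.Dividend % p.Divisor}, nil
}
```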
### Temporary Errors, Faults and Timeouts

The `Error` function accepts an optional DSL as its last argument which makes it
possible to specify additional properties on the error. The `Error`
DSL accepts 3 child functions:

* `Timeout()` identifies the error as being caused by a server timeout.
* `Fault()` identifies the error as a server-side problem (e.g. a bug, a panic, etc.).
* `Temporary()` identifies the error as temporary (and therefore the corresponding request can be retried).

The following definition is appropriate for defining a timeout error:

```go
Error("Timeout", ErrorResult, "Request timeout exceeded", func() {
    Timeout()
})
```

The `Timeout`, `Fault` and `Temporary` functions instruct the Goa generator to
initialize the fields with the same names in the
`ErrorResponse`. They have no effect (other than documentation) when used
on a custom error.
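
On the client side these properties surface as boolean fields on the returned error, which makes simple retry logic possible. For example (an illustrative sketch, not generated code):

```go
res, err := c.Divide(ctx, payload)
if serr, ok := err.(*goa.ServiceError); ok && serr.Temporary {
    // The design marked this error as temporary, so retrying once is reasonable.
    res, err = c.Divide(ctx, payload)
}
```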
### Mapping Errors to Transport Status Codes

The [Response](https://pkg.go.dev/goa.design/goa/v3/dsl#Response) function
defines the HTTP or gRPC status codes used to describe the error:

```go
var _ = Service("calc", func() {
    Method("divide", func() {
        Payload(func() {
            Field(1, "dividend", Int)
            Field(2, "divisor", Int)
            Required("dividend", "divisor")
        })
        Result(func() {
            Field(1, "quotient", Int)
            Field(2, "reminder", Int)
            Required("quotient", "reminder")
        })
        Error("div_by_zero")
        HTTP(func() {
            POST("/")
            Response("div_by_zero", StatusBadRequest, func() {
                // Use HTTP status code 400 (BadRequest) for "div_by_zero" errors
                Description("Response used for division by zero errors")
            })
        })
        GRPC(func() {
            Response("div_by_zero", CodeInvalidArgument, func() {
                // Use gRPC status code 3 (InvalidArgument) for "div_by_zero" errors
                Description("Response used for division by zero errors")
            })
        })
    })
})
```
## Producing Errors

### Using the Default Error Type

With the design defined above, Goa generates a `MakeDivByZero` helper function
that the server code can use to return errors. The function is generated
in the service-specific package (under `gen/calc` in this example).
It accepts a Go error as a parameter:

```go
// Code generated by goa v....
// ...
package calc

// ...

// MakeDivByZero builds a goa.ServiceError from an error.
func MakeDivByZero(err error) *goa.ServiceError {
    return &goa.ServiceError{
        Name:    "div_by_zero",
        ID:      goa.NewErrorID(),
        Message: err.Error(),
    }
}

// ...
```
This function can be used as follows to implement the `Divide` method:

```go
func (s *calcsrvc) Divide(ctx context.Context, p *calc.DividePayload) (res *calc.DivideResult, err error) {
    if p.Divisor == 0 {
        return nil, calc.MakeDivByZero(fmt.Errorf("cannot divide by zero"))
    }
    // ...
}
```

The generated `MakeXXX` functions create instances of the
[ServiceError](https://pkg.go.dev/goa.design/goa/v3/pkg#ServiceError) type.
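For reference, the fields of that type most relevant here look roughly as follows (an abridged sketch; see the linked godoc for the authoritative definition):

```go
type ServiceError struct {
    Name      string // Name of the error as defined in the design
    ID        string // Unique identifier for this occurrence of the error
    Message   string // Human-readable error message
    Timeout   bool   // True if the error is due to a timeout
    Temporary bool   // True if the request may be retried
    Fault     bool   // True if the error is a server-side fault
}
```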
### Using Custom Error Types

When custom types are used to define errors, Goa does not generate the
helper functions because the generator has no way to map Go errors to
the corresponding generated types. In this case the method must instantiate the error
directly.

Building on the previous example and using the `DivByZero` type:

```go
Error("div_by_zero", DivByZero, "Division by zero") // Use DivByZero as the error definition
```

To return the error, the method implementation must return an instance of
the `DivByZero` struct, also present in the service package (`calc`
in this example):

```go
func (s *calcsrvc) Divide(ctx context.Context, p *calc.DividePayload) (res *calc.DivideResult, err error) {
    if p.Divisor == 0 {
        return nil, &calc.DivByZero{Message: "cannot divide by zero"}
    }
    // ...
}
```
## Consuming Errors

The error values returned to the client are built from the same structs used by the
server to return the errors.

### With the Default Error Type

If the error uses the default definition then client-side errors are instances
of [ServiceError](https://pkg.go.dev/goa.design/goa/v3/pkg#ServiceError):

```go
// ... initialize endpoint, ctx, payload
c := calc.NewClient(endpoint)
res, err := c.Divide(ctx, payload)
if err != nil {
    if dbz, ok := err.(*goa.ServiceError); ok {
        // use dbz to handle the error
    }
}
// ...
```

### With Custom Error Types

If the error has a custom type definition then the client-side
error is the same custom struct:

```go
// ... initialize endpoint, ctx, payload
c := calc.NewClient(endpoint)
res, err := c.Divide(ctx, payload)
if err != nil {
    if dbz, ok := err.(*calc.DivByZero); ok {
        // use dbz to handle the error
    }
}
// ...
```
## Validation Errors

Validation errors are also
[ServiceError](https://pkg.go.dev/goa.design/goa/v3/pkg#ServiceError) structs.
The `name` field of the struct makes it possible for client code to
differentiate between the different types of errors.

Here is an example of how to do this, which assumes the design uses the default
error type to define the `div_by_zero` error:

```go
// ... initialize endpoint, ctx, payload
c := calc.NewClient(endpoint)
res, err := c.Divide(ctx, payload)
if err != nil {
    if serr, ok := err.(*goa.ServiceError); ok {
        switch serr.Name {
        case "missing_field":
            // Handle the missing operand error here
        case "div_by_zero":
            // Handle the division by zero error here
        default:
            // Handle other possible errors here
        }
    }
}
// ...
```

The validation errors are all defined in the file
[error.go](https://github.com/goadesign/goa/blob/v3/pkg/error.go), and they are:

* `missing_payload`: produced when the request is missing a required payload.
* `decode_payload`: produced when the request body cannot be decoded successfully.
* `invalid_field_type`: produced when a field does not have the type defined in the corresponding design.
* `missing_field`: produced when the payload is missing a required field.
* `invalid_enum_value`: produced when the value of a payload field does not match the enum defined in the design (Enum).
* `invalid_format`: produced when a payload field does not pass the format checks defined in the design (Format).
* `invalid_pattern`: produced when the value of a payload field does not match the regexp pattern specified in the design (Pattern).
* `invalid_range`: produced when the value of a payload field is not within the range specified in the design (e.g. Minimum, Maximum).
* `invalid_length`: produced when the value of a field does not satisfy the length requirements specified in the design (e.g. MinLength, MaxLength).
## Overriding Error Serialization

Sometimes it is necessary to override the format used by the generated code
to serialize errors. The generated HTTP handler and server constructor code
make it possible to pass a custom error formatter as a parameter:

```go
// Code generated by goa v...
package server

// ...

// New instantiates HTTP handlers for all the calc service endpoints using the
// provided encoder and decoder. The handlers are mounted on the given mux
// using the HTTP verb and path defined in the design. errhandler is called
// whenever a response fails to be encoded. formatter is used to format errors
// returned by the service methods prior to encoding. Both errhandler and
// formatter are optional and can be nil.
func New(
    e *calc.Endpoints,
    mux goahttp.Muxer,
    decoder func(*http.Request) goahttp.Decoder,
    encoder func(context.Context, http.ResponseWriter) goahttp.Encoder,
    errhandler func(context.Context, http.ResponseWriter, error),
    formatter func(err error) goahttp.Statuser, // Error formatter function
    // ...
```

The provided function must accept an error instance as a parameter and
return a struct that implements the
[Statuser](https://pkg.go.dev/goa.design/goa/v3/http#Statuser) interface:

```go
type Statuser interface {
    // StatusCode return the HTTP status code used to encode the response
    // when not defined in the design.
    StatusCode() int
}
```

The generated code calls the struct's `StatusCode` method when writing
the HTTP response and uses its return value as the HTTP status code.
The struct is then serialized into the response body.

The default implementation used when `nil` is passed for the
`formatter` parameter of the `New` function is
[NewErrorResponse](https://pkg.go.dev/goa.design/goa/v3/http#NewErrorResponse),
which returns an instance of
[ErrorResponse](https://pkg.go.dev/goa.design/goa/v3/pkg#ErrorResponse).
### Overriding Validation Error Serialization

A custom formatter can inspect the error much like any client code does
when handling different errors, for example:

```go
// missingFieldError is the type used to serialize missing required field
// errors. It overrides the default provided by Goa.
type missingFieldError string

// StatusCode returns 400 (BadRequest).
func (missingFieldError) StatusCode() int { return http.StatusBadRequest }

// customErrorResponse converts err into a missingFieldError if err corresponds
// to a missing required field error.
func customErrorResponse(err error) Statuser {
    if serr, ok := err.(*goa.ServiceError); ok {
        switch serr.Name {
        case "missing_field":
            return missingFieldError(serr.Message)
        default:
            // Use Goa's default
            return goahttp.NewErrorResponse(err)
        }
    }
    // Use Goa's default for all other errors
    return goahttp.NewErrorResponse(err)
}
```

This custom formatter can then be used when instantiating an HTTP server or handler:

```go
var (
    calcServer *calcsvr.Server
)
{
    eh := errorHandler(logger)
    calcServer = calcsvr.New(calcEndpoints, mux, dec, enc, eh, customErrorResponse)
    // ...
}
```
## Example

The [error handling example](https://github.com/goadesign/examples/tree/master/error)
shows how to use custom error types and how to override the default error
response for validation errors.
<!-- Source file: setup.md (repo: rknx/BioInfoAPS.github.io, license: CC-BY-4.0) -->

---
layout: page
title: Setup
---
{% comment %} Setup {% endcomment %}
<h2 id="setup">Setup</h2>
<p>
To participate in this workshop, you will need access to the software described below.
</p>
<p>
We maintain a list of common issues that occur during installation as a reference for instructors
that may be useful on the
<a href = "{{ site.baseSite }}{{ site.faq }}">FAQ page</a>.
</p>
{% comment %} Zoom {% endcomment %}
{% include setup/conference.html %}
{% comment %} SSH Client {% endcomment %}
{% include setup/ssh.html %}
{% comment %} SFTP {% endcomment %}
{% include setup/sftp.html %}
{% comment %} Text editor {% endcomment %}
{% include setup/editor.html %}
{% comment %} Figtree {% endcomment %}
{% include setup/figtree.md %}
<!-- Source file: tags.md (repo: ProbablePrime/neos-voting-server, license: MIT) -->
"mmc2021",
“world”, “social”
“world”, “game”
“world”, “misc”
“avatar”, “avatars”
“avatar”, “accessories”
“avatar”, “misc”
“other”, “tau”
“other”, “misc”
“meme”
.post('vote/:competition/:category/:subcategory?', handleVote)
.get('vote/:competition/:category/:subcategory?', hasVoted)
vote/mmc2021/world/social
<!-- Source file: src/Resources/Resources/help/Get-AzResource.md (repo: khannarheams/azure-powershell, license: MIT) -->

---
external help file: Microsoft.Azure.PowerShell.Cmdlets.ResourceManager.dll-Help.xml
Module Name: Az.Resources
ms.assetid: C2C608E5-3351-4D01-8533-9668B2E9F1D1
online version: https://docs.microsoft.com/en-us/powershell/module/az.resources/get-azresource
schema: 2.0.0
---
# Get-AzResource
## SYNOPSIS
Gets resources.
## SYNTAX
### ByTagNameValueParameterSet (Default)
```
Get-AzResource [-Name <String>] [-ResourceType <String>] [-ODataQuery <String>] [-ResourceGroupName <String>]
[-TagName <String>] [-TagValue <String>] [-ExpandProperties] [-ApiVersion <String>] [-Pre]
[-DefaultProfile <IAzureContextContainer>] [<CommonParameters>]
```
### ByResourceId
```
Get-AzResource -ResourceId <String> [-ODataQuery <String>] [-ExpandProperties] [-ApiVersion <String>] [-Pre]
[-DefaultProfile <IAzureContextContainer>] [<CommonParameters>]
```
### ByTagObjectParameterSet
```
Get-AzResource [-Name <String>] [-ResourceType <String>] [-ODataQuery <String>] [-ResourceGroupName <String>]
-Tag <Hashtable> [-ExpandProperties] [-ApiVersion <String>] [-Pre] [-DefaultProfile <IAzureContextContainer>]
[<CommonParameters>]
```
## DESCRIPTION
The **Get-AzResource** cmdlet gets Azure resources.
## EXAMPLES
### Example 1: Get all resources in the current subscription
```
PS C:\> Get-AzResource | ft
Name ResourceGroupName ResourceType Location
---- ----------------- ------------ --------
testVM testRG Microsoft.Compute/virtualMachines westus
disk testRG Microsoft.Compute/disks westus
nic testRG Microsoft.Network/networkInterfaces westus
nsg testRG Microsoft.Network/networkSecurityGroups westus
ip testRG Microsoft.Network/publicIPAddresses westus
vnet testRG Microsoft.Network/virtualNetworks westus
testKV otherRG Microsoft.KeyVault/vaults eastus
storage otherResourceGroup Microsoft.Storage/storageAccounts eastus
testVM2 otherResourceGroup Microsoft.Compute/virtualMachines eastus
```
This command gets all of the resources in the current subscription.
### Example 2: Get all resources in a resource group
```
PS C:\> Get-AzResource -ResourceGroupName testRG | ft
Name ResourceGroupName ResourceType Location
---- ----------------- ------------ --------
testVM testRG Microsoft.Compute/virtualMachines westus
disk testRG Microsoft.Compute/disks westus
nic testRG Microsoft.Network/networkInterfaces westus
nsg testRG Microsoft.Network/networkSecurityGroups westus
ip testRG Microsoft.Network/publicIPAddresses westus
vnet testRG Microsoft.Network/virtualNetworks westus
```
This command gets all of the resources in the resource group "testRG".
### Example 3: Get all resources whose resource group matches the provided wildcard
```
PS C:\> Get-AzResource -ResourceGroupName other* | ft
Name ResourceGroupName ResourceType Location
---- ----------------- ------------ --------
testKV otherRG Microsoft.KeyVault/vaults eastus
storage otherResourceGroup Microsoft.Storage/storageAccounts eastus
testVM2 otherResourceGroup Microsoft.Compute/virtualMachines eastus
```
This command gets all of the resources whose resource group name begins with "other".
### Example 4: Get all resources with a given name
```
PS C:\> Get-AzResource -Name testVM | fl
Name : testVM
ResourceGroupName : testRG
ResourceType : Microsoft.Compute/virtualMachines
Location : westus
ResourceId : /subscriptions/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx/resourceGroups/testRG/providers/Microsoft.Compute/virtualMachines/testVM
```
This command gets all of the resources whose resource name is "testVM".
### Example 5: Get all resources whose name matches the provided wildcard
```
PS C:\> Get-AzResource -Name test* | ft
Name ResourceGroupName ResourceType Location
---- ----------------- ------------ --------
testVM testRG Microsoft.Compute/virtualMachines westus
testKV otherRG Microsoft.KeyVault/vaults eastus
testVM2 otherResourceGroup Microsoft.Compute/virtualMachines eastus
```
This command gets all of the resources whose resource name begins with "test".
### Example 6: Get all resources of a given resource type
```
PS C:\> Get-AzResource -ResourceType Microsoft.Compute/virtualMachines | ft
Name ResourceGroupName ResourceType Location
---- ----------------- ------------ --------
testVM testRG Microsoft.Compute/virtualMachines westus
testVM2 otherResourceGroup Microsoft.Compute/virtualMachines eastus
```
This command gets all of the resources in the current subscriptions that are virtual machines.
### Example 7: Get a resource by resource id
```
PS C:\> Get-AzResource -ResourceId /subscriptions/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx/resourceGroups/testRG/providers/Microsoft.Compute/virtualMachines/testVM
Name : testVM
ResourceGroupName : testRG
ResourceType : Microsoft.Compute/virtualMachines
Location : westus
ResourceId : /subscriptions/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx/resourceGroups/testRG/providers/Microsoft.Compute/virtualMachines/testVM
```
This command gets the resource with the provided resource id, which is a virtual machine called "testVM" in the resource group "testRG".
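
### Example 8: Get all resources with a given tag name and value

The following is an illustrative example (the "Department" tag name and "IT" value are placeholders); it shows how the TagName and TagValue parameters combine to filter resources that carry a specific tag.

```
PS C:\> Get-AzResource -TagName Department -TagValue IT
```

This command gets all of the resources in the current subscription that have a tag named "Department" with the value "IT".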
## PARAMETERS
### -ApiVersion
```yaml
Type: System.String
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -DefaultProfile
The credentials, account, tenant, and subscription used for communication with azure
```yaml
Type: Microsoft.Azure.Commands.Common.Authentication.Abstractions.Core.IAzureContextContainer
Parameter Sets: (All)
Aliases: AzContext, AzureRmContext, AzureCredential
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -ExpandProperties
When specified, expands the properties of the resource.
```yaml
Type: System.Management.Automation.SwitchParameter
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -Name
The name of the resource(s) to be retrieved. This parameter supports wildcards at the beginning and/or end of the string.
```yaml
Type: System.String
Parameter Sets: ByTagNameValueParameterSet, ByTagObjectParameterSet
Aliases: ResourceName
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: True
```
### -ODataQuery
```yaml
Type: System.String
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -Pre
```yaml
Type: System.Management.Automation.SwitchParameter
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -ResourceGroupName
The resource group that the retrieved resource(s) belong to. This parameter supports wildcards at the beginning and/or end of the string.
```yaml
Type: System.String
Parameter Sets: ByTagNameValueParameterSet, ByTagObjectParameterSet
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: True
```
### -ResourceId
Specifies the fully qualified resource ID, as in the following example:
`/subscriptions/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx/providers/Microsoft.Compute/virtualMachines`
```yaml
Type: System.String
Parameter Sets: ByResourceId
Aliases: Id
Required: True
Position: Named
Default value: None
Accept pipeline input: True (ByPropertyName)
Accept wildcard characters: False
```
### -ResourceType
The resource type of the resource(s) to be retrieved. For example, Microsoft.Compute/virtualMachines
```yaml
Type: System.String
Parameter Sets: ByTagNameValueParameterSet, ByTagObjectParameterSet
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -Tag
Gets resources that have the specified Azure tag. Enter a hash table with a Name key or Name and Value keys. Wildcard characters are not supported. A "tag" is a name-value pair that you can apply to resources and resource groups. Use tags to categorize your resources, such as by department or cost center, or to track notes or comments about the resources. To add a tag to a resource, use the Tag parameter of the New-AzResource or Set-AzResource cmdlets. To create a predefined tag, use the New-AzTag cmdlet. For help with hash tables in Windows PowerShell, run 'Get-Help about_Hashtables'.
```yaml
Type: System.Collections.Hashtable
Parameter Sets: ByTagObjectParameterSet
Aliases:
Required: True
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -TagName
The key in the tag of the resource(s) to be retrieved.
```yaml
Type: System.String
Parameter Sets: ByTagNameValueParameterSet
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -TagValue
The value in the tag of the resource(s) to be retrieved.
```yaml
Type: System.String
Parameter Sets: ByTagNameValueParameterSet
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### CommonParameters
This cmdlet supports the common parameters: -Debug, -ErrorAction, -ErrorVariable, -InformationAction, -InformationVariable, -OutVariable, -OutBuffer, -PipelineVariable, -Verbose, -WarningAction, and -WarningVariable. For more information, see [about_CommonParameters](http://go.microsoft.com/fwlink/?LinkID=113216).
## INPUTS
### System.String
## OUTPUTS
### Microsoft.Azure.Commands.ResourceManager.Cmdlets.SdkModels.PSResource
## NOTES
## RELATED LINKS
[Find-AzResource](./Find-AzResource.md)
[Move-AzResource](./Move-AzResource.md)
[New-AzResource](./New-AzResource.md)
[Remove-AzResource](./Remove-AzResource.md)
[Set-AzResource](./Set-AzResource.md)
<!-- Source file: 2020/08/05/2020-08-05 09:55.md (repo: zhzhzhy/WeiBoHot_history, license: MIT) -->

2020年08月05日09时数据
Status: 200
1.张萌录视频道歉
微博热度:3007857
2.父亲第一条朋友圈献给女儿
微博热度:2736508
3.救救苹果肌
微博热度:2731731
4.林有有 工具人
微博热度:2617080
5.鹿晗演技
微博热度:2057907
6.华为年薪201万天才少年回应
微博热度:1618922
7.黎巴嫩总理妻女在爆炸中受伤
微博热度:1241020
8.黎巴嫩首都突发爆炸
微博热度:1000385
9.白银
微博热度:964050
10.花木兰放弃北美院线发行
微博热度:945802
11.黎巴嫩总理证实2700吨硝酸铵爆炸
微博热度:862082
12.爆炸目击者说整座城下起碎玻璃雨
微博热度:798848
13.张天爱喊话跟拍者
微博热度:786223
14.上海暴雨
微博热度:613832
15.10万余箱问题坚果流入市场
微博热度:467518
16.前妻谈张玉环故意杀人案改判无罪
微博热度:461656
17.刘雨昕再唱想见你
微博热度:460358
18.尹正为王一博庆生
微博热度:447539
19.苹果更新27寸iMac
微博热度:446524
20.台风过后男子家中下水道喷垃圾
微博热度:433359
21.张玉环接受江西高院的道歉
微博热度:432296
22.曾繁日道歉
微博热度:432255
23.26年前张玉环故意杀人案6大疑点
微博热度:430237
24.泰国威胁将对Facebook采取法律行动
微博热度:428756
25.南京天空之镜石臼湖
微博热度:425221
26.汤显祖的千层套路
微博热度:383807
27.香港负压病房使用率达上限
微博热度:360666
28.千万不要让家人学会拍一拍
微博热度:352880
29.外交部回应美对中国记者签证限制
微博热度:328964
30.楚歌哭戏
微博热度:309244
31.喜欢林有有的原因
微博热度:306711
32.黄圣依哭泣时也要插兜
微博热度:286624
33.黎巴嫩总理宣布8月5日为国家哀悼日
微博热度:276469
34.粤港澳大湾区城际铁路建设规划获批
微博热度:252530
35.驻黎巴嫩大使馆提醒中国公民注意安全
微博热度:238502
36.钟晓芹美人鱼表白
微博热度:238238
37.黄金价格突破2000美元
微博热度:233258
38.美国约100万餐饮人失业
微博热度:228768
39.暴雨台风双预警
微博热度:205430
40.闺蜜拍照技术到底有多差
微博热度:201489
41.关晓彤霸气臭脸妆
微博热度:182718
42.雾山五行
微博热度:178949
43.新疆新增22例本土病例
微博热度:178706
44.黎巴嫩爆炸威力大于3.3级地震
微博热度:177944
45.哈尔滨仓库坍塌被困9人遇难
微博热度:177690
46.大坪医院
微博热度:177198
47.男朋友可以有多用心
微博热度:176892
48.警方通报南京女大学生在云南失联
微博热度:172168
49.吴邪看刘丧微博
微博热度:162323
50.张玉环出狱后首次发声
微博热度:161333
<!-- Source file: _seminars/2010/2010-09-13-prof-angela-pasquale-universite-paul-verlaine--metz-france.md (repo: siddhartha-gadgil/DeptWeb, license: MIT) -->

---
date: 2010-9-13
speaker: "Prof. Angela Pasquale Universite Paul Verlaine--Metz France"
title: "Analytic continuation of the resolvent, resonances, and Huygens' principle for Riemannian symmetric spaces of noncompact type"
time: "4:00-5:00 p.m."
venue: "Lecture Hall I, Department of Mathematics"
---
The abstract of this talk has been posted at:
<!-- Source file: README.md (repo: apolopena/drupal-flashcards, license: MIT) -->

# Welcome
🚀
`gitpod-phpmyadmin` was built from [`gitpod-laravel-starter` v1.3.0](https://github.com/apolopena/gitpod-laravel-starter) and generates a starting point for you to [develop in the cloud](https://www.gitpod.io/) with [phpMyAdmin](https://www.phpmyadmin.net/), [MySQL](https://www.mysql.com/products/community/) and pretty much any other technology you would like to add.
* Develop in the cloud on the [Gitpod](https://www.gitpod.io/) platform
* Preconfigured yet fully customizable [LAMP](https://en.wikipedia.org/wiki/LAMP_(software_bundle)) or [LEMP](https://lemp.io/) stack
* Full debugging capabilities
This readme is a work in progress and an adaptation of [`gitpod-laravel-starter` v1.3.0](https://github.com/apolopena/gitpod-laravel-starter), so please ignore references to the Laravel framework for now as that functionality has been removed. Some of the information that pertains to Laravel will pertain to this project as well.
If you want to jump right in to [setting up a project](https://github.com/apolopena/gitpod-laravel-starter/wiki/Setup) then have a look at the wiki [setup page](https://github.com/apolopena/gitpod-laravel-starter/wiki/Setup).
The [wiki](https://github.com/apolopena/gitpod-laravel-starter/wiki/Setup) is designed to provide you with essential details not found in this document such as how to easily add [hot reloading](https://github.com/apolopena/gitpod-laravel-starter/wiki/Hot-Reload) and [Typescript](https://github.com/apolopena/gitpod-laravel-starter/wiki/Typescript) to your projects.
`gitpod-phpmyadmin` is designed for any type of developer from beginner to professional to hobbyist. Developing in the cloud has many benefits including giving developers the freedom to try entire complex technological stacks with a single click.
## _Powered 100% by open source_:
<a href="https://www.gitpod.io/"><img src="https://gitpod.io/static/media/gitpod.2cdd910d.svg" alt="Gitpod - Spin up fresh, automated dev environments
for each task, in the cloud, in seconds" width="70" ></a>**Gitpod**
<a href="https://www.php.net/"><img src="https://www.php.net/images/logos/new-php-logo.svg" alt="PHP - A popular general-purpose scripting language that is especially suited to web development" width="130" ></a>
<a href="https://www.mysql.com/products/community/"><img src="https://www.logo.wine/a/logo/MySQL/MySQL-Logo.wine.svg" alt="MySQL Community Edition - Freely downloadable version of the world's most popular open source database" width="140" ></a>
<a href="https://httpd.apache.org/"><img src="https://upload.wikimedia.org/wikipedia/commons/1/10/Apache_HTTP_server_logo_%282019-present%29.svg" alt="The Apache Software Foundation, Apache License 2.0 <http://www.apache.org/licenses/LICENSE-2.0>, via Wikimedia Commons" width="165" ></a>
<a href="hhttps://www.nginx.com/resources/wiki/"><img src="https://upload.wikimedia.org/wikipedia/commons/c/c5/Nginx_logo.svg" alt="NGINX - A free, open-source, high-performance HTTP server and reverse proxy, as well as an IMAP/POP3 proxy server." width="150" ></a>
<a href="https://www.gnu.org/software/bash/"><img src="https://upload.wikimedia.org/wikipedia/commons/4/4b/Bash_Logo_Colored.svg" alt="Bootstrap - Build fast, responsive sites with Bootstrap" width="80" ></a>
<a href="https://reactjs.org/"><img src="https://upload.wikimedia.org/wikipedia/commons/a/a7/React-icon.svg" alt="Bash A Unix shell and command language written by Brian Fox for the GNU Project" width="115" ></a>
<a href="https://vuejs.org/"><img src="https://upload.wikimedia.org/wikipedia/commons/9/95/Vue.js_Logo_2.svg" alt="Vue.js - The Progressive
JavaScript Framework" width="72" ></a>
<a href="https://getbootstrap.com/"><img src="https://cdn.worldvectorlogo.com/logos/bootstrap-5-1.svg" alt="Bootstrap - Build fast, responsive sites with Bootstrap" width="82" ></a>
<br />
# Table of Contents
1. [Welcome](#welcome)
2. [Requirements](#requirements)
3. [Setting Up a Repository](#setting-up-a-repository)
- 3.1 [Creating a new Gitpod Workspace from a GitHub repository](#creating-a-new-gitpod-workspace-from-a-github-repository)
4. [Running the Client](#running-the-client)
5. [Pushing Laravel scaffolding Files to Your Remote Repository](#pushing-laravel-scaffolding-files-to-your-remote-repository)
- 5.1 [Gitpod account permissions](#gitpod-account-permissions)
- 5.2 [GitHub email protection](#GitHub-email-protection)
6. [Starter Project Configuration](#Starter-Project-Configuration)
- 6.1 [Preset Examples](#preset-examples)
- 6.2 [Development Servers](#development-servers)
- 6.3 [Changing the default server](#changing-the-default-server)
- 6.4 [Running more than one server at a time](#running-more-than-one-server-at-a-time)
- 6.5 [Changing the Laravel Version](#changing-the-laravel-version)
- 6.6 [Breaking the Docker cache](#breaking-the-docker-cache)
7. [Additional Features](#additional-features)
- 7.1 [Hot Reloading](#hot-reloading)
- 7.2 [Typescript](#typescript)
8. [Debugging PHP](#debugging-php)
- 8.1 [The default development server](#the-default-development-server)
- 8.2 [Specific development servers](#specific-development-servers)
- 8.3 [Setting breakpoints](#setting-breakpoints)
- 8.4 [Debugging Blade templates](#debugging-blade-templates)
- 8.5 [Tailing the Xdebug Log](#tailing-the-xdebug-log)
9. [Debugging JavaScript](#debugging-javascript)
10. [phpMyAdmin](#phpmyadmin)
- 10.1 [Installing phpMyAdmin](#installing-phpmyadmin)
- 10.2 [Security Concerns](#security-concerns)
- 10.3 [Securing phpMyAdmin](#securing-phpmyadmin)
11. [Generating a CHANGELOG.md Using github-changelog-generator](#generating-a-changelogmd-using-github-changelog-generator)
- 11.1 [Setting up an Access Token for github-changelog-generator](#setting-up-an-access-token-for-github-changelog-generator)
12. [Project Specific Bash Code for Gitpod](#project-specific-bash-code-for-gitpod)
13. [Ruby Gems](#ruby-gems)
14. [Git Aliases](#git-aliases)
- 14.1 [Emoji-log and Gitmoji](#emoji-log-and-gitmoji)
15. [Deployment Outside of Gitpod](#deployment-outside-of-gitpod)
16. [Gitpod Caveats](#gitpod-caveats)
17. [Thanks](#thanks)
<br />
## Requirements
- A [GitHub](https://github.com/) account. You may use a free account.
- A [Gitpod](https://www.gitpod.io/) account. You may use a free account. Just log in with your GitHub credentials.
<br />
## Setting Up a Repository
There are many ways that you can [use `gitpod-laravel-starter`](https://github.com/apolopena/gitpod-laravel-starter/wiki/Setup). __Full setup instructions can be found on the wiki [setup page](https://github.com/apolopena/gitpod-laravel-starter/wiki/Setup)__.
### Creating a new Gitpod Workspace from a GitHub repository
Gitpod makes this easy. One simple URL deploys the entire system.
__A detailed breakdown of the initialization phase can be found on the wiki [initialization page](https://github.com/apolopena/gitpod-laravel-starter/wiki/Initialization)__
- Append your GitHub repository URL to the end of the special Gitpod URL: **https://gitpod.io/#/**.
- If you don't need to push changes and you just want to try this repository with the default configuration you can click here [](http://gitpod.io/#/https://github.com/apolopena/gitpod-laravel-starter)
- Instructions for setting up a repository of your own can be found on the wiki [setup page](https://github.com/apolopena/gitpod-laravel-starter/wiki/Setup)
Initializing the workspace will take between 2 and 5 minutes depending on how you have [configured](#starter-project-configuration) the [`starter.ini`](https://github.com/apolopena/gitpod-laravel-starter/blob/main/starter.ini) file. Subsequent starts or creations of a workspace from your repository will be much faster thanks to caching mechanisms.
When the workspace is created for the first time, an entire online development environment, complete with an IDE, is deployed along with any additional installations you have set in `starter.ini`. Laravel scaffolding files and debugging capabilities are created the first time you build the workspace, so you should push any new files to your repository before you start developing your project. You can push the files with a single command: `git new "Initial Commit"`
<br />
## Running the Client
A preview browser should automatically open and display the Laravel start page once the system is ready. This page is served by the default web server, which is set in `starter.ini`. The code for the Laravel start page is in `/resources/views/welcome.blade.php`. To manually open the preview browser or to refresh it you can run the command `op`.
<br />
## Pushing Laravel scaffolding Files to Your Remote Repository
If the result log summary in the console shows success, then you should push those newly created Laravel scaffolding files to your remote repository before you get started coding your project.
### Gitpod account permissions
You may need to allow Gitpod additional permissions to push to your repository in case you come across an issue like [this one](https://community.gitpod.io/t/i-cant-push-my-changes-to-my-github-remote-repository/629).
### GitHub email protection
If your GitHub account uses the protected email feature and the email address you are using in your git configuration looks something like this:
`[email protected]`
you may encounter an error that looks something like this:
`! [remote rejected] readme-dev -> readme-dev (push declined due to email privacy restrictions)`
The easiest way to circumvent this error is to uncheck the box labeled "**Block command line pushes that expose my email**" under **Settings**-->**Emails** in your GitHub account.
Another workaround is to edit the `~/.gitconfig` file in your Gitpod workspace to use your protected email address, since Gitpod defaults to using the unprotected email address for your GitHub account. Please note that if you do this, you will have to make this change **_every_** time you create a new workspace.
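For illustration, here is a minimal sketch of that workaround (the noreply address below is a placeholder; use the one GitHub shows under **Settings**-->**Emails**):
```bash
# Hypothetical example: point the workspace at your GitHub noreply address
git config --global user.email "[email protected]"
# Verify what the workspace will now use for commits
git config --global user.email
```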
<br />
## Starter Project Configuration
A configuration file has been provided to allow you to control many aspects of the development environment and the Laravel project scaffolding.
The file [`starter.ini`](https://github.com/apolopena/gitpod-laravel-starter/blob/hot-reload/starter.ini) in the root of the project allows you to configure optional installations and various other options. Have a look at the comments in `starter.ini` for details and the acceptable values you can use. Simply change values in `starter.ini`, push those changes to your repository, create a new Gitpod workspace from that repository, and your new configuration will be enabled. Some of the configurations you can make are:
- Server: `apache`, `nginx` or `php` (development server)
- Optional installations
- `phpMyAdmin`
- Frontend: `react`, `vue` or plain `bootstrap`
- Your server's log monitor: `tail` with a colorized log, or `multitail`
- `.editorconfig`: You can omit this file or use a less opinionated version of this file than what Laravel gives you by default
- [github-change-log-generator](https://github.com/github-changelog-generator/github-changelog-generator)
*Please note that many of the configurations found in `starter.ini` should be made just once __prior__ to creating your workspace for the first time. Have a look at the comments in [`starter.ini`](https://github.com/apolopena/gitpod-laravel-starter/blob/main/starter.ini) for specifics.*
### Preset Examples
`gitpod-laravel-starter` preset examples are auto-configured examples of React and Vue projects that you can learn from or use as starting points for your own projects.
You can initialize a preset example as a starting point by adding `EXAMPLE=<id>` to the Gitpod URL right after the `#` and followed by a `/`.
To use a preset example as a starting point:
1. [Setup a project repository](https://github.com/apolopena/gitpod-laravel-starter/wiki/Setup#create-a-new-project-repository-from-gitpod-laravel-starter)
2. Initialize your workspace using the workspace URL for your corresponding EXAMPLE id but substitute https://github.com/apolopena/gitpod-laravel-starter with your project repository URL.
3. Save the system-generated project scaffolding files to your new repository and you can start your project from that point.
- Please note that some directives in `starter.ini` such as `phpmyadmin` will not be superseded on subsequent initializations of your workspace. Edit your `starter.ini` as needed.
| id | Description | Workspace URL |
| :---: | :--- | :--- |
| 1 | React Example with phpMyAdmin - Questions and Answers | https://gitpod.io/#EXAMPLE=1/https://github.com/apolopena/gitpod-laravel-starter |
| 2 | React Example without phpMyAdmin - Questions and Answers | https://gitpod.io/#EXAMPLE=2/https://github.com/apolopena/gitpod-laravel-starter |
| 3 __*__ | React Typescript Example with phpMyAdmin - Questions and Answers | https://gitpod.io/#EXAMPLE=3/https://github.com/apolopena/gitpod-laravel-starter |
| 4 __*__ | React Typescript Example without phpMyAdmin - Questions and Answers | https://gitpod.io/#EXAMPLE=4/https://github.com/apolopena/gitpod-laravel-starter |
| 10 __**__ | Vue Example with phpMyAdmin - Material Dashboard | https://gitpod.io/#EXAMPLE=10/https://github.com/apolopena/gitpod-laravel-starter |
| 11 __**__ | Vue Example without phpMyAdmin - Material Dashboard | https://gitpod.io/#EXAMPLE=11/https://github.com/apolopena/gitpod-laravel-starter |
<br />__\*__ Comes with hot reload functionality
<br />__\**__ Not designed to run in an iframe such as the preview browser in the IDE.
### Development Servers
The `gitpod-laravel-starter` project comes pre-packaged with three development servers that serve on the following ports:
- Apache: port 8001
- Nginx (with `php-fpm`): port 8002
- PHP Development Server: port 8000
By default, the server set in `starter.ini` will be the one used. You can run any of the servers at the same time or change your default server in `starter.ini` at any time.
Please note that Laravel uses the APP_URL and ASSET_URL variables set in `.env` to serve content. These values are set during workspace initialization and are based on the default server you are using. If you want to serve the project using a different server _after_ a workspace has been created, then you will need to change APP_URL and ASSET_URL in `.env` to include the port number for the server you want to use.
You may also run the PHP Development server manually via the command `php artisan serve` which will use port 8000.
The default server will be started automatically when the workspace is started.
You can toggle any server on and off from any terminal window by running the relevant command. These commands will also dynamically kill the log monitor process for that server:
- Apache: `start_apache` or `stop_apache`
- Nginx `start_nginx` or `stop_nginx`
- PHP built-in development server: `start_php_dev` or `stop_php_dev`
### Changing the default server
Change the value of `default_server` in the `development` section of `starter.ini` to `apache`, `nginx`, or `php`. You will need to change the APP_URL and ASSET_URL in the `.env` file to use the port number for that server if you change the default development server *after* a workspace has been created.
### Running more than one server at a time
You may start and stop multiple servers.
If you have the Apache server running and you want to run the Nginx server at the same time just run this command:
`start_nginx`
The Nginx server will now be running in addition to the Apache server.
Laravel requires a URL to be set in the `.env` file in the project root. This is done for you automatically when the workspace is initialized. The URL set in the `.env` file contains the server port, so if you want to properly serve Laravel pages from a server other than the default server you initialized the project with, then you will need to change the values for APP_URL and ASSET_URL accordingly.
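As a rough sketch, switching from the PHP development server (port 8000) to Nginx (port 8002) means only the port prefix in those values changes. The workspace URL below is a made-up placeholder; copy the real one from your browser's address bar:
```
# .env (hypothetical values; your workspace URL will differ)
APP_URL=https://8002-your-workspace-url.gitpod.io
ASSET_URL=https://8002-your-workspace-url.gitpod.io
```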
### Changing the Laravel Version
In `starter.ini` there is a directive to change the version of Laravel. You should only change the version of Laravel *before* you create a new workspace. The Laravel version directive is cached in the workspace image, so changing it sometimes requires you to [break the Docker cache](#breaking-the-docker-cache).
**Important**:
- By default `gitpod-laravel-starter` uses the most recent version of Laravel. Currently the most recent version of Laravel is `8.*`
- There are exactly three supported values for the Laravel version directive: `8.*`, `7.*`, and `6.*`
- Laravel will always use the most recent/stable minor and patch version for any major version.
**Caveats**:
- __Upgrading or downgrading Laravel once Laravel scaffolding files have been saved to your repository is not advised and should be avoided.__
- Attempts to upgrade will result in an automatic downgrade and could cause instability.
- Attempts to downgrade will be ignored and could cause instability.
- The Laravel version directive is cached in the workspace image so changing it requires you to break the Docker cache.
### Breaking the Docker cache
You can break the Docker cache and force the workspace image to be rebuilt by incrementing the `INVALIDATE_CACHE` variable in `.gitpod.Dockerfile`. Push the changed `.gitpod.Dockerfile` to your repository, create a new Gitpod workspace, and the workspace image will be rebuilt. Any cached external files that Docker uses such as `starter.ini` will be updated.
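A minimal sketch of what that bump looks like (the variable already exists in `.gitpod.Dockerfile`; whether it is declared with `ENV` or `ARG` may differ in your copy, and only the number needs to change):
```Dockerfile
# .gitpod.Dockerfile (illustrative snippet)
# Bump this value (e.g. 1 -> 2) to invalidate Docker's layer cache
ENV INVALIDATE_CACHE=2
```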
<br />
## Additional Features
To keep `gitpod-laravel-starter` as flexible as possible, some features have been left out of the `starter.ini` configuration file. These additional features can be easily added to your project using a one-time setup process. Wiki pages are available for each additional feature below that you may want to add to your project.
### Hot Reloading
- `gitpod-laravel-starter` makes it easy for you to add the ability to see your code changes in realtime without refreshing the browser. Take a look at the wiki [hot reload](https://github.com/apolopena/gitpod-laravel-starter/wiki/Hot-Reload) page for more details.
### Typescript
- Adding [Typescript](https://www.typescriptlang.org/) to your project is simple. Have a look at the wiki [Typescript page](https://github.com/apolopena/gitpod-laravel-starter/wiki/Typescript) for an example.
<br />
## Debugging PHP
Debugging must be enabled before breakpoints can be hit and will last for an hour before the debug session is disabled automatically.
When debugging is enabled or disabled, the preview browser will reload the index page. When debugging is enabled, *each* subsequent request can be debugged for an hour or until debugging is disabled.
This system uses port `9009` for debugging. A launch configuration file is included in `.vscode/launch.json` and in `.theia/launch.json`.
### The default development server
To enable a debugging session on the default development server run `debug_on` in a Gitpod terminal.
To disable a debugging session on the default development server run `debug_off` in a Gitpod terminal.
### Specific development servers
You can toggle a debugging session for a specific server:
- Apache
- `debug_on apache` or `debug_off apache`
- Nginx
- `debug_on nginx` or `debug_off nginx`
- PHP (development server)
- `debug_on php` or `debug_off php`
*The [hot reload](https://github.com/apolopena/gitpod-laravel-starter/wiki/Hot-Reload) webpack server on port 3005 is not supported by this debugging system. You may be able to [configure it on your own](https://stackoverflow.com/questions/28470601/how-to-do-remote-debugging-with-browser-sync) if you like.*
### Setting breakpoints
Set a breakpoint in the Gitpod IDE by clicking in the gutter next to the line of code you want in any PHP file in the `public` folder (or deeper)
Then in the Gitpod IDE in the browser:
1. Click the debug icon in the left side panel to open the Debug panel.
2. Choose "Listen for XDebug" from the dropdown list.
3. Click the green play button (you should see the status "RUNNING" in the Threads panel)
4. Refresh the preview browser either manually or by running the `op` command and your breakpoint will be hit in the IDE.
All debugging is subject to a server timeout; just refresh the preview browser or run the command `op` if this happens.
### Debugging Blade templates
You may also debug blade templates by placing the following snippet above where you want to inspect the blade directive.
```php
<?php xdebug_break(); ?>
```
Save the file and refresh the preview browser when the debugger is in the IDE.
This will open a temporary PHP file that has all the Blade directives converted to `php` tags, and you may set additional breakpoints in this code as well. Do not edit the code in these temporary files, as they may be disposed of at any time and are only derived for the current debugging session.
If you are having trouble, launch the "Listen for Xdebug" launch configuration again and refresh the preview browser.
<br />
### Tailing the Xdebug Log
You may want to see how Xdebug is working with your server when you are debugging PHP files.
1. Open a new terminal in gitpod
2. Run the command: `tail -f /var/log/xdebug.log`
<br />
## Debugging JavaScript
This is a rather diverse topic. To make a long story short, it is possible but very situational.
Have a look at the wiki [debugging JavaScript](https://github.com/apolopena/gitpod-laravel-starter/wiki/Debugging-JavaScript) page for details and exact steps you can take to debug various types of JavaScript.
<br />
## phpMyAdmin
phpMyAdmin is a tool that handles MySQL administration over the web. This tool is very powerful and can be essential when developing MySQL powered systems especially in the cloud. For more information on what phpMyAdmin can do, check out the [official documentation](https://www.phpmyadmin.net/docs/), the [user guide](https://docs.phpmyadmin.net/en/latest/user.html) or just dabble around on the [demo server](https://www.phpmyadmin.net/try/).
### Installing phpMyAdmin
phpMyAdmin is installed automatically by default. A phpMyAdmin installation directive is available in `starter.ini` that allows you to omit the installation if you like.
### Security Concerns
phpMyAdmin also introduces some extra security concerns that you may want to address. If you have installed phpMyAdmin using the install directive in `starter.ini` then by default, two MySQL accounts are created using default passwords stored in version control:
- **pmasu**: This is the 'super user' account that a developer can use to log into phpMyAdmin in order to administer any MySQL database.
- The default password for the user **pmasu** is: ***123456***
- **pma**: This is the 'control user' that phpMyAdmin uses internally to manage its advanced storage features, which are enabled by default. This user can only administer the `phpmyadmin` database and should not be used by anyone.
- The default password for the 'control user' **pma** is: ***pmapass***
<br />
### Securing phpMyAdmin
At a minimum the default passwords that phpMyAdmin uses to administer the MySQL databases should be changed right after a Gitpod workspace has been created for the first time. An `update-phpmyadmin-pws` command has been provided that automagically changes the default passwords for you.
<br /><br />
The following steps are required to successfully run the `update-phpmyadmin-pws` command:
1. Create a file in .gp named `.starter.env`. You can run this command from the project root: `cp .gp/.starter.env.example .starter.env`
2. Or copy and paste all the keys containing `PHPMYADMIN` from `.gp/.starter.env.example` to your blank `.starter.env` file
3. In `.starter.env`, set your password values for the `PHPMYADMIN` keys and save the file (a hypothetical example is sketched below)
4. In a terminal run the alias: `update-phpmyadmin-pws`
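For illustration only, a hypothetical `.starter.env` might look like the snippet below. The key names here are placeholders; copy the real `PHPMYADMIN` keys from `.gp/.starter.env.example`, since the exact names may differ:
```
# Hypothetical example only; use the actual PHPMYADMIN keys from .gp/.starter.env.example
PHPMYADMIN_SUPER_PW=a-strong-password-for-pmasu
PHPMYADMIN_CONTROL_PW=a-strong-password-for-pma
```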
<br />
## Generating a CHANGELOG.md Using github-changelog-generator
Keeping track of your changes and releases can easily be automated.
There is an option in `starter.ini` to install [`github-changelog-generator`](https://github.com/github-changelog-generator/github-changelog-generator).
This option is on by default and additional settings for this option can be found in `starter.ini`.
You can generate a `CHANGELOG.md` by running the command:
`rake changelog`
Currently generating a changelog can only be done when the workspace is built for the first time. See [here](https://github.com/apolopena/gitpod-laravel-starter/issues/57) for more details. See [github-changelog-generator](https://github.com/github-changelog-generator/github-changelog-generator) for documentation.
### Setting up an Access Token for github-changelog-generator
GitHub limits API calls unless using an [access token](https://github.com/settings/tokens/).
`github-changelog-generator` uses the GitHub API to generate a `CHANGELOG.md` and will quickly run out of API calls if you try to generate the `CHANGELOG.md` more than a few times in a certain period of time.
It is recommended that you setup an access token from your GitHub account and then set that access token in an environment variable via your Gitpod dashboard. This way any project you like can generate a `CHANGELOG.md` as many times as it likes without error.
1. You can generate an access token [here](https://github.com/settings/tokens/new?description=GitHub%20Changelog%20Generator%20token). If the repository is public you do not need to grant any special privileges, just generate the token and copy it to your clipboard. Otherwise if the repository is private you need to grant it 'repo' privileges.
2. Once you have the GitHub access token copied to your clipboard, go to Settings in your Gitpod account and, in the Environment Variables section, click the "Add Variable" button.
3. For the 'name' field, type in `CHANGELOG_GITHUB_TOKEN`
4. For the 'value' field, paste in your GitHub access token
5. For the 'Organization/Repository' field, you may leave it as is or type in GITHUBUSERNAME/* where GITHUBUSERNAME is the user name of your GitHub account. This will allow you to use github-changelog-generator as many times as you like for any of your repositories.
6. Restart or create a new workspace and you will now be able to use `github-changelog-generator` via the `rake changelog` command as many times as you like.
Important Note: If you do not generate an access token for `github-changelog-generator`, and you do not cancel the error that results when you exceed your GitHub API calls when using `github-changelog-generator`, then you could potentially run out of space for your Gitpod workspaces and not be able to create any new workspace or open any existing ones until you delete the offending workspace(s) or the system is cleared automatically.
<br />
## Project Specific Bash Code for Gitpod
Bash code that you want to run when your Gitpod workspace is created for the first time, such as database migrations and seeding, should be put in the file:
`bash/init-project.sh`
This file contains some basic scaffolding and examples that you may use in your project.
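For example, a sketch of what a typical addition to `bash/init-project.sh` might look like. The artisan commands are standard Laravel calls; treat the surrounding structure as an assumption about your project:
```bash
#!/bin/bash
# bash/init-project.sh (illustrative sketch)
# Run database migrations and seed initial data the first time the workspace is created
php artisan migrate --force
php artisan db:seed --force
```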
<br />
## Ruby Gems
Currently, until Gitpod fixes the [issue](https://github.com/apolopena/gitpod-laravel-starter/issues/57) of Ruby gems not persisting across workspace restarts, you can only use rake commands when the workspace is created for the first time.
<br />
## Git Aliases
Git aliases that you would like to add to your project should be added to the [`alias`](https://github.com/apolopena/gitpod-laravel-starter/blob/main/.gp/snippets/git/aliases) file.
### Emoji-log and Gitmoji
A compilation of git aliases from [Emoji-log](https://github.com/ahmadawais/Emoji-Log) and [Gitmoji](https://gitmoji.dev/) is included; use them as you like from the command line. There is also a separate set of emoji-based git aliases that will commit the files with a message and push them to the repository *without* adding the files. Use these aliases for dealing with groups of files that need different commit messages but still need to use the Emoji-log and/or Gitmoji standards. You can get a list of all the emoji-based git aliases with the command: `git a`
<br />
## Deployment Outside of Gitpod
For now, this is something you will need to figure out yourself; eventually some guidelines for how to do that may be added here.
<br />
## Gitpod Caveats
Gitpod is an amazing and dynamic platform; however, sometimes during its peak hours, latency can affect the workspace. Here are a few symptoms and their possible remedies. This section will be updated or removed as Gitpod evolves.
- **Symptom**: Workspace loads, IDE displays, however one or more terminals are blank.
- **Possible Fix**: Delete the workspace in your Gitpod dashboard and then [recreate the workspace](#creating-a-new-gitpod-workspace-from-a-github-repository).
- **Symptom**: Workspace loads, IDE displays, however no ports become available and or the spinner stays spinning in the terminal even after a couple of minutes.
- **Possible Fix**: Refresh the browser
You can also try to remedy any rare Gitpod network hiccups by simply waiting 30 minutes and trying again.
## Thanks
🙏 to the communities of:
- Gitpod
- Laravel
- VS Code
- Xdebug
| 71.587379 | 566 | 0.771309 | eng_Latn | 0.989946 |
a676e31339209583f23a7178d8e2c6cb41a97a63 | 2,200 | md | Markdown | README.md | fridek/Thesis-physics | bf6ec43579aac7e4946b8c98a721e87053eeeac6 | [
"MIT"
] | 2 | 2015-06-15T01:20:42.000Z | 2019-06-09T02:26:19.000Z | README.md | fridek/Thesis-physics | bf6ec43579aac7e4946b8c98a721e87053eeeac6 | [
"MIT"
] | null | null | null | README.md | fridek/Thesis-physics | bf6ec43579aac7e4946b8c98a721e87053eeeac6 | [
"MIT"
] | null | null | null | This project aims to implement common subsystems of physics engines using C++ and JavaScript.
The C++ version is then cross-compiled to JavaScript using Emscripten, and all three versions are compared time-wise.
Demo of Octree-partitioned space with 1K spheres colliding available at http://fridek.github.io/Thesis-physics/
To run benchmarks, you need g++ for C++ tests and V8 for JS. Build V8 from source and put d8 in browser/static for best performance - packages in repositories are outdated.
To build latest V8 with g++ 4.8 use ```make native werror=no```, otherwise plain ```make native``` is enough.
Process described in details on https://code.google.com/p/v8/wiki/BuildingWithGYP
Makefile rule for benchmarks is ```make timeall```.
To build all tests you need g++, emcc (Emscripten) and plovr (plovr.jar in browser/bin/).
https://github.com/kripken/emscripten
https://github.com/bolinfest/plovr
Makefile rule for compilation and benchmark is ```make build_and_test```.
Latest results from ```make timeall``` on my platform:
Fedora 19, Intel i7 2670QM, 4GB RAM, g++ 4.8.1
```
time/particles1.sh
JavaScript time: 19.51s 508.48% slower
Emscripten JavaScript time: 4.85s 51.28% slower
C++ time: 3.21s [FASTEST]
time/particles2.sh
JavaScript time: 4.96s 203.49% slower
Emscripten JavaScript time: 5.10s 212.06% slower
C++ time: 1.63s [FASTEST]
time/spheres1.sh
JavaScript time: 9.02s 81.92% slower
Emscripten JavaScript time: 12.35s 148.99% slower
C++ time: 4.96s [FASTEST]
time/spheres2.sh
JavaScript time: 14.14s 311.20% slower
Emscripten JavaScript time: 11.20s 225.71% slower
C++ time: 3.44s [FASTEST]
```
Windows 7, Intel i7 2670QM, 4GB RAM, g++ 4.7.3, Cygwin
```
time/particles1.sh
JavaScript time: 20.77s 491.96% slower
Emscripten JavaScript time: 6.46s 84.09% slower
C++ time: 3.51s [FASTEST]
time/particles2.sh
JavaScript time: 3.47s 102.57% slower
Emscripten JavaScript time: 5.57s 225.72% slower
C++ time: 1.71s [FASTEST]
time/spheres1.sh
JavaScript time: 10.81s 13.52% slower
Emscripten JavaScript time: 11.82s 24.15% slower
C++ time: 9.52s [FASTEST]
time/spheres2.sh
JavaScript time: 16.95s 20.19% slower
Emscripten JavaScript time: 17.79s 26.14% slower
C++ time: 14.10s [FASTEST]
```
| 35.483871 | 172 | 0.750909 | eng_Latn | 0.549069 |
a67757f0ff58ef3cff6b1a63a9a4bf5e4276b795 | 2,097 | md | Markdown | content/post/2018-05-12-g_ck.md | pulliam/blog | 581b179d95ce6423765ff157712dfd19007cecb1 | [
"MIT"
] | 1 | 2019-02-05T18:10:20.000Z | 2019-02-05T18:10:20.000Z | content/post/2018-05-12-g_ck.md | pulliam/blog | 581b179d95ce6423765ff157712dfd19007cecb1 | [
"MIT"
] | null | null | null | content/post/2018-05-12-g_ck.md | pulliam/blog | 581b179d95ce6423765ff157712dfd19007cecb1 | [
"MIT"
] | null | null | null | ---
title: What is g_ck
date: 2018-05-12
layout: post
tags:
- rest
aliases:
- "/g_ck/"
---
What is `g_ck` and why should I care?
<!--more-->
`g_ck` is the current session's token for authentication.
## Why would you want to use this or know about this
It's useful for making REST requests to the system from code that is already running on the system.
You might be thinking, "That seems silly". Trust me it does. However, if you want to let a page load quickly and have it load the rest of the content later, doing this is great.
## How do I use it
First, you have to make sure you have it. Once you understand what it is, you can generate it (if you don't already have access to it) with the following code server-side:
```js
var g_ck = gs.getSession().getSessionToken();
```
So, if you're on a UI Macro or UI Page, wrap that in `<g:evaluate>` tags. If you're on a Service Portal, toss that in your server script.
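As a rough sketch, the UI Page variant might look something like this; treat the exact Jelly attribute names and variable interpolation as assumptions and double-check them against your instance:
```xml
<!-- Hedged sketch only: expose the session token to client script on a UI Page -->
<g:evaluate var="jvar_g_ck" expression="gs.getSession().getSessionToken()" />
<script>
  // The server-side value is rendered into the page for client-side use
  var g_ck = '$[jvar_g_ck]';
</script>
```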
Once you have it, you can use it in place of authentication; if you omit it, you will be asked to authenticate.
Below is a slightly modified script generated from the API Explorer.
```js
var requestBody = "";
var endpoint = document.location.origin + "/api/now/table/sys_user?";
endpoint += "sysparm_query=active%3Dtrue&";
endpoint += "sysparm_fields=user_name&";
endpoint += "sysparm_limit=1"
var client=new XMLHttpRequest();
client.open("get", endpoint);
client.setRequestHeader('Accept','application/json');
client.setRequestHeader('Content-Type','application/json');
// Eg. UserName="admin", Password="admin" for this code sample.
// Normally this is what it shows.
// client.setRequestHeader('Authorization', 'Basic '+btoa('admin'+':'+'admin'));
// However if you're already authenticated. You can use X-UserToken
client.setRequestHeader('X-UserToken', g_ck);
client.onreadystatechange = function() {
if(this.readyState == this.DONE) {
// Going to console log instead
// document.getElementById("response").innerHTML=this.status + this.response;
console.log(this.response);
}
};
client.send(requestBody);
/*
* Response I get
* {"result":[{"user_name":"abel.tuter"}]}
*/
```
| 30.838235 | 179 | 0.712923 | eng_Latn | 0.98291 |
a677589d863872f2b249842bf77cc1ade81bfbdb | 92 | md | Markdown | README.md | alkonosst/smartyboy | c44be42d28f33b0308df3436ec708aa2d03fdf5c | [
"MIT"
] | null | null | null | README.md | alkonosst/smartyboy | c44be42d28f33b0308df3436ec708aa2d03fdf5c | [
"MIT"
] | null | null | null | README.md | alkonosst/smartyboy | c44be42d28f33b0308df3436ec708aa2d03fdf5c | [
"MIT"
] | null | null | null | # Smarty Boy - IIoT Development Board

> Work in progress... | 18.4 | 37 | 0.717391 | eng_Latn | 0.276357 |
a67780c11afee2789d4e83e56464b9478f52315a | 58 | md | Markdown | README.md | lzaoral/sbt-slicer | ce747eca2b9fc930b33850ad8e20c8100118f299 | [
"MIT"
] | 8 | 2018-06-27T14:38:26.000Z | 2022-03-09T11:27:20.000Z | README.md | lzaoral/sbt-slicer | ce747eca2b9fc930b33850ad8e20c8100118f299 | [
"MIT"
] | 2 | 2020-05-08T09:12:21.000Z | 2020-09-01T09:58:24.000Z | README.md | lzaoral/sbt-slicer | ce747eca2b9fc930b33850ad8e20c8100118f299 | [
"MIT"
] | 1 | 2020-04-02T14:30:37.000Z | 2020-04-02T14:30:37.000Z | # sbt-slicer
Static program slicer used in Symbiotic tool
| 19.333333 | 44 | 0.810345 | eng_Latn | 0.996221 |
a6778b8c85345f8f6183ec3218bf25c6bf68ff6b | 4,407 | md | Markdown | README.md | hellow554/cargo-bitbake | 3e33296f6129c703c3295bfe603806e94b7990ca | [
"Apache-2.0",
"MIT"
] | 26 | 2020-06-25T14:27:00.000Z | 2022-03-24T22:25:19.000Z | README.md | hellow554/cargo-bitbake | 3e33296f6129c703c3295bfe603806e94b7990ca | [
"Apache-2.0",
"MIT"
] | 30 | 2020-04-06T21:43:52.000Z | 2022-03-29T16:50:42.000Z | README.md | hellow554/cargo-bitbake | 3e33296f6129c703c3295bfe603806e94b7990ca | [
"Apache-2.0",
"MIT"
] | 21 | 2020-09-11T12:03:45.000Z | 2022-01-28T13:04:20.000Z | # cargo-bitbake
[](https://travis-ci.org/cardoe/cargo-bitbake) [](https://crates.io/crates/cargo-bitbake)
`cargo bitbake` is a Cargo subcommand that generates a
[BitBake](https://en.wikipedia.org/wiki/BitBake) recipe that uses
[meta-rust](https://github.com/meta-rust/meta-rust) to build a Cargo based
project for [Yocto](https://yoctoproject.org)
Install it with Cargo:
```
$ cargo install cargo-bitbake
```
In its default mode, `cargo bitbake` will write the recipe for the
local crate:
```
$ cargo bitbake
Wrote: cargo-bitbake_0.1.0.bb
```
## Parameter Mapping
| Yocto | Cargo |
| ---------------- | --------------------------- |
| SRC_URI | each line in `dependencies` |
| SUMMARY | `package.description` |
| HOMEPAGE | `package.homepage` or `package.repository` |
| LICENSE | `package.license` or `package.license-file` |
| LIC_FILES_CHKSUM | `package.license` or `package.license-file`. See below |
### LIC_FILES_CHKSUM
`LIC_FILES_CHKSUM` is treated a bit specially. If the user specifies `package.license-file` then the
filename is taken directly. If `package.license` is specified then it checks for the filename directly
and falls back to checking `LICENSE-{license}`. If nothing can be found then you are expected to generate
the md5sum yourself.
The license field supports any valid Cargo value and can be separated by `/` to specify multiple licenses.
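For example, a dual-licensed crate might carry these manifest lines; a hedged sketch of the lookup described above (the fallback file names follow the `LICENSE-{license}` pattern):
```toml
# Cargo.toml (illustrative)
[package]
name = "my-crate"
license = "MIT/Apache-2.0"
# With this value, cargo bitbake would look for LICENSE-MIT and LICENSE-Apache-2.0
# (or the file named by package.license-file, if that is set instead) and emit one
# LIC_FILES_CHKSUM entry per file it can find.
```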
## API
API documentation is available at [docs.rs](https://docs.rs/crate/cargo-bitbake/).
## Example output
```
$ cat cargo-bitbake_0.1.0.bb
inherit cargo_util
SRC_URI = " \
crate://crates.io/libssh2-sys/0.1.37 \
crate://crates.io/crates-io/0.2.0 \
crate://crates.io/openssl-sys/0.7.14 \
crate://crates.io/nom/1.2.3 \
crate://crates.io/rustache/0.0.3 \
crate://crates.io/url/1.1.1 \
crate://crates.io/unicode-bidi/0.2.3 \
crate://crates.io/num_cpus/0.2.13 \
crate://crates.io/libc/0.2.14 \
crate://crates.io/strsim/0.3.0 \
crate://crates.io/fs2/0.2.5 \
crate://crates.io/curl/0.2.19 \
crate://crates.io/pkg-config/0.3.8 \
crate://crates.io/filetime/0.1.10 \
crate://crates.io/flate2/0.2.14 \
crate://crates.io/matches/0.1.2 \
crate://crates.io/unicode-normalization/0.1.2 \
crate://crates.io/tar/0.4.6 \
crate://crates.io/memchr/0.1.11 \
crate://crates.io/git2/0.4.4 \
crate://crates.io/git2-curl/0.4.1 \
crate://crates.io/env_logger/0.3.4 \
crate://crates.io/winapi/0.2.8 \
crate://crates.io/miniz-sys/0.1.7 \
crate://crates.io/libgit2-sys/0.4.4 \
crate://crates.io/advapi32-sys/0.1.2 \
crate://crates.io/toml/0.1.30 \
crate://crates.io/pnacl-build-helper/1.4.10 \
crate://crates.io/gcc/0.3.31 \
crate://crates.io/tempdir/0.3.4 \
crate://crates.io/thread-id/2.0.0 \
crate://crates.io/libz-sys/1.0.5 \
crate://crates.io/url/0.2.38 \
crate://crates.io/thread_local/0.2.6 \
crate://crates.io/kernel32-sys/0.2.2 \
crate://crates.io/rustc-serialize/0.3.19 \
crate://crates.io/user32-sys/0.2.0 \
crate://crates.io/regex-syntax/0.3.4 \
crate://crates.io/libressl-pnacl-sys/2.1.6 \
crate://crates.io/crossbeam/0.2.9 \
crate://crates.io/bitflags/0.1.1 \
crate://crates.io/memstream/0.0.1 \
crate://crates.io/winapi-build/0.1.1 \
crate://crates.io/idna/0.1.0 \
crate://crates.io/glob/0.2.11 \
crate://crates.io/semver/0.2.3 \
crate://crates.io/time/0.1.35 \
crate://crates.io/gdi32-sys/0.2.0 \
crate://crates.io/utf8-ranges/0.1.3 \
crate://crates.io/term/0.4.4 \
crate://crates.io/rand/0.3.14 \
crate://crates.io/uuid/0.1.18 \
crate://crates.io/cargo/0.10.0 \
crate://crates.io/curl-sys/0.1.34 \
crate://crates.io/docopt/0.6.81 \
crate://crates.io/regex/0.1.73 \
crate://crates.io/cmake/0.1.17 \
crate://crates.io/log/0.3.6 \
crate://crates.io/aho-corasick/0.5.2 \
crate://crates.io/cargo-bitbake/0.1.0 \
crate-index://crates.io/CARGO_INDEX_COMMIT \
"
SRC_URI[index.md5sum] = "79f10f436dbf26737cc80445746f16b4"
SRC_URI[index.sha256sum] = "86114b93f1f51aaf0aec3af0751d214b351f4ff9839ba031315c1b19dcbb1913"
LIC_FILES_CHKSUM=" \
file://LICENSE-APACHE;md5=1836efb2eb779966696f473ee8540542 \
file://LICENSE-MIT;md5=0b29d505d9225d1f0815cbdcf602b901 \
"
SUMMARY = "Generates a BitBake recipe for a package utilizing meta-rust's classes."
HOMEPAGE = "https://github.com/cardoe/cargo-bitbake"
LICENSE = "MIT | Apache-2.0"
```
| 34.97619 | 234 | 0.6973 | eng_Latn | 0.285892 |
a677b14b5d0d690a7593933833727429a3625601 | 2,004 | md | Markdown | src/watching/fix-list/throttling.md | dwango-js/performance-handbook | 5ce2e5c184f53047a128bb346ae8393f9d1528e7 | [
"MIT"
] | 106 | 2018-09-13T08:56:01.000Z | 2022-03-03T10:48:23.000Z | src/watching/fix-list/throttling.md | dwango-js/performance-handbook | 5ce2e5c184f53047a128bb346ae8393f9d1528e7 | [
"MIT"
] | 2 | 2018-09-14T18:02:52.000Z | 2018-10-26T06:42:32.000Z | src/watching/fix-list/throttling.md | dwango-js/performance-handbook | 5ce2e5c184f53047a128bb346ae8393f9d1528e7 | [
"MIT"
] | 4 | 2018-09-13T10:08:05.000Z | 2018-10-25T09:56:31.000Z | # リストコンポーネントへの追加処理の修正
[リストコンポーネントの`shouldComponentUpdate`の改善](./shouldComponentUpdate.md)でリストコンポーネントの更新処理自体は改善されています。
しかし、リストコンポーネントにリストアイテム(コメント)を追加する回数や頻度が多いと更新処理自体は改善されていても更新回数が増えます。
更新回数を減らすには、コメントを追加するタイミングを間引くような仕組みが必要です。
実はすでに、`CommentThrottle`とそのままの名前のthrottlingで間引く処理の実装が使われていることがわかりました。
## 観測
このthrottlingの最適値を調べるつもりで、1秒ごとに5コのコメントを追加してその動きを調べてみました。
- 1秒ごとに5コのコメントを追加したとき
- `onAddComments`は 4、1の引数で2回呼ばれる
- 1秒ごとに10コのコメントを追加した時
- `onAddComments` は9、1の引数で2回呼ばれる
- 1秒ごとに2コのコメントを追加した時
- `onAddComments` は 1、1 の引数で2回呼ばれる
なぜか2回に分けてコメントが追加されていることがわかりました。
一度に追加するコメント数を変えても `n -> 1` と2回に分けられています。
これは`CommentThrottle`の実装が何かおかしそうです。
## 修正の方針
意図した挙動は、Nミリ秒間にMコのコメントを追加したら、その後Mコのコメントが同時にリストコンポーネントに追加されるです。
まずは、その挙動になっているかテストするために`CommentThrottle`のテストコードを書くとよさそうです。
```js
describe("CommentThrottle", () => {
describe("コメントを一度も追加してない時", () => {
it("intervalをまってもflushされない", () => {
// テスト実装
});
});
describe("コメントを追加した時", () => {
it("追加しても同期的にはflushされない", () => {
// テスト実装
});
it("追加したコメントはFLUSH_INTERVALまでflushされない", () => {
// テスト実装
});
it("追加したコメントはFLUSH_INTERVAL後flushされる", () => {
// テスト実装
});
it("複数のコメントを追加した場合は、FLUSH_INTERVAL後にまとめてflushされる", () => {
// テスト実装
});
it("シナリオテスト", () => {
});
});
});
```
このテストを実装してみると、既存の`CommentThrottle`はバグがあることがわかりました。
(意図した挙動のテストが通らない)
## 修正
まずは、テストが通るように無理やり`CommentThrottle`を修正しました。
その後、テストが通るのを維持したまま実装を書き換えて問題を修正しました。

## 計測
テストが通るので大きな問題ないと思いますが、計測前に行った仕組みと同じ方法で呼び出す回数を記録しました。
- 1秒ごとに5コのコメントを追加したとき
- `onAddComments`は 5の引数で1回呼ばれる
- 1秒ごとに10コのコメントを追加した時
- `onAddComments` は10の引数で1回呼ばれる
- 1秒ごとに2コのコメントを追加した時
-`onAddComments` は 2の引数で1回呼ばれる
意図したとおりになっているので問題ありませんでした。
既存の実装が存在していても、その実装が意図したように動いているかは試してみないとわかりません。
実際に数値として出してみれば、実装が正しいかは分かるはず。
テストを書いたほうが結果的にコストが低い場合もあります。そのためテストを書いて試してみるのも大切です。
| 23.857143 | 95 | 0.716567 | jpn_Jpan | 0.948041 |
a677b3a84b1bf1ec494fcab11dad463553ca20b2 | 857 | md | Markdown | blog/2021-08-06_layer_one_3.md | BigGangy/iota-wiki | da5980a9d458b962835572bf1ecee58911720870 | [
"MIT"
] | 51 | 2021-05-22T10:57:44.000Z | 2022-03-06T15:53:05.000Z | blog/2021-08-06_layer_one_3.md | BigGangy/iota-wiki | da5980a9d458b962835572bf1ecee58911720870 | [
"MIT"
] | 178 | 2021-05-22T13:58:35.000Z | 2022-03-28T11:59:26.000Z | blog/2021-08-06_layer_one_3.md | BigGangy/iota-wiki | da5980a9d458b962835572bf1ecee58911720870 | [
"MIT"
] | 188 | 2021-05-22T00:04:18.000Z | 2022-03-31T22:23:46.000Z | ---
slug: layer_one_3
title: 'Layer One—Part 3: Metamoney'
authors: kbrennan
tags: [Community, Layer One]
url: https://iologica.substack.com/p/metamoney
---

What is the distributed ledger value proposition? Is the elegant distributed ledger solution even possible? What is the killer distributed ledger application?
These are the three fundamental questions that participants in the distributed ledger space have reflected on for years—all while falling far short of the answers and their inescapable consequences.
Read the article on:
[Medium](https://iologica.substack.com/p/metamoney)
| 50.411765 | 262 | 0.810968 | eng_Latn | 0.820146 |
a677daa7ca3294a089ed6dfdb627fb24e2fe2df3 | 365 | md | Markdown | README.md | bbaassssiiee/ansible-timezone | 9147a4c4c615c119bb15439e037286ead71a8ddd | [
"Apache-2.0"
] | 28 | 2015-08-25T03:23:20.000Z | 2020-05-12T21:15:15.000Z | README.md | bbaassssiiee/ansible-timezone | 9147a4c4c615c119bb15439e037286ead71a8ddd | [
"Apache-2.0"
] | 11 | 2015-11-16T19:49:43.000Z | 2021-08-20T16:33:39.000Z | README.md | bbaassssiiee/ansible-timezone | 9147a4c4c615c119bb15439e037286ead71a8ddd | [
"Apache-2.0"
] | 21 | 2015-11-16T19:46:11.000Z | 2021-11-04T19:39:02.000Z | Role Name
========
timezone
Role Variables
--------------
```
# Default timezone. Must be a valid tz database time zone.
timezone: UTC
```
Example Playbook
-------------------------
```
---
- hosts: all
roles:
- yatesr.timezone
vars:
timezone: America/New_York
```
License
-------
Apache 2.0
Author Information
------------------
Ryan Yates
| 9.125 | 59 | 0.534247 | kor_Hang | 0.097619 |
a679106647b3efbc633f5d67a28488b2a0040f1d | 2,285 | md | Markdown | _posts/2019-09-21-bayden-zaklykav-provesty-rozsliduvannia-mozhlyvoho-vplyvu-trampa-na-zelens-koho.md | shishak/barber-jekyll | db3f4fda43b7cd2ece4c1e566cae24ce6f11fecd | [
"MIT"
] | null | null | null | _posts/2019-09-21-bayden-zaklykav-provesty-rozsliduvannia-mozhlyvoho-vplyvu-trampa-na-zelens-koho.md | shishak/barber-jekyll | db3f4fda43b7cd2ece4c1e566cae24ce6f11fecd | [
"MIT"
] | null | null | null | _posts/2019-09-21-bayden-zaklykav-provesty-rozsliduvannia-mozhlyvoho-vplyvu-trampa-na-zelens-koho.md | shishak/barber-jekyll | db3f4fda43b7cd2ece4c1e566cae24ce6f11fecd | [
"MIT"
] | null | null | null | ---
id: 1737
title: Biden called for an investigation into Trump's possible influence on Zelensky
date: 2019-09-21T22:34:15+00:00
author: user
excerpt: Former US Vice President Joe Biden, the 2020 Democratic Party presidential candidate, on Saturday called for an investigation into the pressure that President Donald...
layout: post
guid: https://www.rbc.ua/ukr/news/bayden-prizval-provesti-rassledovanie-vozmozhnogo-1569103844.html
permalink: /2019/09/21/bayden-zaklykav-provesty-rozsliduvannia-mozhlyvoho-vplyvu-trampa-na-zelens-koho/
cyberseo_rss_source:
- https://www.rbc.ua/static/rss/newsline.ukr.rss.xml
cyberseo_post_link:
- https://www.rbc.ua/ukr/news/bayden-prizval-provesti-rassledovanie-vozmozhnogo-1569103844.html
image: /wp-content/uploads/2019/09/bayden-zaklykav-provesty-rozsliduvannia-mozhlyvoho-vplyvu-trampa-na-zelens-koho.jpg
categories:
- Головне
tags:
- РБК-Україна
---
Former US Vice President **Joe Biden**, a 2020 Democratic Party presidential candidate, on Saturday called for an investigation into President Donald Trump's pressure on Ukrainian President Volodymyr Zelensky. This was reported by Reuters.
The matter concerns a phone call in which President Trump insisted that his Ukrainian counterpart influence an investigation into Biden and his son.
"This looks like an overwhelming abuse of power. To call a foreign leader who is looking for help from the United States, to ask about me and hint at an investigation ... it is outrageous," Biden said while campaigning in Iowa.
As a reminder, former US Vice President and presidential candidate Joe Biden had earlier called on President Donald Trump to publish the transcript of his July 25 phone call with Ukrainian President Volodymyr Zelensky.
The Wall Street Journal published information that, during the phone call with Zelensky, Trump mentioned presidential candidate Joe Biden's name at least eight times, urging Ukraine to launch an investigation against his son. The US president also called for cooperation with his personal lawyer, Rudy Giuliani, in the course of that investigation.
Пізніше адвокат президента Трампа – Рудольф Джуліані – визнав факт того, що просив владу України провести розслідування щодо Джо Байдена. Спочатку Джуліані заперечував цей факт.</p> | 76.166667 | 340 | 0.820569 | ukr_Cyrl | 0.999159 |
a679160eadf2c5adec549bc401db8d8c57ee5e64 | 708 | md | Markdown | README.md | morwalz/nodebb-plugin-sso-oauth | b16279a790659121a5b8ea36639d6096e876b797 | [
"BSD-2-Clause"
] | 1 | 2020-08-06T21:36:37.000Z | 2020-08-06T21:36:37.000Z | README.md | morwalz/nodebb-plugin-sso-oauth | b16279a790659121a5b8ea36639d6096e876b797 | [
"BSD-2-Clause"
] | null | null | null | README.md | morwalz/nodebb-plugin-sso-oauth | b16279a790659121a5b8ea36639d6096e876b797 | [
"BSD-2-Clause"
] | 1 | 2019-01-10T01:16:41.000Z | 2019-01-10T01:16:41.000Z | # NodeBB OAuth SSO
NodeBB Plugin that allows users to log in/register via any configured OAuth provider. **Please note** that this is not a complete plugin, but merely a skeleton with which you can create your own OAuth SSO plugin for NodeBB (and hopefully share it with others!)
## How to Adapt
1. Fork this plugin
* 
1. Add the OAuth credentials (around line 30 of `library.js`); a hedged sketch of the kind of values involved is shown after this list
1. Update profile information (around line 137 of `library.js`) with information from the user API call
1. Activate this plugin from the plugins page
1. Restart your NodeBB
1. Let NodeBB take care of the rest
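As an illustration of step 2 only, the credentials you fill in are typically the standard OAuth2 endpoints plus a client ID and secret. The field names below are placeholders, not the skeleton's exact API; match them to the object actually defined near line 30 of `library.js`:
```js
// Hypothetical shape only; align the names with the object in library.js
const oauthCredentials = {
    name: 'my-provider',                              // appears on the login button
    authorizationURL: 'https://provider.example.com/oauth/authorize',
    tokenURL: 'https://provider.example.com/oauth/token',
    clientID: process.env.OAUTH_CLIENT_ID,            // assumption: read from the environment
    clientSecret: process.env.OAUTH_CLIENT_SECRET,
    scope: 'basic email'
};
```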
## Trouble?
Find us on [the community forums](http://community.nodebb.org)! | 41.647059 | 259 | 0.754237 | eng_Latn | 0.978142 |
a679742d7d0f3ff0df369716237a414b2ebcf968 | 40 | md | Markdown | README.md | Anandi-K/identity-data-publisher-authentication | 167fb3bc213f6b469dc94a41cdd518cdf4162455 | [
"Apache-2.0"
] | 3 | 2020-04-13T00:19:23.000Z | 2020-04-28T11:16:43.000Z | README.md | Anandi-K/identity-data-publisher-authentication | 167fb3bc213f6b469dc94a41cdd518cdf4162455 | [
"Apache-2.0"
] | 13 | 2016-11-14T18:36:05.000Z | 2022-02-02T10:26:10.000Z | README.md | Anandi-K/identity-data-publisher-authentication | 167fb3bc213f6b469dc94a41cdd518cdf4162455 | [
"Apache-2.0"
] | 50 | 2016-05-29T15:25:40.000Z | 2022-01-26T08:12:31.000Z | # identity-data-publisher-authentication | 40 | 40 | 0.875 | eng_Latn | 0.875955 |
a679dedff02b55994684702422bfc6efc96dbbbc | 281 | md | Markdown | src/examples/code/python.md | alinex/node-report | 0798d2bacf8064875b3f54cd035aa154306f5a7e | [
"Apache-2.0"
] | 1 | 2016-06-02T15:05:20.000Z | 2016-06-02T15:05:20.000Z | src/examples/code/python.md | alinex/node-report | 0798d2bacf8064875b3f54cd035aa154306f5a7e | [
"Apache-2.0"
] | null | null | null | src/examples/code/python.md | alinex/node-report | 0798d2bacf8064875b3f54cd035aa154306f5a7e | [
"Apache-2.0"
] | null | null | null | ``` python
@requires_authorization
def somefunc(param1='', param2=0):
r'''A docstring'''
if param1 > param2: # interesting
print 'Gre\'ater'
return (param2 - param1 + 1 + 0b10l) or None
class SomeClass:
pass
>>> message = '''interpreter
... prompt'''
```
| 18.733333 | 48 | 0.6121 | eng_Latn | 0.373687 |
a67a88583d299174e7a48dafd196a7030da8d97d | 43 | md | Markdown | README.md | distrixx/distrixx-source | 243f26d6d9a0c6fb725d415c39fb699546485007 | [
"MIT"
] | null | null | null | README.md | distrixx/distrixx-source | 243f26d6d9a0c6fb725d415c39fb699546485007 | [
"MIT"
] | null | null | null | README.md | distrixx/distrixx-source | 243f26d6d9a0c6fb725d415c39fb699546485007 | [
"MIT"
] | 1 | 2018-09-20T10:19:10.000Z | 2018-09-20T10:19:10.000Z |
DISTRIXX is a PoS-based cryptocurrency.
| 8.6 | 39 | 0.767442 | eng_Latn | 0.997171 |
a67ace1a38e0c6317e11f403efaa8335a18f387c | 1,710 | md | Markdown | api/qsharp/microsoft.quantum.arithmetic.identicalpointposfactfxp.md | MicrosoftDocs/quantum-docs-pr.hu-HU | 0dd8360322ecc479e72360b70aca1c50271416ff | [
"CC-BY-4.0",
"MIT"
] | 1 | 2020-05-19T20:13:21.000Z | 2020-05-19T20:13:21.000Z | api/qsharp/microsoft.quantum.arithmetic.identicalpointposfactfxp.md | MicrosoftDocs/quantum-docs-pr.hu-HU | 0dd8360322ecc479e72360b70aca1c50271416ff | [
"CC-BY-4.0",
"MIT"
] | null | null | null | api/qsharp/microsoft.quantum.arithmetic.identicalpointposfactfxp.md | MicrosoftDocs/quantum-docs-pr.hu-HU | 0dd8360322ecc479e72360b70aca1c50271416ff | [
"CC-BY-4.0",
"MIT"
] | 1 | 2021-11-15T09:09:07.000Z | 2021-11-15T09:09:07.000Z | ---
uid: Microsoft.Quantum.Arithmetic.IdenticalPointPosFactFxP
title: IdenticalPointPosFactFxP function
ms.date: 1/23/2021 12:00:00 AM
ms.topic: article
qsharp.kind: function
qsharp.namespace: Microsoft.Quantum.Arithmetic
qsharp.name: IdenticalPointPosFactFxP
qsharp.summary: Assert that all fixed-point numbers in the provided array have identical point positions when counting from the least- significant bit. I.e., number of bits minus point position must be constant for all fixed-point numbers in the array.
ms.openlocfilehash: 7212f918e1d0ee86b12b85caa6e0c27bc2cebe58
ms.sourcegitcommit: 71605ea9cc630e84e7ef29027e1f0ea06299747e
ms.translationtype: MT
ms.contentlocale: hu-HU
ms.lasthandoff: 01/26/2021
ms.locfileid: "98846615"
---
# <a name="identicalpointposfactfxp-function"></a>IdenticalPointPosFactFxP függvény
Névtér: [Microsoft. Quantum. aritmetika](xref:Microsoft.Quantum.Arithmetic)
Csomag: [Microsoft. Quantum. numerikus számok](https://nuget.org/packages/Microsoft.Quantum.Numerics)
Azt állítja be, hogy a megadott tömbben lévő összes rögzített szintű szám azonos ponttal rendelkezik, amikor a legkevésbé jelentős mennyiségtől számít. Azaz a BITS mínusz pont pozíciójának állandónak kell lennie a tömbben lévő összes rögzített számnál.
```qsharp
function IdenticalPointPosFactFxP (fixedPoints : Microsoft.Quantum.Arithmetic.FixedPoint[]) : Unit
```
## <a name="input"></a>Bevitel
### <a name="fixedpoints--fixedpoint"></a>fixedPoints: [FixedPoint](xref:Microsoft.Quantum.Arithmetic.FixedPoint)[]
An array of quantum fixed-point numbers to be checked for compatibility (using assertions).
## <a name="output--unit"></a>Output: [Unit](xref:microsoft.quantum.lang-ref.unit)
| 41.707317 | 252 | 0.808187 | hun_Latn | 0.916182 |
a67ba6c788f8be46a5a9cd3fa56288cffafd854a | 8,247 | md | Markdown | articles/active-directory/identity-protection/howto-identity-protection-configure-risk-policies.md | SRegue/azure-docs.es-es | d780bfb3509e4ca0c02995c7250d10603e1bd6fe | [
"CC-BY-4.0",
"MIT"
] | 66 | 2017-07-09T03:34:12.000Z | 2022-03-05T21:27:20.000Z | articles/active-directory/identity-protection/howto-identity-protection-configure-risk-policies.md | SRegue/azure-docs.es-es | d780bfb3509e4ca0c02995c7250d10603e1bd6fe | [
"CC-BY-4.0",
"MIT"
] | 671 | 2017-06-29T16:36:35.000Z | 2021-12-03T16:34:03.000Z | articles/active-directory/identity-protection/howto-identity-protection-configure-risk-policies.md | SRegue/azure-docs.es-es | d780bfb3509e4ca0c02995c7250d10603e1bd6fe | [
"CC-BY-4.0",
"MIT"
] | 171 | 2017-07-25T06:26:46.000Z | 2022-03-23T09:07:10.000Z | ---
title: 'Risk policies: Azure Active Directory Identity Protection'
description: Enable and configure risk policies in Azure Active Directory Identity Protection
services: active-directory
ms.service: active-directory
ms.subservice: identity-protection
ms.topic: how-to
ms.date: 05/27/2021
ms.author: joflore
author: MicrosoftGuyJFlo
manager: karenhoran
ms.reviewer: sahandle
ms.collection: M365-identity-device-management
ms.openlocfilehash: 11751323d1341cbcde19451bc101197c7d714368
ms.sourcegitcommit: 0046757af1da267fc2f0e88617c633524883795f
ms.translationtype: HT
ms.contentlocale: es-ES
ms.lasthandoff: 08/13/2021
ms.locfileid: "121739978"
---
# <a name="how-to-configure-and-enable-risk-policies"></a>Instrucciones: Configuración y habilitación de directivas de riesgo
Como vimos en el artículo anterior, [Directivas de Identity Protection](concept-identity-protection-policies.md), tenemos dos directivas de riesgo que podemos habilitar en nuestro directorio.
- Directiva de riesgo de inicio de sesión
- Directiva de riesgo de usuario

Both policies work to automate the response to risk detections in your environment and allow users to self-remediate when risk is detected.
## <a name="choosing-acceptable-risk-levels"></a>Choosing acceptable risk levels
Organizations must decide the level of risk they are willing to accept, balancing user experience and security posture.
Microsoft's recommendation is to set the user risk policy threshold to **High** and the sign-in risk policy to **Medium and above**, and to allow self-remediation options. Choosing to block access rather than allowing self-remediation options, like password change and multi-factor authentication, will impact your users and administrators. Weigh this choice when configuring your policies.
Choosing a **High** threshold reduces the number of times a policy is triggered and minimizes the impact to users. However, it excludes **Low** and **Medium** risk detections from the policy, which may not block an attacker from exploiting a compromised identity. Selecting a **Low** threshold introduces more user interrupts.
Configured trusted [network locations](../conditional-access/location-condition.md) are used by Identity Protection in some risk detections to reduce false positives.
### <a name="risk-remediation"></a>Risk remediation
Organizations can choose to block access when risk is detected. Blocking sometimes stops legitimate users from doing what they need to. A better solution is to allow self-remediation using Azure AD multi-factor authentication (MFA) and self-service password reset (SSPR).
- When a user risk policy triggers:
- Administrators can require a secure password reset, requiring Azure AD MFA be done before the user creates a new password with SSPR, resetting the user risk.
- When a sign-in risk policy triggers:
- Azure AD MFA can be triggered, allowing users to prove it is them by using one of their registered authentication methods, resetting the sign-in risk.
> [!WARNING]
> Users must register for Azure AD MFA and SSPR before they face a situation requiring remediation. Users who are not registered are blocked and require administrator intervention.
>
> Password change (I know my password and want to change it to something new) outside of the user risk policy remediation flow does not meet the requirement for secure password reset.
## <a name="exclusions"></a>Exclusions
Policies allow for excluding users such as your [emergency access or break-glass administrator accounts](../roles/security-emergency-access.md). Organizations may need to exclude other accounts from specific policies based on how the accounts are used. Exclusions should be reviewed regularly to see if they are still applicable.
## <a name="enable-policies"></a>Enable policies
There are two locations where these policies may be configured: Conditional Access and Identity Protection. Configuration using Conditional Access policies is the preferred method and provides more context, including:
- Enhanced diagnostic data
- Report-only mode integration
- Graph API support (a hedged example request is sketched after this list)
- Use of more Conditional Access attributes in policy
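For instance, a sketch of what creating a sign-in risk policy through Microsoft Graph might look like. The property names follow the public conditionalAccessPolicy resource, but treat the exact payload as an assumption and check the Graph reference before relying on it:
```http
POST https://graph.microsoft.com/v1.0/identity/conditionalAccess/policies
Content-Type: application/json

{
  "displayName": "Sign-in risk: require MFA (sketch)",
  "state": "enabledForReportingButNotEnforced",
  "conditions": {
    "users": { "includeUsers": ["All"] },
    "applications": { "includeApplications": ["All"] },
    "signInRiskLevels": ["high", "medium"]
  },
  "grantControls": {
    "operator": "OR",
    "builtInControls": ["mfa"]
  }
}
```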
> [!VIDEO https://www.youtube.com/embed/zEsbbik-BTE]
### <a name="user-risk-with-conditional-access"></a>Riesgos del usuario con acceso condicional
1. Inicie sesión en **Azure Portal** como administrador global, administrador de seguridad o administrador de acceso condicional.
1. Vaya a **Azure Active Directory** > **Seguridad** > **Acceso condicional**.
1. Seleccione **Nueva directiva**.
1. Asigne un nombre a la directiva. Se recomienda que las organizaciones creen un estándar significativo para los nombres de sus directivas.
1. En **Asignaciones**, seleccione **Usuarios y grupos**.
1. En **Incluir**, seleccione **Todos los usuarios**.
1. En **Excluir**, seleccione **Usuarios y grupos** y, luego, elija las cuentas de acceso de emergencia de la organización.
1. Seleccione **Listo**.
1. En **Aplicaciones en la nube o acciones** > **Incluir**, seleccione **Todas las aplicaciones en la nube**.
1. En **Condiciones** > **Riesgo de usuario**, establezca **Configurar** en **Sí**. En **Configurar los niveles de riesgo de usuario necesarios para que se aplique la directiva**, seleccione **Alto** y después, **Listo**.
1. En **Controles de acceso** > **Conceder**, seleccione **Conceder acceso**, **Requerir cambio de contraseña** y **Seleccionar**.
1. Confirme la configuración y establezca **Habilitar directivas** en **Activado**.
1. Seleccione **Crear** para crear la directiva.
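For reference, the same kind of risk-based policy can also be created programmatically through the Microsoft Graph conditional access API, mentioned above under Graph API support. The following request is a minimal, illustrative sketch only (the display name is made up, and the exact payload should be checked against the Graph API reference for conditional access policies):

```http
POST https://graph.microsoft.com/v1.0/identity/conditionalAccess/policies
Content-Type: application/json

{
  "displayName": "User risk - require secure password change",
  "state": "enabled",
  "conditions": {
    "users": { "includeUsers": [ "All" ] },
    "applications": { "includeApplications": [ "All" ] },
    "userRiskLevels": [ "high" ]
  },
  "grantControls": {
    "operator": "AND",
    "builtInControls": [ "mfa", "passwordChange" ]
  }
}
```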
### <a name="sign-in-risk-with-conditional-access"></a>Riesgos del inicio de sesión con acceso condicional
1. Inicie sesión en **Azure Portal** como administrador global, administrador de seguridad o administrador de acceso condicional.
1. Vaya a **Azure Active Directory** > **Seguridad** > **Acceso condicional**.
1. Seleccione **Nueva directiva**.
1. Asigne un nombre a la directiva. Se recomienda que las organizaciones creen un estándar significativo para los nombres de sus directivas.
1. En **Asignaciones**, seleccione **Usuarios y grupos**.
1. En **Incluir**, seleccione **Todos los usuarios**.
1. En **Excluir**, seleccione **Usuarios y grupos** y, luego, elija las cuentas de acceso de emergencia de la organización.
1. Seleccione **Listo**.
1. En **Aplicaciones en la nube o acciones** > **Incluir**, seleccione **Todas las aplicaciones en la nube**.
1. En **Condiciones** > **Riesgo de inicio de sesión**, establezca **Configurar** en **Sí**. En **Seleccionar el nivel de riesgo de inicio de sesión, esta directiva se aplicará a**
1. Seleccione **Alto** y **Medio**.
1. Seleccione **Listo**.
1. En **Controles de acceso** > **Conceder**, seleccione **Conceder acceso**, **Requerir autenticación multifactor** y **Seleccionar**.
1. Confirme la configuración y establezca **Habilitar directiva** en **Activado**.
1. Seleccione **Crear** para crear la directiva.
## <a name="next-steps"></a>Pasos siguientes
- [Habilitación de la directiva de registro de Azure AD Multi-Factor Authentication](howto-identity-protection-configure-mfa-policy.md)
- [¿Qué es el riesgo?](concept-identity-protection-risks.md)
- [Investigación de detecciones de riesgos](howto-identity-protection-investigate-risk.md)
- [Simulación de detecciones de riesgos](howto-identity-protection-simulate-risk.md)
| 72.342105 | 451 | 0.777859 | spa_Latn | 0.98744 |
a67bfc8f0947855a89f248f4d9df4dbeb9dd977c | 1,272 | md | Markdown | labs/02/README.md | Marcoalpz/parallel-programming-lecture | 9eec6166b1defcde865bf6078e739af5c85727de | [
"MIT"
] | 1 | 2021-09-29T13:23:32.000Z | 2021-09-29T13:23:32.000Z | labs/02/README.md | Marcoalpz/parallel-programming-lecture | 9eec6166b1defcde865bf6078e739af5c85727de | [
"MIT"
] | 23 | 2021-10-13T14:13:50.000Z | 2021-12-04T20:11:22.000Z | labs/02/README.md | Marcoalpz/parallel-programming-lecture | 9eec6166b1defcde865bf6078e739af5c85727de | [
"MIT"
] | 14 | 2021-09-23T15:51:25.000Z | 2021-12-04T19:43:47.000Z | # Lab 02 instructions
## Objective
Make the students get familiar with the system calls using C as main programing
language. At the end of the practice students will understand
* How to use a system call from C code
* How to use a generic syscall and SYSNO from C code
* How to create a useful binary such as chmod/cat
# Requirements
* Linux machine, either a VM or a baremetal host
* GCC compiler (at least version 4.8)
* shell scripting
* git send-email installed and configured on your Linux machine
## Instructions
* Clone the repository
* Go to operating-systems-lecture/labs/02
* Reading the examples from operating-systems-lecture/labs/02/cat/ directory
and chmod seeing in class please create one for chown. chown() changes the
ownership of the file specified by path.
* git commit -s -m 'ITESMID-homework-02'
* git send-mail -1
## Expected result:
Create a code that does this:
```
./chown file.txt <usr>
```
## Please send the mail as git send mail:
```
$ git add chown.c
$ git commit -s -m <STUDENT-ID>-homework-02
$ git send-email -1
```
Do some tests by sending the mail to your personal account; if you receive the mail,
then you can be sure I will receive it too.
# Time to do the homework:
One week from the moment the mail is sent to students
| 24.461538 | 79 | 0.742925 | eng_Latn | 0.997323 |
a67c393aa41b8f1500722890bab15e8c0b79a7fb | 22,121 | md | Markdown | README.md | MixagonUI/build_soong | b07ae342e4dc62bbdfaf5dbbf09155986dcf7e14 | [
"Apache-2.0"
] | null | null | null | README.md | MixagonUI/build_soong | b07ae342e4dc62bbdfaf5dbbf09155986dcf7e14 | [
"Apache-2.0"
] | null | null | null | README.md | MixagonUI/build_soong | b07ae342e4dc62bbdfaf5dbbf09155986dcf7e14 | [
"Apache-2.0"
] | null | null | null | # Soong
Soong is the replacement for the old Android make-based build system. It
replaces Android.mk files with Android.bp files, which are JSON-like simple
declarative descriptions of modules to build.
See [Simple Build
Configuration](https://source.android.com/compatibility/tests/development/blueprints)
on source.android.com to read how Soong is configured for testing.
## Android.bp file format
By design, Android.bp files are very simple. There are no conditionals or
control flow statements - any complexity is handled in build logic written in
Go. The syntax and semantics of Android.bp files are intentionally similar
to [Bazel BUILD files](https://www.bazel.io/versions/master/docs/be/overview.html)
when possible.
### Modules
A module in an Android.bp file starts with a module type, followed by a set of
properties in `name: value,` format:
```
cc_binary {
name: "gzip",
srcs: ["src/test/minigzip.c"],
shared_libs: ["libz"],
stl: "none",
}
```
Every module must have a `name` property, and the value must be unique across
all Android.bp files.
The list of valid module types and their properties can be generated by calling
`m soong_docs`. It will be written to `$OUT_DIR/soong/docs/soong_build.html`.
This list for the current version of Soong can be found [here](https://ci.android.com/builds/latest/branches/aosp-build-tools/targets/linux/view/soong_build.html).
### File lists
Properties that take a list of files can also take glob patterns and output path
expansions.
* Glob patterns can contain the normal Unix wildcard `*`, for example `"*.java"`.
Glob patterns can also contain a single `**` wildcard as a path element, which
will match zero or more path elements. For example, `java/**/*.java` will match
`java/Main.java` and `java/com/android/Main.java`.
* Output path expansions take the format `:module` or `:module{.tag}`, where
`module` is the name of a module that produces output files, and it expands to
a list of those output files. With the optional `{.tag}` suffix, the module
may produce a different list of outputs according to `tag`.
For example, a `droiddoc` module with the name "my-docs" would return its
`.stubs.srcjar` output with `":my-docs"`, and its `.doc.zip` file with
`":my-docs{.doc.zip}"`.
This is commonly used to reference `filegroup` modules, whose output files
consist of their `srcs`.
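For instance, a hypothetical `filegroup` named `my-java-srcs` can collect files with a glob and then be consumed by another module through the output path expansion syntax:

```
filegroup {
    name: "my-java-srcs",
    srcs: ["java/**/*.java"],
}

java_library {
    name: "mylib",
    srcs: [":my-java-srcs"],
}
```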
### Variables
An Android.bp file may contain top-level variable assignments:
```
gzip_srcs = ["src/test/minigzip.c"],
cc_binary {
name: "gzip",
srcs: gzip_srcs,
shared_libs: ["libz"],
stl: "none",
}
```
Variables are scoped to the remainder of the file they are declared in, as well
as any child Android.bp files. Variables are immutable with one exception - they
can be appended to with a += assignment, but only before they have been
referenced.
### Comments
Android.bp files can contain C-style multiline `/* */` and C++ style single-line
`//` comments.
### Types
Variables and properties are strongly typed, variables dynamically based on the
first assignment, and properties statically by the module type. The supported
types are:
* Bool (`true` or `false`)
* Integers (`int`)
* Strings (`"string"`)
* Lists of strings (`["string1", "string2"]`)
* Maps (`{key1: "value1", key2: ["value2"]}`)
Maps may contain values of any type, including nested maps. Lists and maps may have
trailing commas after the last value.
Strings can contain double quotes using `\"`, for example `"cat \"a b\""`.
### Operators
Strings, lists of strings, and maps can be appended using the `+` operator.
Integers can be summed up using the `+` operator. Appending a map produces the
union of keys in both maps, appending the values of any keys that are present
in both maps.
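A short sketch of appending to variables (the extra file name is made up for illustration):

```
gzip_srcs = ["src/test/minigzip.c"]
gzip_srcs += ["src/test/minigzip_extra.c"] // list append, allowed before the first reference

full_name = "mini" + "gzip" // string concatenation
```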
### Defaults modules
A defaults module can be used to repeat the same properties in multiple modules.
For example:
```
cc_defaults {
name: "gzip_defaults",
shared_libs: ["libz"],
stl: "none",
}
cc_binary {
name: "gzip",
defaults: ["gzip_defaults"],
srcs: ["src/test/minigzip.c"],
}
```
### Packages
The build is organized into packages where each package is a collection of related files and a
specification of the dependencies among them in the form of modules.
A package is defined as a directory containing a file named `Android.bp`, residing beneath the
top-level directory in the build and its name is its path relative to the top-level directory. A
package includes all files in its directory, plus all subdirectories beneath it, except those which
themselves contain an `Android.bp` file.
The modules in a package's `Android.bp` and included files are part of the package.
For example, in the following directory tree (where `.../android/` is the top-level Android
directory) there are two packages, `my/app`, and the subpackage `my/app/tests`. Note that
`my/app/data` is not a package, but a directory belonging to package `my/app`.
.../android/my/app/Android.bp
.../android/my/app/app.cc
.../android/my/app/data/input.txt
.../android/my/app/tests/Android.bp
.../android/my/app/tests/test.cc
This is based on the Bazel package concept.
The `package` module type allows information to be specified about a package. Only a single
`package` module can be specified per package and in the case where there are multiple `.bp` files
in the same package directory it is highly recommended that the `package` module (if required) is
specified in the `Android.bp` file.
Unlike most module types, `package` does not have a `name` property. Instead, the name is set to the
name of the package, e.g. if the package is in `top/intermediate/package` then the package name is
`//top/intermediate/package`.
E.g. The following will set the default visibility for all the modules defined in the package and
any subpackages that do not set their own default visibility (irrespective of whether they are in
the same `.bp` file as the `package` module) to be visible to all the subpackages by default.
```
package {
    default_visibility: [":__subpackages__"],
}
```
### Referencing Modules
A module `libfoo` can be referenced by its name
```
cc_binary {
name: "app",
shared_libs: ["libfoo"],
}
```
Obviously, this works only if there is only one `libfoo` module in the source
tree. Ensuring such name uniqueness for larger trees may become problematic. We
might also want to use the same name in multiple mutually exclusive subtrees
(for example, implementing different devices) deliberately in order to describe
a functionally equivalent module. Enter Soong namespaces.
#### Namespaces
The presence of a `soong_namespace {..}` module in an Android.bp file defines a
**namespace**. For instance, having
```
soong_namespace {
...
}
...
```
in `device/google/bonito/Android.bp` informs Soong that within the
`device/google/bonito` package the module names are unique, that is, all the
modules defined in the Android.bp files in the `device/google/bonito/` tree have
unique names. However, there may be modules with the same names outside
`device/google/bonito` tree. Indeed, there is a module `"pixelstats-vendor"`
both in `device/google/bonito/pixelstats` and in
`device/google/coral/pixelstats`.
The name of a namespace is the path of its directory. The name of the namespace
in the example above is thus `device/google/bonito`.
An implicit **global namespace** corresponds to the source tree as a whole. It
has an empty name.
A module name's **scope** is the smallest namespace containing it. Suppose a
source tree has `device/my` and `device/my/display` namespaces. If `libfoo`
module is defined in `device/my/display/lib/Android.bp`, its namespace is
`device/my/display`.
The name uniqueness thus means that a module's name is unique within its scope. In
other words, "//_scope_:_name_" is globally unique module reference, e.g,
`"//device/google/bonito:pixelstats-vendor"`. _Note_ that the name of the
namespace for a module may be different from module's package name: `libfoo`
belongs to `device/my/display` namespace but is contained in
`device/my/display/lib` package.
#### Name Resolution
The form of a module reference determines how Soong locates the module.
For a **global reference** of the "//_scope_:_name_" form, Soong verifies there
is a namespace called "_scope_", then verifies it contains a "_name_" module and
uses it. Soong verifies there is only one "_name_" in "_scope_" at the beginning
when it parses Android.bp files.
A **local reference** has "_name_" form, and resolving it involves looking for a
module "_name_" in one or more namespaces. By default only the global namespace
is searched for "_name_" (in other words, only the modules not belonging to an
explicitly defined scope are considered). The `imports` attribute of the
`soong_namespace` module allows specifying where to look for modules. For instance,
with `device/google/bonito/Android.bp` containing
```
soong_namespace {
imports: [
"hardware/google/interfaces",
"hardware/google/pixel",
"hardware/qcom/bootctrl",
],
}
```
a reference to `"libpixelstats"` will resolve to the module defined in
`hardware/google/pixel/pixelstats/Android.bp` because this module is in
`hardware/google/pixel` namespace.
**TODO**: Conventionally, languages with similar concepts provide separate
constructs for namespace definition and name resolution (`namespace` and `using`
in C++, for instance). Should Soong do that, too?
#### Referencing modules in makefiles
While we are gradually converting makefiles to Android.bp files, Android build
is described by a mixture of Android.bp and Android.mk files, and a module
defined in an Android.mk file can reference a module defined in Android.bp file.
For instance, a binary still defined in an Android.mk file may have a library
defined in already converted Android.bp as a dependency.
A module defined in an Android.bp file and belonging to the global namespace can
be referenced from a makefile without additional effort. If a module belongs to
an explicit namespace, it can be referenced from a makefile only after the
name of the namespace has been added to the value of PRODUCT_SOONG_NAMESPACES
variable.
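For example, a product makefile would typically expose such a namespace like this (the path is illustrative):

```make
# In the product's .mk file
PRODUCT_SOONG_NAMESPACES += device/google/bonito
```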
Note that makefiles have no notion of namespaces and exposing namespaces with
the same modules via PRODUCT_SOONG_NAMESPACES may cause Make failure. For
instance, exposing both `device/google/bonito` and `device/google/coral`
namespaces will cause Make failure because it will see two targets for the
`pixelstats-vendor` module.
### Visibility
The `visibility` property on a module controls whether the module can be
used by other packages. Modules are always visible to other modules declared
in the same package. This is based on the Bazel visibility mechanism.
If specified the `visibility` property must contain at least one rule.
Each rule in the property must be in one of the following forms:
* `["//visibility:public"]`: Anyone can use this module.
* `["//visibility:private"]`: Only rules in the module's package (not its
subpackages) can use this module.
* `["//visibility:override"]`: Discards any rules inherited from defaults or a
creating module. Can only be used at the beginning of a list of visibility
rules.
* `["//some/package:__pkg__", "//other/package:__pkg__"]`: Only modules in
`some/package` and `other/package` (defined in `some/package/*.bp` and
`other/package/*.bp`) have access to this module. Note that sub-packages do not
have access to the rule; for example, `//some/package/foo:bar` or
`//other/package/testing:bla` wouldn't have access. `__pkg__` is a special
module and must be used verbatim. It represents all of the modules in the
package.
* `["//project:__subpackages__", "//other:__subpackages__"]`: Only modules in
packages `project` or `other` or in one of their sub-packages have access to
this module. For example, `//project:rule`, `//project/library:lib` or
`//other/testing/internal:munge` are allowed to depend on this rule (but not
`//independent:evil`)
* `["//project"]`: This is shorthand for `["//project:__pkg__"]`
* `[":__subpackages__"]`: This is shorthand for `["//project:__subpackages__"]`
where `//project` is the module's package, e.g. using `[":__subpackages__"]` in
`packages/apps/Settings/Android.bp` is equivalent to
`//packages/apps/Settings:__subpackages__`.
* `["//visibility:legacy_public"]`: The default visibility, behaves as
`//visibility:public` for now. It is an error if it is used in a module.
The visibility rules of `//visibility:public` and `//visibility:private` cannot
be combined with any other visibility specifications, except
`//visibility:public` is allowed to override visibility specifications imported
through the `defaults` property.
Packages outside `vendor/` cannot make themselves visible to specific packages
in `vendor/`, e.g. a module in `libcore` cannot declare that it is visible to
say `vendor/google`, instead it must make itself visible to all packages within
`vendor/` using `//vendor:__subpackages__`.
If a module does not specify the `visibility` property then it uses the
`default_visibility` property of the `package` module in the module's package.
If the `default_visibility` property is not set for the module's package then
it will use the `default_visibility` of its closest ancestor package for which
a `default_visibility` property is specified.
If no `default_visibility` property can be found then the module uses the
global default of `//visibility:legacy_public`.
The `visibility` property has no effect on a defaults module although it does
apply to any non-defaults module that uses it. To set the visibility of a
defaults module, use the `defaults_visibility` property on the defaults module;
not to be confused with the `default_visibility` property on the package module.
Once the build has been completely switched over to soong it is possible that a
global refactoring will be done to change this to `//visibility:private` at
which point all packages that do not currently specify a `default_visibility`
property will be updated to have
`default_visibility = [//visibility:legacy_public]` added. It will then be the
owner's responsibility to replace that with a more appropriate visibility.
### Formatter
Soong includes a canonical formatter for Android.bp files, similar to
[gofmt](https://golang.org/cmd/gofmt/). To recursively reformat all Android.bp files
in the current directory:
```
bpfmt -w .
```
The canonical format includes 4 space indents, newlines after every element of a
multi-element list, and always includes a trailing comma in lists and maps.
### Convert Android.mk files
Soong includes a tool perform a first pass at converting Android.mk files
to Android.bp files:
```
androidmk Android.mk > Android.bp
```
The tool converts variables, modules, comments, and some conditionals, but any
custom Makefile rules, complex conditionals or extra includes must be converted
by hand.
#### Differences between Android.mk and Android.bp
* Android.mk files often have multiple modules with the same name (for example
for static and shared version of a library, or for host and device versions).
Android.bp files require unique names for every module, but a single module can
be built in multiple variants, for example by adding `host_supported: true`.
The androidmk converter will produce multiple conflicting modules, which must
be resolved by hand to a single module with any differences inside
`target: { android: { }, host: { } }` blocks.
### Conditionals
Soong deliberately does not support most conditionals in Android.bp files. We
suggest removing most conditionals from the build. See
[Best Practices](docs/best_practices.md#removing-conditionals) for some
examples on how to remove conditionals.
Most conditionals supported natively by Soong are converted to a map
property. When building the module one of the properties in the map will be
selected, and its values appended to the property with the same name at the
top level of the module.
For example, to support architecture specific files:
```
cc_library {
...
srcs: ["generic.cpp"],
arch: {
arm: {
srcs: ["arm.cpp"],
},
x86: {
srcs: ["x86.cpp"],
},
},
}
```
When building the module for arm the `generic.cpp` and `arm.cpp` sources will
be built. When building for x86 the `generic.cpp` and 'x86.cpp' sources will
be built.
#### Soong Config Variables
When converting vendor modules that contain conditionals, simple conditionals
can be supported through Soong config variables using `soong_config_*`
modules that describe the module types, variables and possible values:
```
soong_config_module_type {
name: "acme_cc_defaults",
module_type: "cc_defaults",
config_namespace: "acme",
variables: ["board"],
bool_variables: ["feature"],
value_variables: ["width"],
properties: ["cflags", "srcs"],
}
soong_config_string_variable {
name: "board",
values: ["soc_a", "soc_b", "soc_c"],
}
```
This example describes a new `acme_cc_defaults` module type that extends the
`cc_defaults` module type, with three additional conditionals based on
variables `board`, `feature` and `width`, which can affect properties `cflags`
and `srcs`. Additionally, each conditional will contain a `conditions_default`
property that can affect `cflags` and `srcs` in the following conditions:
* bool variable (e.g. `feature`): the variable is unspecified or not set to a true value
* value variable (e.g. `width`): the variable is unspecified
* string variable (e.g. `board`): the variable is unspecified or the variable is set to a string unused in the
given module. For example, with `board`, if the `board`
conditional contains the properties `soc_a` and `conditions_default`, when
board=soc_b, the `cflags` and `srcs` values under `conditions_default` will be
used. To specify that no properties should be amended for `soc_b`, you can set
`soc_b: {},`.
The values of the variables can be set from a product's `BoardConfig.mk` file:
```
$(call add_soong_config_namespace, acme)
$(call add_soong_config_var_value, acme, board, soc_a)
$(call add_soong_config_var_value, acme, feature, true)
$(call add_soong_config_var_value, acme, width, 200)
```
The `acme_cc_defaults` module type can be used anywhere after the definition in
the file where it is defined, or can be imported into another file with:
```
soong_config_module_type_import {
from: "device/acme/Android.bp",
module_types: ["acme_cc_defaults"],
}
```
It can be used like any other module type:
```
acme_cc_defaults {
name: "acme_defaults",
cflags: ["-DGENERIC"],
soong_config_variables: {
board: {
soc_a: {
cflags: ["-DSOC_A"],
},
soc_b: {
cflags: ["-DSOC_B"],
},
conditions_default: {
cflags: ["-DSOC_DEFAULT"],
},
},
feature: {
cflags: ["-DFEATURE"],
conditions_default: {
cflags: ["-DFEATURE_DEFAULT"],
},
},
width: {
cflags: ["-DWIDTH=%s"],
conditions_default: {
cflags: ["-DWIDTH=DEFAULT"],
},
},
},
}
cc_library {
name: "libacme_foo",
defaults: ["acme_defaults"],
srcs: ["*.cpp"],
}
```
With the `BoardConfig.mk` snippet above, `libacme_foo` would build with
`cflags: "-DGENERIC -DSOC_A -DFEATURE -DWIDTH=200"`.
Alternatively, with `DefaultBoardConfig.mk`:
```
SOONG_CONFIG_NAMESPACES += acme
SOONG_CONFIG_acme += \
board \
feature \
width \
SOONG_CONFIG_acme_feature := false
```
then `libacme_foo` would build with `cflags: "-DGENERIC -DSOC_DEFAULT -DFEATURE_DEFAULT -DWIDTH=DEFAULT"`.
Alternatively, with `DefaultBoardConfig.mk`:
```
SOONG_CONFIG_NAMESPACES += acme
SOONG_CONFIG_acme += \
board \
feature \
width \
SOONG_CONFIG_acme_board := soc_c
```
then `libacme_foo` would build with `cflags: "-DGENERIC -DSOC_DEFAULT
-DFEATURE_DEFAULT -DWIDTH=DEFAULT"`.
`soong_config_module_type` modules will work best when used to wrap defaults
modules (`cc_defaults`, `java_defaults`, etc.), which can then be referenced
by all of the vendor's other modules using the normal namespace and visibility
rules.
## Build logic
The build logic is written in Go using the
[blueprint](http://godoc.org/github.com/google/blueprint) framework. Build
logic receives module definitions parsed into Go structures using reflection
and produces build rules. The build rules are collected by blueprint and
written to a [ninja](http://ninja-build.org) build file.
## Other documentation
* [Best Practices](docs/best_practices.md)
* [Build Performance](docs/perf.md)
* [Generating CLion Projects](docs/clion.md)
* [Generating YouCompleteMe/VSCode compile\_commands.json file](docs/compdb.md)
* Make-specific documentation: [build/make/README.md](https://android.googlesource.com/platform/build/+/master/README.md)
## Developing for Soong
To load Soong code in a Go-aware IDE, create a directory outside your android tree and then:
```bash
apt install bindfs
export GOPATH=<path to the directory you created>
build/soong/scripts/setup_go_workspace_for_soong.sh
```
This will bind mount the Soong source directories into the directory in the layout expected by
the IDE.
### Running Soong in a debugger
To run the soong_build process in a debugger, install `dlv` and then start the build with
`SOONG_DELVE=<listen addr>` in the environment.
For example:
```bash
SOONG_DELVE=:1234 m nothing
```
and then in another terminal:
```
dlv connect :1234
```
If you see an error:
```
Could not attach to pid 593: this could be caused by a kernel
security setting, try writing "0" to /proc/sys/kernel/yama/ptrace_scope
```
you can temporarily disable
[Yama's ptrace protection](https://www.kernel.org/doc/Documentation/security/Yama.txt)
using:
```bash
sudo sysctl -w kernel.yama.ptrace_scope=0
```
## Contact
Email [email protected] (external) for any questions, or see
[go/soong](http://go/soong) (internal).
| 36.684909 | 163 | 0.740473 | eng_Latn | 0.994597 |
a67c42d0b55d8d5dcac4f9a346bf4c6dfaca03bd | 1,118 | md | Markdown | README.md | askalione/RisReader | 87e0312324a7d7ec8e80dac7597bc44ef5cc5ec0 | [
"MIT"
] | null | null | null | README.md | askalione/RisReader | 87e0312324a7d7ec8e80dac7597bc44ef5cc5ec0 | [
"MIT"
] | null | null | null | README.md | askalione/RisReader | 87e0312324a7d7ec8e80dac7597bc44ef5cc5ec0 | [
"MIT"
] | null | null | null | # RisHelper
[](https://ci.appveyor.com/project/askalione/rishelper)
[](https://github.com/askalione/RisHelper/blob/master/LICENSE)
RisHelper is a .NET reader/writer of [RIS](https://en.wikipedia.org/wiki/RIS_(file_format)) reference files.
## Nuget
| Component name | NuGet | Downloads |
| --- | --- | --- |
| RisHelper | [](https://www.nuget.org/packages/RisHelper/) | [](https://www.nuget.org/stats/packages/RisHelper?groupby=Version) |
## Usage
```
var path = "./data/";
// Read records from file
var records = RisReader.Read(Path.Combine(path, "sample-read.ris"));
// Write records to file
RisWriter.Write(records, Path.Combine(path, "sample-write.ris"));
```
## License
RisHelper is open source, licensed under the [MIT License](https://github.com/askalione/RisHelper/blob/master/LICENSE). | 39.928571 | 255 | 0.72898 | yue_Hant | 0.223135 |
a67c66662cdb4e09776e0de9ecff08bfb922edc6 | 2,671 | md | Markdown | README.md | emipa606/MultipleRaids | 7dce4092c2172964be39b023327f138dea4f9221 | [
"MIT"
] | null | null | null | README.md | emipa606/MultipleRaids | 7dce4092c2172964be39b023327f138dea4f9221 | [
"MIT"
] | null | null | null | README.md | emipa606/MultipleRaids | 7dce4092c2172964be39b023327f138dea4f9221 | [
"MIT"
] | null | null | null | # MultipleRaids

Update of MemeTurtles mod
https://steamcommunity.com/sharedfiles/filedetails/?id=1206031733

- https://invite.gg/Mlie
- https://github.com/emipa606/MultipleRaids

If you feel that fighting one raid at a time is too easy and want some real challenge, then this mod is for you. Fight 2, 3 or as many as you like raids at the same time. Show those raiders who owns the land! Or die trying.
Does not affect the frequency of raids or other events.
The mod patches the class handler for RaidEnemy incident.
Spawned raids will have varying tactics and spawn locations.
The mod should be compatible with existing games and any storyteller, provided this mod is loaded last.
Mod has several variables to adjust:
1) Extra Raids[0,inf) - defines maximum number of additional raids to spawn. The more raids spawn the fewer points each raid has.
2) Spawn Threshold[0,100] - chance for the current raid to be spawned. If current raid is not spawned then no more additional raids will spawn. For example, 50% threshold means that you can have 1 extra raid at 50% chance, 2 raids at 25%, 3 raids and 12.5%
3) Points offset[0, inf) - if more than one raid spawn, raid points are rescaled with this formula: points = points*(1/totalNum + pointsOffset).
4) Force Desperate[true, false] - removes safety check from faction selection. Allows to have regular raiders and tribesman on extreme dessert/ice maps.
5) Force Raid Type[true, false] - forces every 3d raid to drop pod in center, every 4th to siege. Faction restrictions apply(tribesmen will never drop or siege).
6) Random Factions[true, false] - Allows spawned raids to belong to different factions.
-----
I have added the source code to the download. Feel free to modify and reupload it as long as you reference the original work.

- See if the error persists if you just have this mod and its requirements active.
- If not, try adding your other mods until it happens again.
- Post your error-log using https://steamcommunity.com/workshop/filedetails/?id=818773962]HugsLib and command Ctrl+F12
- For best support, please use the Discord-channel for error-reporting.
- Do not report errors by making a discussion-thread, I get no notification of that.
- If you have the solution for a problem, please post it to the GitHub repository.
| 46.051724 | 257 | 0.743542 | eng_Latn | 0.992185 |
a67cab3f843ffbe0f3208d58158ac22b0f85fc5c | 394 | md | Markdown | 01-Basics/08-Optional.md | sunjinshuai/Advanced-Swift | b2ead16c583112bf7c7b55b59c22b8c8e8b88531 | [
"MIT"
] | null | null | null | 01-Basics/08-Optional.md | sunjinshuai/Advanced-Swift | b2ead16c583112bf7c7b55b59c22b8c8e8b88531 | [
"MIT"
] | null | null | null | 01-Basics/08-Optional.md | sunjinshuai/Advanced-Swift | b2ead16c583112bf7c7b55b59c22b8c8e8b88531 | [
"MIT"
] | null | null | null | ### Swift
When you declare a variable in Swift, it is non-optional by default; that is, the variable must be given a non-nil value. If you try to assign nil to a non-optional variable, the compiler reports an error.
** In Swift, when you declare a property on a class, the property is also non-optional by default. **
** In Swift, to declare a property without an initial value, you append a `?` operator after the type in its declaration. **
** Swift's optional types strengthen checking for missing values and surface possible errors to the developer at compile time. **
** Swift's nil is not the same as nil in Objective-C. In Objective-C, nil is a pointer to a nonexistent object. In Swift, nil is not a pointer; it is a definite value used to represent a missing value. An optional of any type can be set to nil, not just object types. **
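A small Swift sketch illustrating these points (the class and property names are made up):

```swift
class Person {
    var name = "Ada"          // non-optional: must always hold a value
    var nickname: String?     // optional: may be nil, meaning "no value"
}

let p = Person()
// p.name = nil              // compile-time error: non-optional cannot hold nil
p.nickname = "Lovelace"

if let nick = p.nickname {    // optional binding safely unwraps the value
    print("Hello, \(nick)")
}
```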
| 49.25 | 147 | 0.786802 | zho_Hans | 0.871245 |
a67d296901d1737b1602ce5a2acc8fddf43a9203 | 18 | md | Markdown | README.md | dmitrorlov/oauth2-token-app | 7f65f64f608c97dbf0f222ddf33cda0721b61b24 | [
"MIT"
] | null | null | null | README.md | dmitrorlov/oauth2-token-app | 7f65f64f608c97dbf0f222ddf33cda0721b61b24 | [
"MIT"
] | null | null | null | README.md | dmitrorlov/oauth2-token-app | 7f65f64f608c97dbf0f222ddf33cda0721b61b24 | [
"MIT"
] | null | null | null | # oauth2-token-app | 18 | 18 | 0.777778 | vie_Latn | 0.465607 |
a67dbcf3eaaa6d110fcafabc7cf192c483c57757 | 7,752 | md | Markdown | docs/framework/data/adonet/connection-string-builders.md | felpasl/docs.pt-br | 1b47adcbc2e400f937650f9de1cd0c511e80738e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/framework/data/adonet/connection-string-builders.md | felpasl/docs.pt-br | 1b47adcbc2e400f937650f9de1cd0c511e80738e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/framework/data/adonet/connection-string-builders.md | felpasl/docs.pt-br | 1b47adcbc2e400f937650f9de1cd0c511e80738e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Connection string builders
ms.date: 03/30/2017
dev_langs:
- csharp
- vb
ms.assetid: 8434b608-c4d3-43d3-8ae3-6d8c6b726759
ms.openlocfilehash: ab72fe5a22ca88b33a93d94d4b5e16bbc470a4da
ms.sourcegitcommit: 6b308cf6d627d78ee36dbbae8972a310ac7fd6c8
ms.translationtype: MT
ms.contentlocale: pt-BR
ms.lasthandoff: 01/23/2019
ms.locfileid: "54733214"
---
# <a name="connection-string-builders"></a>Connection string builders
In earlier versions of [!INCLUDE[vstecado](../../../../includes/vstecado-md.md)], compile-time checking of connection strings made up of concatenated string values did not occur, so that at run time an incorrect keyword generated an <xref:System.ArgumentException>. Each of the [!INCLUDE[dnprdnshort](../../../../includes/dnprdnshort-md.md)] data providers supported different syntax for connection string keywords, which made constructing valid connection strings by hand difficult. To address this problem, [!INCLUDE[vstecado](../../../../includes/vstecado-md.md)] 2.0 introduced new connection string builders for each [!INCLUDE[dnprdnshort](../../../../includes/dnprdnshort-md.md)] data provider. Each data provider includes a strongly typed connection string builder class that inherits from <xref:System.Data.Common.DbConnectionStringBuilder>. The following table lists the [!INCLUDE[dnprdnshort](../../../../includes/dnprdnshort-md.md)] data providers and their corresponding connection string builder classes.
|Provider|ConnectionStringBuilder class|
|--------------|-----------------------------------|
|<xref:System.Data.SqlClient>|<xref:System.Data.SqlClient.SqlConnectionStringBuilder?displayProperty=nameWithType>|
|<xref:System.Data.OleDb>|<xref:System.Data.OleDb.OleDbConnectionStringBuilder?displayProperty=nameWithType>|
|<xref:System.Data.Odbc>|<xref:System.Data.Odbc.OdbcConnectionStringBuilder?displayProperty=nameWithType>|
|<xref:System.Data.OracleClient>|<xref:System.Data.OracleClient.OracleConnectionStringBuilder?displayProperty=nameWithType>|
## <a name="connection-string-injection-attacks"></a>Ataques de injeção de cadeias de conexão
Um ataque de injeção de cadeia de conexão pode ocorrer quando a concatenação de cadeias dinâmicas é usada para criar cadeias de conexão com base na entrada do usuário. Se a cadeia de caracteres não for validada e o texto mal-intencionado ou os caracteres não forem escapados, um invasor poderá potencialmente acessar dados confidenciais ou outros recursos no servidor. Por exemplo, um invasor pode montar um ataque fornecendo um ponto e vírgula e acrescentando outro valor. A cadeia de conexão é analisada usando o algoritmo "o último vence", e a entrada hostil é substituída por um valor legítimo.
As classes de construtores de cadeias de conexão são criadas para eliminar hipóteses e proteger contra erros de sintaxe e vulnerabilidades à segurança. Elas fornecem métodos e propriedades que correspondem a pares chave-valor conhecidos e permitidos por cada provedor de dados. Cada classe mantém uma coleção fixa de sinônimos e pode converter um sinônimo no nome da chave conhecida correspondente. As verificações são executadas para pares chave-valor válidos e um par inválido gera uma exceção. Além disso, os valores injetados são tratados de maneira segura.
O exemplo a seguir demonstra como <xref:System.Data.SqlClient.SqlConnectionStringBuilder> trata um valor adicional inserido para a configuração `Initial Catalog`.
```vb
Dim builder As New System.Data.SqlClient.SqlConnectionStringBuilder
builder("Data Source") = "(local)"
builder("Integrated Security") = True
builder("Initial Catalog") = "AdventureWorks;NewValue=Bad"
Console.WriteLine(builder.ConnectionString)
```
```csharp
System.Data.SqlClient.SqlConnectionStringBuilder builder =
new System.Data.SqlClient.SqlConnectionStringBuilder();
builder["Data Source"] = "(local)";
builder["integrated Security"] = true;
builder["Initial Catalog"] = "AdventureWorks;NewValue=Bad";
Console.WriteLine(builder.ConnectionString);
```
The output shows that the <xref:System.Data.SqlClient.SqlConnectionStringBuilder> handled the extra value correctly, escaping it in double quotation marks instead of appending it to the connection string as a new key/value pair.
```
data source=(local);Integrated Security=True;
initial catalog="AdventureWorks;NewValue=Bad"
```
## <a name="building-connection-strings-from-configuration-files"></a>Construindo cadeias de conexão a partir de arquivos de configuração
Se determinados elementos de uma cadeia de conexão forem conhecidos antecipadamente, eles poderão ser armazenados em um arquivo de configuração e recuperados em tempo de execução para construir uma cadeia de conexão completa. Por exemplo, o nome do banco de dados pode ser conhecido com antecedência, mas não o nome do servidor. Ou, talvez, você deseje que um usuário forneça um nome e uma senha em tempo de execução, sem que possa injetar outros valores na cadeia de conexão.
Um dos construtores sobrecarregados de um construtor de cadeias de conexão obtém um <xref:System.String> como argumento, o que permite a você fornecer uma cadeia de conexão parcial que depois poderá ser concluída pela entrada do usuário. A cadeia de conexão parcial pode ser armazenada em um arquivo de configuração e recuperada em tempo de execução.
> [!NOTE]
> O namespace <xref:System.Configuration> permite acesso programático aos arquivos de configuração que usam <xref:System.Web.Configuration.WebConfigurationManager> para aplicativos Web e <xref:System.Configuration.ConfigurationManager> para aplicativos do Windows. Para obter mais informações sobre como trabalhar com cadeias de caracteres de conexão e arquivos de configuração, consulte [cadeias de caracteres de Conexão e arquivos de configuração](../../../../docs/framework/data/adonet/connection-strings-and-configuration-files.md).
### <a name="example"></a>Exemplo
Este exemplo demonstra como recuperar uma cadeia de conexão parcial de um arquivo de configuração e concluí-la definindo as propriedades <xref:System.Data.SqlClient.SqlConnectionStringBuilder.DataSource%2A>, <xref:System.Data.SqlClient.SqlConnectionStringBuilder.UserID%2A> e <xref:System.Data.SqlClient.SqlConnectionStringBuilder.Password%2A> do <xref:System.Data.SqlClient.SqlConnectionStringBuilder>. O arquivo de configuração é definido como a seguir.
```xml
<connectionStrings>
<clear/>
<add name="partialConnectString"
connectionString="Initial Catalog=Northwind;"
providerName="System.Data.SqlClient" />
</connectionStrings>
```
> [!NOTE]
> You must set a reference to `System.Configuration.dll` in your project for the code to run.
[!code-csharp[DataWorks SqlConnectionStringBuilder.UserNamePwd#1](../../../../samples/snippets/csharp/VS_Snippets_ADO.NET/DataWorks SqlConnectionStringBuilder.UserNamePwd/CS/source.cs#1)]
[!code-vb[DataWorks SqlConnectionStringBuilder.UserNamePwd#1](../../../../samples/snippets/visualbasic/VS_Snippets_ADO.NET/DataWorks SqlConnectionStringBuilder.UserNamePwd/VB/source.vb#1)]
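The referenced snippets are not shown inline here; a minimal C# sketch of the same idea (the connection string name matches the configuration file above, while the server and credential values are purely illustrative) might look like this:

```csharp
using System;
using System.Configuration;
using System.Data.SqlClient;

class Program
{
    static void Main()
    {
        // Retrieve the partial connection string stored in the configuration file.
        string partial = ConfigurationManager
            .ConnectionStrings["partialConnectString"].ConnectionString;

        // Complete it with values supplied at run time.
        var builder = new SqlConnectionStringBuilder(partial)
        {
            DataSource = "(local)",   // illustrative server name
            UserID = "user",          // e.g. collected from user input
            Password = "password"
        };

        Console.WriteLine(builder.ConnectionString);
    }
}
```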
## <a name="see-also"></a>Consulte também
- [Cadeia de Conexão](../../../../docs/framework/data/adonet/connection-strings.md)
- [Privacidade e segurança de dados](../../../../docs/framework/data/adonet/privacy-and-data-security.md)
- [ADO.NET Managed Providers and DataSet Developer Center](https://go.microsoft.com/fwlink/?LinkId=217917) (Central de desenvolvedores do DataSet e de provedores gerenciados do ADO.NET)
| 90.139535 | 1,152 | 0.782121 | por_Latn | 0.990239 |
a67df588376f500812646f734d14f29238246627 | 1,217 | md | Markdown | docs/README.md | korikosiki/eng | 05aafba19a031426e76b53d8c402fd909763c617 | [
"MIT"
] | null | null | null | docs/README.md | korikosiki/eng | 05aafba19a031426e76b53d8c402fd909763c617 | [
"MIT"
] | null | null | null | docs/README.md | korikosiki/eng | 05aafba19a031426e76b53d8c402fd909763c617 | [
"MIT"
] | 1 | 2022-02-01T17:41:40.000Z | 2022-02-01T17:41:40.000Z | <p align='center'><img src='https://raw.githubusercontent.com/SOONTOKEN/soontoken.github.io/main/img/logo.png' width='300'></p>
| [Main page](https://soontoken.github.io) | [Roadmap](/roadmap) | [Vesting](/vesting/) | [Tokenomics](https://docs.google.com/spreadsheets/d/1Jj3XlLC6MkDi6-cvHPL6PpJ5IXD96dJd0UckqNrWv-A/edit#gid=0) | [Crowdsale](/Crowdsale/) | [Audit](/audits/) | [docs](/docs/) |
# DOCS
* [Airdrop 107 SOON (13.12.2021)](https://docs.google.com/spreadsheets/d/1HQO8z9eEuyf0mHklOlqa9M0V3S0T4e0gGF_kD4dqvhU/edit)
* [Crowdsale (15-12-2021)](https://raw.githubusercontent.com/SOONTOKEN/soontoken.github.io/main/docs/15-12-2021.csv)
* [Crowdsale (16-12-2021)](https://raw.githubusercontent.com/SOONTOKEN/soontoken.github.io/main/docs/16-12-2021.csv)
* [Crowdsale (17-12-2021)](https://raw.githubusercontent.com/SOONTOKEN/soontoken.github.io/main/docs/17-12-2021.csv)
* [Crowdsale (18-12-2021)](https://raw.githubusercontent.com/SOONTOKEN/soontoken.github.io/main/docs/18-12-2021.csv)
* [Crowdsale (19-12-2021)](https://raw.githubusercontent.com/SOONTOKEN/soontoken.github.io/main/docs/19-12-2021.csv)
<p align='center'><img src='https://gramkit.org/everscale-branding-v1.0/logo/main.svg' width='100'></p>
| 81.133333 | 264 | 0.746919 | yue_Hant | 0.437791 |
a67e47ac5aac8c0b80f72d43d1531d7ef64e5a8a | 2,194 | md | Markdown | add/metadata/System.Windows.Controls.Primitives/DataGridColumnHeadersPresenter.meta.md | MarktW86/dotnet.docs | 178451aeae4e2c324aadd427ed6bf6850e483900 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | add/metadata/System.Windows.Controls.Primitives/DataGridColumnHeadersPresenter.meta.md | MarktW86/dotnet.docs | 178451aeae4e2c324aadd427ed6bf6850e483900 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | add/metadata/System.Windows.Controls.Primitives/DataGridColumnHeadersPresenter.meta.md | MarktW86/dotnet.docs | 178451aeae4e2c324aadd427ed6bf6850e483900 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
uid: System.Windows.Controls.Primitives.DataGridColumnHeadersPresenter
author: "stevehoag"
ms.author: "shoag"
manager: "wpickett"
---
---
uid: System.Windows.Controls.Primitives.DataGridColumnHeadersPresenter.ArrangeOverride(System.Windows.Size)
ms.author: "kempb"
manager: "ghogen"
---
---
uid: System.Windows.Controls.Primitives.DataGridColumnHeadersPresenter.#ctor
author: "stevehoag"
ms.author: "shoag"
manager: "wpickett"
---
---
uid: System.Windows.Controls.Primitives.DataGridColumnHeadersPresenter.MeasureOverride(System.Windows.Size)
author: "stevehoag"
ms.author: "shoag"
manager: "wpickett"
---
---
uid: System.Windows.Controls.Primitives.DataGridColumnHeadersPresenter.ClearContainerForItemOverride(System.Windows.DependencyObject,System.Object)
author: "stevehoag"
ms.author: "shoag"
manager: "wpickett"
---
---
uid: System.Windows.Controls.Primitives.DataGridColumnHeadersPresenter.OnApplyTemplate
author: "stevehoag"
ms.author: "shoag"
manager: "wpickett"
---
---
uid: System.Windows.Controls.Primitives.DataGridColumnHeadersPresenter.GetContainerForItemOverride
author: "stevehoag"
ms.author: "shoag"
manager: "wpickett"
---
---
uid: System.Windows.Controls.Primitives.DataGridColumnHeadersPresenter.GetLayoutClip(System.Windows.Size)
author: "stevehoag"
ms.author: "shoag"
manager: "wpickett"
---
---
uid: System.Windows.Controls.Primitives.DataGridColumnHeadersPresenter.VisualChildrenCount
author: "stevehoag"
ms.author: "shoag"
manager: "wpickett"
---
---
uid: System.Windows.Controls.Primitives.DataGridColumnHeadersPresenter.OnCreateAutomationPeer
author: "stevehoag"
ms.author: "shoag"
manager: "wpickett"
---
---
uid: System.Windows.Controls.Primitives.DataGridColumnHeadersPresenter.IsItemItsOwnContainerOverride(System.Object)
ms.author: "kempb"
manager: "ghogen"
---
---
uid: System.Windows.Controls.Primitives.DataGridColumnHeadersPresenter.GetVisualChild(System.Int32)
author: "stevehoag"
ms.author: "shoag"
manager: "wpickett"
---
---
uid: System.Windows.Controls.Primitives.DataGridColumnHeadersPresenter.PrepareContainerForItemOverride(System.Windows.DependencyObject,System.Object)
author: "stevehoag"
ms.author: "shoag"
manager: "wpickett"
---
| 24.651685 | 149 | 0.795807 | yue_Hant | 0.818014 |
a67e98081811de77e644bca029be01b60a53367f | 469 | md | Markdown | src/4-cargo.md | zzy/crate-guide | 9246b6eaa9334dcf2abf0d6b8bb2488542cc23d0 | [
"Apache-2.0",
"MIT"
] | 47 | 2020-12-18T08:37:12.000Z | 2022-03-21T03:30:44.000Z | src/4-cargo.md | zzy/rust-crate-guide | 9246b6eaa9334dcf2abf0d6b8bb2488542cc23d0 | [
"Apache-2.0",
"MIT"
] | null | null | null | src/4-cargo.md | zzy/rust-crate-guide | 9246b6eaa9334dcf2abf0d6b8bb2488542cc23d0 | [
"Apache-2.0",
"MIT"
] | 8 | 2021-02-15T12:26:32.000Z | 2022-02-15T08:31:49.000Z | # 4. The Cargo tool
Rust officially provides a very powerful build system and package manager, `Cargo`. Cargo can handle many tasks for you, such as downloading crate dependencies, compiling crates, building your code, producing distributable crates, and uploading them to crates.io, the Rust community's crate registry.
> A crate in Rust is similar to a "package" or "library" in other programming languages. For now, the convention is to leave the term untranslated.
Rust and Cargo are bundled together; in other words, Cargo is built into the Rust installer. So once Rust has been installed successfully, Cargo is installed along with it, and the environment variables are configured automatically.
> Note: this chapter only introduces the Cargo tool briefly, limited to creating, compiling, debugging, and running the examples in this book. The appendix chapter [24.2. Appendix 2: Cargo in depth](./24-appendix/24.2-cargo.md) covers Cargo further, and you can also consult the [Cargo documentation (Chinese translation)](https://cargo.budshome.com) for a comprehensive overview.
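A few representative commands showing the kinds of tasks Cargo takes care of (run from a shell):

```bash
cargo new hello_cargo   # create a new crate with a Cargo.toml and src/ directory
cargo build             # download dependencies and compile the crate
cargo run               # build (if needed) and run the resulting binary
cargo test              # run the crate's tests
cargo publish           # upload the crate to crates.io
```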
| 46.9 | 183 | 0.759062 | yue_Hant | 0.931005 |
a67eb4ff149f892397ad9a44a93972a6f2fa31a8 | 702 | md | Markdown | desktop-src/NetShare/what-s-new-in-network-share-management-in-windows-server-2008-r2-and-windows-7.md | dianmsft/win32 | f07b550595a83e44dd2fb6e217525edd10a0341b | [
"CC-BY-4.0",
"MIT"
] | 4 | 2021-07-26T16:18:49.000Z | 2022-02-19T02:00:21.000Z | desktop-src/NetShare/what-s-new-in-network-share-management-in-windows-server-2008-r2-and-windows-7.md | dianmsft/win32 | f07b550595a83e44dd2fb6e217525edd10a0341b | [
"CC-BY-4.0",
"MIT"
] | 2 | 2020-04-09T17:00:51.000Z | 2020-04-09T18:30:01.000Z | desktop-src/NetShare/what-s-new-in-network-share-management-in-windows-server-2008-r2-and-windows-7.md | dianmsft/win32 | f07b550595a83e44dd2fb6e217525edd10a0341b | [
"CC-BY-4.0",
"MIT"
] | 2 | 2020-07-19T02:58:48.000Z | 2021-03-06T21:09:47.000Z | ---
Description: Windows Server 2008 R2.
ms.assetid: 85b98a7e-7897-4bf2-b56f-37785261b9da
title: Whats New in Network Share Management in Windows Server 2008 R2 and Windows 7
ms.topic: article
ms.date: 05/31/2018
---
# What's New in Network Share Management in Windows Server 2008 R2 and Windows 7
Windows Server 2008 R2 and Windows 7 introduce the following changes to Network Share Management.
## Existing Network Share Management Structure Modifications
[**SHARE\_INFO\_1005**](/windows/desktop/api/Lmshare/ns-lmshare-share_info_1005) structure - Added **shi1005\_flags** values:

- **SHI1005\_FLAGS\_FORCE\_LEVELII\_OPLOCK**
- **SHI1005\_FLAGS\_ENABLE\_HASH**
| 26 | 172 | 0.75641 | yue_Hant | 0.316348 |
a67ebc9310dd7b9a3f2583b0986cc1cfc165e9c5 | 11,742 | md | Markdown | README.md | longwoo/crocoddyl | 8d54d237a9740d82391c93ecdc484fb70c3fef41 | [
"BSD-3-Clause"
] | null | null | null | README.md | longwoo/crocoddyl | 8d54d237a9740d82391c93ecdc484fb70c3fef41 | [
"BSD-3-Clause"
] | null | null | null | README.md | longwoo/crocoddyl | 8d54d237a9740d82391c93ecdc484fb70c3fef41 | [
"BSD-3-Clause"
] | null | null | null |
<img align="right" src="https://i.imgur.com/o2LfbDq.gif" width="25%"/>
Contact RObot COntrol by Differential DYnamic programming Library (crocoddyl)
===============================================
<table >
<tr>
<td align="left"><img src="https://cmastalli.github.io/assets/img/publications/highly_dynamic_maneuvers.png" width="10000"/></td>
<td align="right"><img src="https://i.imgur.com/RQR2Ovx.gif"/> <img src="https://i.imgur.com/kTW0ePh.gif"/></td>
</tr>
</table>
## <img align="center" height="20" src="https://i.imgur.com/vAYeCzC.png"/> Introduction
**[Crocoddyl](https://cmastalli.github.io/publications/crocoddyl20icra.html)** is an optimal control library for robot control under contact sequence.
Its solvers are based on novel and efficient Differential Dynamic Programming (DDP) algorithms.
**Crocoddyl** computes optimal trajectories along with optimal feedback gains.
It uses **[Pinocchio](https://github.com/stack-of-tasks/pinocchio)** for fast computation of robots dynamics and their analytical derivatives.
The source code is released under the [BSD 3-Clause license](LICENSE).
**Authors:** [Carlos Mastalli](https://cmastalli.github.io/) and [Rohan Budhiraja](https://scholar.google.com/citations?user=NW9Io9AAAAAJ) <br />
**Instructors:** Nicolas Mansard <br />
**With additional support from the Gepetto team at LAAS-CNRS and MEMMO project. For more details see Section Credits**
[](https://tldrlegal.com/license/bsd-3-clause-license-%28revised%29#fulltext)
[](https://travis-ci.org/loco-3d/crocoddyl)
[](https://gepgitlab.laas.fr/loco-3d/crocoddyl/pipelines?ref=devel)
[](https://gepettoweb.laas.fr/doc/loco-3d/crocoddyl/devel/coverage/)
[](https://gepgitlab.laas.fr/loco-3d/crocoddyl/-/tags)
[](https://img.shields.io/github/repo-size/loco-3d/crocoddyl)
[](https://github.com/loco-3d/crocoddyl/graphs/contributors)
[](https://img.shields.io/github/release-date/loco-3d/crocoddyl)
[](https://img.shields.io/github/last-commit/loco-3d/crocoddyl)
If you want to follow the current developments, you can directly refer to the [devel branch](https://gepgitlab.laas.fr/loco-3d/cddp/tree/devel).
## <img align="center" height="20" src="https://i.imgur.com/x1morBF.png"/> Installation
**Crocoddyl** can be easily installed on various Linux (Ubuntu, Fedora, etc.) and Unix distributions (Mac OS X, BSD, etc.).
## Crocoddyl features
**Crocoddyl** is versatile:
* various optimal control solvers (DDP, FDDP, BoxDDP, etc) - single and multi-shooting methods
* analytical and sparse derivatives via **[Pinocchio](https://github.com/stack-of-tasks/pinocchio)**
* Euclidian and non-Euclidian geometry friendly via **[Pinocchio](https://github.com/stack-of-tasks/pinocchio)**
* handle autonomous and nonautonomous dynamical systems
* numerical differentiation support
* automatic differentiation support
**Crocoddyl** is efficient and flexible:
* cache friendly,
* multi-thread friendly
* Python bindings (including models and solvers abstractions)
* C++ 98/11/14/17/20 compliant
* extensively tested
* automatic code generation support
### Installation through robotpkg
You can install this package through robotpkg. robotpkg is a package manager tailored for robotics software.
It greatly simplifies the release of new versions along with the management of their dependencies.
You just need to add the robotpkg apt repository to your sources.list and then use `sudo apt install robotpkg-py27-crocoddyl` (or `py3X` for python 3.X, depending on your system):
If you have never added robotpkg as a software repository, please first follow instructions 1 to 3; otherwise, go directly to instruction 4.
Those instructions are similar to the installation procedures presented in [http://robotpkg.openrobots.org/debian.html](http://robotpkg.openrobots.org/debian.html).
1. Add robotpkg as source repository to apt:
```bash
sudo tee /etc/apt/sources.list.d/robotpkg.list <<EOF
deb [arch=amd64] http://robotpkg.openrobots.org/wip/packages/debian/pub $(lsb_release -sc) robotpkg
deb [arch=amd64] http://robotpkg.openrobots.org/packages/debian/pub $(lsb_release -sc) robotpkg
EOF
```
2. Register the authentication certificate of robotpkg:
```bash
curl http://robotpkg.openrobots.org/packages/debian/robotpkg.key | sudo apt-key add -
```
3. You need to run at least once apt update to fetch the package descriptions:
```bash
sudo apt-get update
```
4. The installation of Crocoddyl:
```bash
sudo apt install robotpkg-py27-crocoddyl # for Python 2
sudo apt install robotpkg-py35-crocoddyl # for Python 3
```
Finally you will need to configure your environment variables, e.g.:
```bash
export PATH=/opt/openrobots/bin:$PATH
export PKG_CONFIG_PATH=/opt/openrobots/lib/pkgconfig:$PKG_CONFIG_PATH
export LD_LIBRARY_PATH=/opt/openrobots/lib:$LD_LIBRARY_PATH
export PYTHONPATH=/opt/openrobots/lib/python2.7/site-packages:$PYTHONPATH
```
### Building from source
**Crocoddyl** is a C++ library with Python bindings for versatile and fast prototyping. It has the following dependencies:
* [pinocchio](https://github.com/stack-of-tasks/pinocchio)
* [example-robot-data](https://gepgitlab.laas.fr/gepetto/example-robot-data) (optional for examples, install Python loaders)
* [gepetto-viewer-corba](https://github.com/Gepetto/gepetto-viewer-corba) (optional for display)
* [jupyter](https://jupyter.org/) (optional for notebooks)
* [matplotlib](https://matplotlib.org/) (optional for examples)
You can run examples, unit-tests and benchmarks from your build dir:
```bash
cd build
make test
make -s examples-quadrupedal_gaits INPUT="display plot" # enable display and plot
make -s benchmarks-cpp-quadrupedal_gaits INPUT="100 walk" # number of trials ; type of gait
```
Alternatively, to see the 3D result and/or graphs of the examples you run (through gepetto-viewer and matplotlib), you can use
```bash
export CROCODDYL_DISPLAY=1
export CROCODDYL_PLOT=1
```
After installation, you could run the examples as follows:
```bash
python -m crocoddyl.examples.quadrupedal_gaits "display" "plot" # enable display and plot
```
If you want to learn about Crocoddyl, take a look at the Jupyter notebooks. Start in the following order.
- [examples/notebooks/unicycle_towards_origin.ipynb](https://gepgitlab.laas.fr/loco-3d/crocoddyl/blob/devel/examples/notebooks/unicycle_towards_origin.ipynb)
- [examples/notebooks/cartpole_swing_up.ipynb](https://gepgitlab.laas.fr/loco-3d/crocoddyl/blob/devel/examples/notebooks/cartpole_swing_up.py)
- [examples/notebooks/arm_manipulation.ipynb](https://gepgitlab.laas.fr/loco-3d/crocoddyl/blob/devel/examples/notebooks/arm_manipulation.ipynb)
- [examples/notebooks/bipedal_walking.ipynb](https://gepgitlab.laas.fr/loco-3d/crocoddyl/blob/devel/examples/notebooks/bipedal_walking.ipynb)
- [examples/notebooks/introduction_to_crocoddyl.ipynb](https://gepgitlab.laas.fr/loco-3d/crocoddyl/blob/devel/examples/notebooks/introduction_to_crocoddyl.ipynb)
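As a quick taste of the Python API used in these notebooks, the unicycle problem can be set up and solved in a few lines. This is only a rough sketch (property names such as `costWeights` may differ slightly between Crocoddyl versions):

```python
import numpy as np
import crocoddyl

model = crocoddyl.ActionModelUnicycle()   # simple analytical action model
model.costWeights = np.array([10., 1.])   # state and control cost weights

x0 = np.array([-1., -1., 1.])             # initial state: x, y, heading
T = 20                                    # number of running nodes
problem = crocoddyl.ShootingProblem(x0, [model] * T, model)

ddp = crocoddyl.SolverDDP(problem)
ddp.solve()                               # returns True on convergence
print(ddp.xs[-1])                         # final state of the optimal trajectory
```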
## Citing Crocoddyl
To cite **Crocoddyl** in your academic research, please use the following bibtex lines:
```tex
@inproceedings{mastalli20crocoddyl,
author={Mastalli, Carlos and Budhiraja, Rohan and Merkt, Wolfgang and Saurel, Guilhem and Hammoud, Bilal
and Naveau, Maximilien and Carpentier, Justin, Righetti, Ludovic and Vijayakumar, Sethu and Mansard, Nicolas},
title={{Crocoddyl: An Efficient and Versatile Framework for Multi-Contact Optimal Control}},
booktitle = {IEEE International Conference on Robotics and Automation (ICRA)},
year={2020}
}
```
and the following one to reference this website:
```tex
@misc{crocoddylweb,
author = {Carlos Mastalli, Rohan Budhiraja and Nicolas Mansard and others},
title = {Crocoddyl: a fast and flexible optimal control library for robot control under contact sequence},
howpublished = {https://gepgitlab.laas.fr/loco-3d/crocoddyl/wikis/home},
year = {2019}
}
```
The rest of the publications describes different component of **Crocoddyl**:
### Publications
- C. Mastalli et al. [Crocoddyl: An Efficient and Versatile Framework for Multi-Contact Optimal Control](https://cmastalli.github.io/publications/crocoddyl20icra.html), IEEE International Conference on Robotics and Automation (ICRA), 2020
- R. Budhiraja, J. Carpentier, C. Mastalli and N. Mansard. [Differential Dynamic Programming for Multi-Phase Rigid Contact Dynamics](https://cmastalli.github.io/publications/mddp18.html), IEEE RAS International Conference on Humanoid Robots (ICHR), 2018
- Y. Tassa, N. Mansard, E. Todorov. [Control-Limited Differential Dynamic Programming](https://homes.cs.washington.edu/~todorov/papers/TassaICRA14.pdf), IEEE International Conference on Automation and Robotics (ICRA), 2014
- R. Budhiraja, J. Carpentier and N. Mansard. [Dynamics Consensus between Centroidal and Whole-Body Models for Locomotion of Legged Robots](https://hal.laas.fr/hal-01875031/document), IEEE International Conference on Automation and Robotics (ICRA), 2019
- T. G. Lembono, C. Mastalli, P. Fernbach, N. Mansard and S. Calinon. [Learning How to Walk: Warm-starting Optimal Control Solver with Memory of Motion](https://arxiv.org/abs/2001.11751), IEEE International Conference on Robotics and Automation (ICRA), 2020
## Questions and Issues
You have a question or an issue? You may either directly open a [new issue](https://gepgitlab.laas.fr/loco-3d/crocoddyl/issues) or use the mailing list <[email protected]>.
## Credits
The following people have been involved in the development of **Crocoddyl**:
- [Carlos Mastalli](https://cmastalli.github.io/) (University of Edinburgh): main developer and manager of the project
- [Nicolas Mansard](http://projects.laas.fr/gepetto/index.php/Members/NicolasMansard) (LAAS-CNRS): project instructor
- [Rohan Budhiraja](https://scholar.google.com/citations?user=NW9Io9AAAAAJ) (LAAS-CNRS): core development and features extension
- [Justin Carpentier](https://jcarpent.github.io/) (INRIA): efficient analytical rigid-body dynamics derivatives
- [Maximilien Naveau](https://scholar.google.fr/citations?user=y_-cGlUAAAAJ&hl=fr) (MPI): unit-test support
- [Guilhem Saurel](http://projects.laas.fr/gepetto/index.php/Members/GuilhemSaurel) (LAAS-CNRS): continuous integration and deployment
- [Wolfgang Merkt](http://www.wolfgangmerkt.com/research/) (University of Oxford): feature extension and debugging
- [Josep Martí Saumell](https://www.iri.upc.edu/staff/jmarti) (IRI: CSIC-UPC): feature extension
- [Bilal Hammoud](https://scholar.google.com/citations?hl=en&user=h_4NKpsAAAAJ) (MPI): features extension
## Acknowledgments
The development of **Crocoddyl** is supported by the [EU MEMMO project](http://www.memmo-project.eu/), and the [EU RoboCom++ project](http://robocomplusplus.eu/).
It is maintained by the [Gepetto team](http://projects.laas.fr/gepetto/) [@LAAS-CNRS](http://www.laas.fr), and the [Statistical Machine Learning and Motor Control Group](http://wcms.inf.ed.ac.uk/ipab/slmc) [@University of Edinburgh](https://www.edinburgh-robotics.org/).
| 55.649289 | 270 | 0.768268 | eng_Latn | 0.615631 |
a67ecbf7bc97240c9b1c7e6cdfe94a8333f551fe | 258 | md | Markdown | content/scp-facil.md | sidneiweber/website | 2fa12baebc7ba6ee8399cd86da2e5a675d28dc66 | [
"MIT"
] | null | null | null | content/scp-facil.md | sidneiweber/website | 2fa12baebc7ba6ee8399cd86da2e5a675d28dc66 | [
"MIT"
] | null | null | null | content/scp-facil.md | sidneiweber/website | 2fa12baebc7ba6ee8399cd86da2e5a675d28dc66 | [
"MIT"
] | null | null | null | ---
layout: page
title: SCP Fácil
permalink: /scp-facil/
weight: 2
---
A project I made to make it easier to send files from one machine to another with SCP, using YAD.

[SCP Fácil](https://github.com/emmilinux/scpfacil.git) | 21.5 | 99 | 0.717054 | por_Latn | 0.900423 |
a67f36d61101fa5f944d62748624d37c92668fc0 | 2,535 | md | Markdown | doc/ersip_registrar_binding.md | PolinaMityakina/ersip | 562456882d16010f6183d5ef86a129a9494ab5f7 | [
"MIT"
] | 102 | 2017-12-18T18:30:15.000Z | 2022-02-08T00:24:22.000Z | doc/ersip_registrar_binding.md | PolinaMityakina/ersip | 562456882d16010f6183d5ef86a129a9494ab5f7 | [
"MIT"
] | 53 | 2018-04-22T07:56:25.000Z | 2022-01-17T21:12:33.000Z | doc/ersip_registrar_binding.md | PolinaMityakina/ersip | 562456882d16010f6183d5ef86a129a9494ab5f7 | [
"MIT"
] | 31 | 2018-07-11T13:07:43.000Z | 2022-01-17T18:49:18.000Z |
# Module ersip_registrar_binding #
* [Data Types](#types)
* [Function Index](#index)
* [Function Details](#functions)
<a name="types"></a>
## Data Types ##
### <a name="type-binding">binding()</a> ###
<pre><code>
binding() = #binding{contact = <a href="ersip_hdr_contact.md#type-contact">ersip_hdr_contact:contact()</a>, callid = <a href="ersip_hdr_callid.md#type-callid">ersip_hdr_callid:callid()</a>, cseq = <a href="ersip_hdr_cseq.md#type-cseq_num">ersip_hdr_cseq:cseq_num()</a>, expires = non_neg_integer()}
</code></pre>
<a name="index"></a>
## Function Index ##
<table width="100%" border="1" cellspacing="0" cellpadding="2" summary="function index"><tr><td valign="top"><a href="#callid_cseq-1">callid_cseq/1</a></td><td></td></tr><tr><td valign="top"><a href="#contact-1">contact/1</a></td><td></td></tr><tr><td valign="top"><a href="#contact_key-1">contact_key/1</a></td><td></td></tr><tr><td valign="top"><a href="#new-4">new/4</a></td><td></td></tr><tr><td valign="top"><a href="#update-4">update/4</a></td><td></td></tr></table>
<a name="functions"></a>
## Function Details ##
<a name="callid_cseq-1"></a>
### callid_cseq/1 ###
<pre><code>
callid_cseq(Binding::<a href="#type-binding">binding()</a>) -> {<a href="ersip_hdr_callid.md#type-callid">ersip_hdr_callid:callid()</a>, <a href="ersip_hdr_cseq.md#type-cseq_num">ersip_hdr_cseq:cseq_num()</a>}
</code></pre>
<br />
<a name="contact-1"></a>
### contact/1 ###
<pre><code>
contact(Binding::<a href="#type-binding">binding()</a>) -> <a href="ersip_hdr_contact.md#type-contact">ersip_hdr_contact:contact()</a>
</code></pre>
<br />
<a name="contact_key-1"></a>
### contact_key/1 ###
<pre><code>
contact_key(Binding::<a href="#type-binding">binding()</a>) -> <a href="ersip_uri.md#type-uri">ersip_uri:uri()</a>
</code></pre>
<br />
<a name="new-4"></a>
### new/4 ###
<pre><code>
new(CallId::<a href="ersip_hdr_callid.md#type-callid">ersip_hdr_callid:callid()</a>, CSeqNum::<a href="ersip_hdr_cseq.md#type-cseq_num">ersip_hdr_cseq:cseq_num()</a>, Contact::<a href="ersip_hdr_contact.md#type-contact">ersip_hdr_contact:contact()</a>, Exp::non_neg_integer()) -> <a href="#type-binding">binding()</a>
</code></pre>
<br />
<a name="update-4"></a>
### update/4 ###
<pre><code>
update(NewExpiration::pos_integer(), NewCallId::<a href="ersip_hdr_callid.md#type-callid">ersip_hdr_callid:callid()</a>, NewCSeq::pos_integer(), Binding::<a href="#type-binding">binding()</a>) -> <a href="#type-binding">binding()</a>
</code></pre>
<br />
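## Usage sketch ##
The following is a hypothetical usage sketch, not generated from the module itself: it only calls the functions documented above, and it assumes that the `CallId` and `Contact` values have already been built elsewhere with the `ersip_hdr_callid` and `ersip_hdr_contact` modules.
<pre><code>
%% Create a binding that expires in 3600 seconds and key it by its contact URI.
register_contact(CallId, CSeqNum, Contact) ->
    Binding = ersip_registrar_binding:new(CallId, CSeqNum, Contact, 3600),
    Key = ersip_registrar_binding:contact_key(Binding),
    {Key, Binding}.

%% Refresh an existing binding for another 1800 seconds.
refresh_contact(Binding, NewCallId, NewCSeqNum) ->
    ersip_registrar_binding:update(1800, NewCallId, NewCSeqNum, Binding).
</code></pre>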
| 32.088608 | 472 | 0.652071 | yue_Hant | 0.280583 |
a67f7c0b76961ad0bf2730d032f944bc1d15efbf | 980 | md | Markdown | blogposts/gophercon-2019-how-uber-go-es.md | beyang/about | 69b68a28ba29165c128736dab7b07bee8fa2602c | [
"MIT"
] | null | null | null | blogposts/gophercon-2019-how-uber-go-es.md | beyang/about | 69b68a28ba29165c128736dab7b07bee8fa2602c | [
"MIT"
] | null | null | null | blogposts/gophercon-2019-how-uber-go-es.md | beyang/about | 69b68a28ba29165c128736dab7b07bee8fa2602c | [
"MIT"
] | null | null | null | ---
title: "GopherCon 2019 - How Uber 'Go'es"
description: "Maintaining a large codebase with maximum readability and minimal overhead is hard. This is the story of how Go went from a few enthusiastic Gophers to the most popular language for microservices at Uber. Learn where we failed, and how that led us to solutions that we think are pretty darn neat!"
author: $LIVEBLOGGER_NAME for the GopherCon 2019 Liveblog
publishDate: 2019-07-25T00:00-09:50
tags: [
gophercon
]
slug: gophercon-2019-how-uber-go-es
heroImage: /gophercon2019.png
published: false
---
Presenter: Elena Morozova
Liveblogger: [$LIVEBLOGGER_NAME]($LIVEBLOGGER_URL)
## Overview
Maintaining a large codebase with maximum readability and minimal overhead is hard. This is the story of how Go went from a few enthusiastic Gophers to the most popular language for microservices at Uber. Learn where we failed, and how that led us to solutions that we think are pretty darn neat!
---
Liveblog content here.
| 39.2 | 311 | 0.785714 | eng_Latn | 0.996185 |
a67f976876aaa516a77e7ed9dec5dc0aec4e1ddb | 731 | md | Markdown | README.md | everyevery/class-checksum | 1f324fce23e94d618e92ab8257713d5d0f4069ba | [
"Apache-2.0"
] | null | null | null | README.md | everyevery/class-checksum | 1f324fce23e94d618e92ab8257713d5d0f4069ba | [
"Apache-2.0"
] | null | null | null | README.md | everyevery/class-checksum | 1f324fce23e94d618e92ab8257713d5d0f4069ba | [
"Apache-2.0"
] | null | null | null | # class-checksum
Simple Java class checksum generator.
# Introduction
I made it to detect changes in class structure at application launch time. Although there are many bean comparators and hashing utilities, I couldn't find code that generates a checksum of a class's structure. All methods of this class are static because it is executed just once, at static initialization time.
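The idea looks roughly like the sketch below. This is a hypothetical illustration only (the class and method names are not this project's actual API): it hashes a class's declared field structure with a `MessageDigest`, so a structural change produces a different checksum.
```java
import java.lang.reflect.Field;
import java.security.MessageDigest;
import java.util.Arrays;
import java.util.Comparator;

public final class StructureChecksumSketch {

    private StructureChecksumSketch() {
        // static-only utility, mirroring the "all methods are static" design
    }

    /** Returns a hex digest (for example MD5) of the declared field structure of a class. */
    public static String of(Class<?> clazz, String algorithm) throws Exception {
        StringBuilder signature = new StringBuilder(clazz.getName());

        // Sort fields by name so the checksum does not depend on declaration order.
        Field[] fields = clazz.getDeclaredFields();
        Arrays.sort(fields, Comparator.comparing(Field::getName));
        for (Field field : fields) {
            signature.append('|')
                     .append(field.getType().getName())
                     .append(' ')
                     .append(field.getName());
        }

        MessageDigest digest = MessageDigest.getInstance(algorithm);
        byte[] hash = digest.digest(signature.toString().getBytes("UTF-8"));

        StringBuilder hex = new StringBuilder();
        for (byte b : hash) {
            hex.append(String.format("%02x", b));
        }
        return hex.toString();
    }
}
```
Comparing the checksum stored from a previous run with the one computed at startup then reveals whether the class structure changed.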
# Feature
* Package scope restriction
* Supports all Java digest functions (default is MD5)
# History
## version 0.3
* block java primitive types and java.lang types
* fix multi type variable case
## version 0.2
* refactoring to class-base implementation
* don't digest the classes of 'java.\*' packages
## version 0.1
* initial static utility implementation
| 25.206897 | 291 | 0.774282 | eng_Latn | 0.990633 |
a6805e636bc0c6062ca6c4fd643339831dcf95a9 | 973 | md | Markdown | docs/ru/index.md | kozhindev/yii-steroids | d3e57e00da77bc7b24948b391e1a0ba985b09a38 | [
"MIT"
] | 5 | 2018-04-02T06:25:23.000Z | 2020-05-10T19:14:11.000Z | docs/ru/index.md | kozhindev/yii-steroids | d3e57e00da77bc7b24948b391e1a0ba985b09a38 | [
"MIT"
] | 9 | 2021-03-09T01:32:39.000Z | 2022-02-26T14:26:42.000Z | docs/ru/index.md | kozhindev/yii-steroids | d3e57e00da77bc7b24948b391e1a0ba985b09a38 | [
"MIT"
] | 2 | 2018-09-05T06:02:32.000Z | 2019-07-16T10:36:53.000Z | # Steroids
Steroids is a set of Backend (PHP) and Frontend (JS React) libraries that provide a large collection of ready-made components for quickly building complex web applications such as ERP and CRM systems, social services, billing systems, personal account areas, and many others.
The core idea behind this library is to reduce the routine work of a web developer and to minimize the development of boilerplate functionality. In almost every project we need to upload files, authenticate users, build an API layer with documentation, create CRUDs for the admin panel, describe routing, and so on. All of this is accumulated in this library and reused in other projects.
The library can be clearly divided into two parts: Backend and Frontend. They can be used together in a single project (which gives some advantages) or separately; there is no hard coupling between them.
- [Backend documentation](backend/index.md)
- [Frontend documentation](frontend/index.md)
| 57.235294 | 118 | 0.820144 | rus_Cyrl | 0.989551 |
a680de93787161bb7c2217a54b03b17f4f05e8b0 | 733 | md | Markdown | README.md | KiwiBryn/FieldGatewayDuinoClient | d9ca0c82b4394ac86793063a635762115e383dd8 | [
"Apache-2.0"
] | null | null | null | README.md | KiwiBryn/FieldGatewayDuinoClient | d9ca0c82b4394ac86793063a635762115e383dd8 | [
"Apache-2.0"
] | null | null | null | README.md | KiwiBryn/FieldGatewayDuinoClient | d9ca0c82b4394ac86793063a635762115e383dd8 | [
"Apache-2.0"
] | null | null | null | # FieldGatewayDuinoClient
Sample plug-and-play Arduino Uno R3/Seeeduino V4.2/... clients for my nRF24L01 field gateway projects.
Both clients use:
* [RFX Arduino shield](http://embeddedcoolness.com/shop/rfx-shield/)
* [SeeedStudio Grove-Temperature & Humidity Sensor ](https://www.seeedstudio.com/Grove-Temperature%26Humidity-Sensor-%28High-Accuracy-%26-Mini%29-p-1921.html)
* [SeeedStudio Grove-Universal 4 Pin Buckled 5cm Cable](https://www.seeedstudio.com/Grove-Universal-4-Pin-Buckled-5cm-Cable-%285-PCs-Pack%29-p-925.html)
The Arduino Uno R3 version also uses
* [SeeedStudio Grove-Base Shield V2](https://www.seeedstudio.com/Base-Shield-V2-p-1378.html)

| 52.357143 | 158 | 0.780355 | kor_Hang | 0.24024 |
a680fca67a2680869cf90e0052d50d9bd39d2008 | 2,513 | md | Markdown | README.md | mwright-company/feathers-authentication-management | d20081bd0712b25f12e23bc404dd31e1be604d35 | [
"MIT"
] | null | null | null | README.md | mwright-company/feathers-authentication-management | d20081bd0712b25f12e23bc404dd31e1be604d35 | [
"MIT"
] | null | null | null | README.md | mwright-company/feathers-authentication-management | d20081bd0712b25f12e23bc404dd31e1be604d35 | [
"MIT"
] | null | null | null | ## feathers-authentication-management
[](https://travis-ci.org/feathersjs/feathers-authentication-management)
[](https://codeclimate.com/github/feathersjs/feathers-authentication-management)
[](https://codeclimate.com/github/feathersjs/feathers-authentication-management/coverage)
[](https://david-dm.org/feathersjs/feathers-authentication-management)
[](https://www.npmjs.com/package/feathers-authentication-management)
> Adds sign up verification, forgotten password reset, and other capabilities to local
[`feathers-authentication`](https://docs.feathersjs.com/v/auk/authentication/management.html).
This repo works with either the v1.0 rewrite of `feathers-authentication` or with v0.7.
## Multiple communication channels:
Traditionally, users have been authenticated using their `username` or `email`.
However, that landscape is changing.
Teens are more involved with cellphone SMS, WhatsApp, Facebook, QQ, and WeChat than they are with email.
Seniors may not know how to create an email account or check email, but they have smartphones
and perhaps WhatsApp or WeChat accounts.
A more flexible design would maintain multiple communication channels for a user
-- username, email address, phone number, handles for WhatsApp, Facebook, QQ, WeChat --
each of which uniquely identifies the user.
The user could then sign in using any of their unique identifiers.
The user could also indicate how they prefer to be contacted.
Some may prefer to get password resets via long tokens sent by email;
others may prefer short numeric tokens sent by SMS or WeChat.
`feathers-authentication` and `feathers-authentication-management`
provide much of the infrastructure necessary to implement such a scenario.
## Documentation
Refer to [Feathersjs documentation](https://docs.feathersjs.com/v/auk/authentication/management.html).
`feathers-authentication-management` is part of the `Auk` release of Feathersjs.
## Tests
Run `npm test`
## License
MIT. See LICENSE.
| 50.26 | 203 | 0.804616 | eng_Latn | 0.91357 |
a681329ee7e371a0d8c58cfeb2df354ea7d40c9a | 997 | md | Markdown | _drafts/the_three_tiers_of_refactoring.md | joebutler2/blog | 9bb4b472faeb4781b0c880d5fa7cc26c92b2fc88 | [
"MIT"
] | null | null | null | _drafts/the_three_tiers_of_refactoring.md | joebutler2/blog | 9bb4b472faeb4781b0c880d5fa7cc26c92b2fc88 | [
"MIT"
] | 4 | 2019-10-01T23:57:53.000Z | 2022-02-26T03:37:21.000Z | _drafts/the_three_tiers_of_refactoring.md | joebutler2/joebutler2.github.io | 9bb4b472faeb4781b0c880d5fa7cc26c92b2fc88 | [
"MIT"
] | null | null | null | ---
layout: post
title: "The 3 tiers of Refactoring"
categories: article
tags:
date: 2018-08-08
---
The more projects I work on, the more I notice the need to distinguish what we mean by "we need to refactor this". In some situations it can be interpreted as "we didn't do the right thing the first time".
It also matters who is hearing this: from the perspective of a non-technical team member, it can come across as something the developers do with no benefit to the team or customers.
My hope in distinguishing these different kinds, or tiers, of refactoring is to help teams communicate about a process that is critical to growing a product with fewer defects and to actually increasing productivity (i.e. velocity for XP practitioners).
## Continuous
### i.e. the scouts' rule
When practicing TDD we follow the process of writing a failing test, then writing as little production code as possible to get the test to pass, and finally refactoring the code while keeping the tests green.
## Design Impediment
### i.e. the refactor tractor
| 45.318182 | 252 | 0.774323 | eng_Latn | 0.999824 |
a681b4afdbfdf49bb37f0a6f906c54b87abb787b | 2,555 | md | Markdown | docs/responders/set_value.md | ropensci-org/buffy | 650d814d46864e9a19477f76c429419666d95668 | [
"MIT"
] | 9 | 2020-04-16T14:27:14.000Z | 2022-02-09T19:00:58.000Z | docs/responders/set_value.md | ropensci-org/buffy | 650d814d46864e9a19477f76c429419666d95668 | [
"MIT"
] | 49 | 2021-02-24T13:49:53.000Z | 2022-03-28T07:46:24.000Z | docs/responders/set_value.md | openjournals/buffy | 3209734945967431f37d8b45f289a0a7439de9b1 | [
"MIT"
] | 6 | 2019-09-28T10:25:10.000Z | 2021-11-02T15:02:45.000Z | Set value
=========
This responder can be used to update the value of any field in the body of the issue.
Allows [labeling](../labeling).
## Listens to
```
@botname set <value> as <name>
```
For example, if you configure this responder to change the value of the _version_, it would respond to:
```
@botname set v1.0.3 as version
```
## Requirements
The body of the issue should have the target field placeholder marked with HTML comments.
```html
<!--<name>--> <!--end-<name>-->
```
Following the previous example if the name of the field is _version_:
```html
<!--version--> <!--end-version-->
```
## Settings key
`set_value`
## Params
```eval_rst
:name: *Required.* The name of the target field in the body of the issue. It can be set using the ``name:`` keyword, or via the name of each instance if there are several instances of this responder specified in the settings file.
:if_missing: *Optional.* Strategy to use when the value placeholders are not defined in the body of the issue. Valid options: `append` (will add the value at the end of the issue body), `prepend` (will add the value at the beginning of the issue body), `error` (will reply with a not-found message). If this param is not present, nothing will be done when the value placeholder is not found.
:heading: If the value placeholder is missing and the `if_missing` strategy is set to append or prepend, the added value will include this text as a heading instead of just the value name.
:sample_value: A sample value string for the target field. It is used for documentation purposes when the :doc:`Help responder <./help>` lists all available responders. Default value is **xxxxx**.
```
## Examples
**Simplest use case:**
```yaml
...
responders:
set_value:
name: version
sample_value: v1.0.1
...
```
**Multiple instances of the responder, some of them restricted to editors:**
```yaml
...
responders:
set_value:
- version:
only: editors
sample_value: "v1.0.0"
- archive:
only: editors
sample_value: "10.21105/joss.12345"
if_missing: prepend
heading: "Archive DOI"
- repository:
sample_value: "github.com/openjournals/buffy"
...
```
## In action
* **`Initial state:`**

* **`Invocation:`**

* **`Final state:`**

| 29.034091 | 368 | 0.68728 | eng_Latn | 0.995255 |
a681e1f77aadda6da34dc442200db2c8d9cb4c82 | 96 | md | Markdown | src/data/tech/html.md | Laurent888/myonlinecv-v2 | 3feb50ca30e5ac8f675782121ca9846c3e8c2b44 | [
"MIT"
] | null | null | null | src/data/tech/html.md | Laurent888/myonlinecv-v2 | 3feb50ca30e5ac8f675782121ca9846c3e8c2b44 | [
"MIT"
] | 1 | 2020-05-18T06:47:15.000Z | 2020-05-18T06:47:15.000Z | src/data/tech/html.md | Laurent888/myonlinecv-v2 | 3feb50ca30e5ac8f675782121ca9846c3e8c2b44 | [
"MIT"
] | null | null | null | ---
id: "1"
title: "HTML & CSS"
level: "4"
category: "tech"
---
- HTML
- CSS, SASS
- CSS in JS
| 8.727273 | 19 | 0.53125 | kor_Hang | 0.710312 |
a682338825f58b62afa3080fb255678ebe5eb5c5 | 2,433 | md | Markdown | docs/outlook/mapi/pidlidinternetaccountstamp-canonical-property.md | ManuSquall/office-developer-client-docs.fr-FR | 5c3e7961c204833485b8fe857dc7744c12658ec5 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2021-08-15T11:25:43.000Z | 2021-08-15T11:25:43.000Z | docs/outlook/mapi/pidlidinternetaccountstamp-canonical-property.md | ManuSquall/office-developer-client-docs.fr-FR | 5c3e7961c204833485b8fe857dc7744c12658ec5 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/outlook/mapi/pidlidinternetaccountstamp-canonical-property.md | ManuSquall/office-developer-client-docs.fr-FR | 5c3e7961c204833485b8fe857dc7744c12658ec5 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: PidLidInternetAccountStamp canonical property
manager: soliver
ms.date: 03/09/2015
ms.audience: Developer
ms.topic: reference
ms.prod: office-online-server
localization_priority: Normal
api_name:
- PidLidInternetAccountStamp
api_type:
- COM
ms.assetid: 819179fe-e58e-415c-abc7-1949036745ee
description: Last modified on March 9, 2015
ms.openlocfilehash: ebd64392d24cd170a7babf77865aa00c7be24802
ms.sourcegitcommit: 8fe462c32b91c87911942c188f3445e85a54137c
ms.translationtype: MT
ms.contentlocale: fr-FR
ms.lasthandoff: 04/23/2019
ms.locfileid: "32315455"
---
# <a name="pidlidinternetaccountstamp-canonical-property"></a>PidLidInternetAccountStamp canonical property
**Applies to**: Outlook 2013 | Outlook 2016
Specifies the ID of the mail account through which the email message is sent.
|||
|:-----|:-----|
|Associated properties: <br/> |dispidInetAcctStamp <br/> |
|Property set: <br/> |PSETID_Common <br/> |
|Long ID (LID): <br/> |0x00008581 <br/> |
|Data type: <br/> |PT_UNICODE <br/> |
|Area: <br/> |General messaging <br/> |
## <a name="remarks"></a>Remarks
The format of this string is implementation-dependent. This property can be used by the client to determine which server to direct the mail to, but it is optional and its value has no meaning to the server.
## <a name="related-resources"></a>Related resources
### <a name="protocol-specifications"></a>Protocol specifications
[[MS-OXPROPS]](https://msdn.microsoft.com/library/f6ab1613-aefe-447d-a49c-18217230b148%28Office.15%29.aspx)
> Provides property set definitions and references to related Exchange Server protocol specifications.
[[MS-OXOMSG]](https://msdn.microsoft.com/library/daa9120f-f325-4afb-a738-28f91049ab3c%28Office.15%29.aspx)
> Specifies the properties and operations that are permitted for email message objects.
### <a name="header-files"></a>Header files
Mapidefs.h
> Provides data type definitions.
## <a name="see-also"></a>See also
[MAPI Properties](mapi-properties.md)
[MAPI Canonical Properties](mapi-canonical-properties.md)
[Mapping Canonical Property Names to MAPI Names](mapping-canonical-property-names-to-mapi-names.md)
[Mapping MAPI Names to Canonical Property Names](mapping-mapi-names-to-canonical-property-names.md)
| 33.791667 | 243 | 0.756679 | fra_Latn | 0.634504 |
a68491da935bd12c7cd5968c6f1276391729a014 | 73 | md | Markdown | wiki/rpc/getstakinginfo.md | danielclough/wikiBlackcoinNL | 081ab00d4b5f8c961618eca7d8f5635bc5984d2e | [
"MIT"
] | null | null | null | wiki/rpc/getstakinginfo.md | danielclough/wikiBlackcoinNL | 081ab00d4b5f8c961618eca7d8f5635bc5984d2e | [
"MIT"
] | 1 | 2021-11-20T16:33:39.000Z | 2021-11-20T16:33:39.000Z | wiki/rpc/getstakinginfo.md | danielclough/wikiBlackcoinNL | 081ab00d4b5f8c961618eca7d8f5635bc5984d2e | [
"MIT"
] | null | null | null | getstakinginfo
Returns an object containing staking-related information.
| 24.333333 | 57 | 0.876712 | eng_Latn | 0.955425 |
a684b2d1dfc13ee50696b9d87bf988d4eb30943f | 2,405 | md | Markdown | README.md | AppliedIS/wams-manager | 01ef30d701e55a8ba48e3bce1279eb7c0fd976df | [
"Apache-2.0"
] | 1 | 2021-12-26T01:03:44.000Z | 2021-12-26T01:03:44.000Z | README.md | AppliedIS/wams-manager | 01ef30d701e55a8ba48e3bce1279eb7c0fd976df | [
"Apache-2.0"
] | null | null | null | README.md | AppliedIS/wams-manager | 01ef30d701e55a8ba48e3bce1279eb7c0fd976df | [
"Apache-2.0"
] | null | null | null | wams-manager
------------
------------
This readme describes the steps for setting up WAMS Manager, and how to use this application:
1. Create an Azure website ([click here](http://azure.microsoft.com/en-us/documentation/articles/web-sites-dotnet-get-started/))
2. Create an Azure Storage account ([click here](http://azure.microsoft.com/en-us/documentation/articles/storage-create-storage-account/))
3. Open the __Source/Ais.Internal.Dcm/Ais.Internal.Dcm.sln__ solution, and go to the __web.config__ file of the __Ais.Internal.Dcm.WebV2__ project. Make the following changes to this web.config file:
* _MetadataStorageAccountName_ - Azure storage account name, you created in Step-2 above.
* _MetadataStorageKey_ -Azure storage account Access Key, you created in Step-2 above.
* _DefaultAdminUsername_ - WAMS Manager Administrator username.
* _DefaultAdminPassword_ - WAMS Manager Administrator password.
* _DataConnectionString_ - Replace the _AccountName_ and _AccountKey_ with the Storage Account Name and Storage Key (obtained in Step 2 above) in the connection string.
* **Note**: section of the _web.config_ file that needs to be updated as shown above is copied below for reference:
`````
<!-- Storage Specific Settings-->
<add key="MetadataStorageAccountName" value="storage_account_name_here" />
<add key="MetadataStorageKey" value="storage_account_access_key_here" />
<add key="DefaultAdminUsername" value="wams_manager_admininstrator_username_here"/>
<add key="DefaultAdminPassword" value="wams_manager_admininstrator_password_here"/>
<add key="DataConnectionString" value="DefaultEndpointsProtocol=https;AccountName=storage_account_name_here;AccountKey=storage_account_access_key_here" />
`````
4. Proceed to the Project __Ais.Internal.Dcm.ModernUIV2__, and open __Common/Config.cs__
5. Replace the _BitlyUsername_ and _BitlyKey_ with your _bit.ly_ credentials.
6. Build the solution and publish the website using the publishing profile of the Azure website created in Step 1 above.
7. Visit the website created in Step 1, and log in with the _DefaultAdminUsername_ and _DefaultAdminPassword_ you specified in the _web.config_ file in Step 3.
8. Follow instructions in **Docs/WAMSAdminPortal.pdf** to complete the setup of the application.
9. Finally go through **Docs/WAMSManager.pdf** (a manual on how to use the application).
| 72.878788 | 194 | 0.774636 | eng_Latn | 0.847239 |
a684e56c4d13b8419948eafbfd0070132d0899ef | 9,433 | md | Markdown | docs/commands/configurations/Set-TssConfigurationGeneral.md | jrbella/thycotic | 38504e583c96b0737f8af41766931034a0db34a5 | [
"Apache-2.0"
] | 36 | 2021-02-09T16:46:50.000Z | 2022-03-25T16:36:17.000Z | docs/commands/configurations/Set-TssConfigurationGeneral.md | jrbella/thycotic | 38504e583c96b0737f8af41766931034a0db34a5 | [
"Apache-2.0"
] | 224 | 2020-12-23T15:20:25.000Z | 2022-03-24T17:16:43.000Z | docs/commands/configurations/Set-TssConfigurationGeneral.md | jrbella/thycotic | 38504e583c96b0737f8af41766931034a0db34a5 | [
"Apache-2.0"
] | 15 | 2021-02-11T17:46:12.000Z | 2022-03-23T17:35:20.000Z | # Set-TssConfigurationGeneral
## SYNOPSIS
Set general configuration
## SYNTAX
### folders
```
Set-TssConfigurationGeneral [-TssSession] <Session> [-EnablePersonalFolders] [-PersonalFolderName]
[-PersonalFolderFormat <PersonalFolderNameOption>] [-PersonalFolderWarning <String>]
[-PersonalFolderRequireView] [-PersonalFolderEnableWarning] [-WhatIf] [-Confirm] [<CommonParameters>]
```
### permissions
```
Set-TssConfigurationGeneral [-TssSession] <Session> [-AllowDuplicationSecretName] [-AllowViewNextPassword]
[-DefaultSecretPermission <SecretPermissionType>] [-EnableApprovalFromEmail]
[-ApprovalType <SecretApprovalType>] [-WhatIf] [-Confirm] [<CommonParameters>]
```
### userexperience
```
Set-TssConfigurationGeneral [-TssSession] <Session> [-ApplicationLanguage <ApplicationLanguage>]
[-DateFormat <String>] [-DefaultRoleId <Int32>] [-TimeFormat <String>] [-ForceInactivityTimeout]
[-InactivityTimeout <Int32>] [-RequireFolderForSecret] [-PasswordHistoryAll] [-PasswordHistory <Int32>]
[-SecretViewInterval <Int32>] [-TimeZone <String>] [-WhatIf] [-Confirm] [<CommonParameters>]
```
## DESCRIPTION
Set general configuration
## EXAMPLES
### EXAMPLE 1
```
$session = New-TssSession -SecretServer https://alpha -Credential $ssCred
Set-TssConfigurationGeneral -TssSession $session -EnablePersonalFolders:$false
```
Disable Personal Folders
### EXAMPLE 2
```
$session = New-TssSession -SecretServer https://alpha -Credential $ssCred
Set-TssConfigurationGeneral -TssSession $session -TimeZone (Get-TimeZone).Id
```
Set the Secret Server time zone to the current user's default Windows time zone.
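### EXAMPLE 3
An illustrative sketch only (the 30-minute value is an arbitrary assumption used to show the syntax):
```
$session = New-TssSession -SecretServer https://alpha -Credential $ssCred
Set-TssConfigurationGeneral -TssSession $session -ForceInactivityTimeout -InactivityTimeout 30
```
Force the inactivity timeout and set it to 30 minutes.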
## PARAMETERS
### -TssSession
TssSession object created by New-TssSession for authentication
```yaml
Type: Session
Parameter Sets: (All)
Aliases:
Required: True
Position: 1
Default value: None
Accept pipeline input: True (ByValue)
Accept wildcard characters: False
```
### -EnablePersonalFolders
Enable Personal Folders
```yaml
Type: SwitchParameter
Parameter Sets: folders
Aliases:
Required: False
Position: Named
Default value: False
Accept pipeline input: False
Accept wildcard characters: False
```
### -PersonalFolderName
Personal Folder Name
```yaml
Type: SwitchParameter
Parameter Sets: folders
Aliases:
Required: False
Position: Named
Default value: False
Accept pipeline input: False
Accept wildcard characters: False
```
### -PersonalFolderFormat
Personal Folder Name Format (DisplayName, UsernameAndDomain)
```yaml
Type: PersonalFolderNameOption
Parameter Sets: folders
Aliases:
Accepted values: DisplayName, UsernameAndDomain
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -PersonalFolderWarning
Personal Folder warning message
```yaml
Type: String
Parameter Sets: folders
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -PersonalFolderRequireView
Personal Folder Require View Permission
```yaml
Type: SwitchParameter
Parameter Sets: folders
Aliases:
Required: False
Position: Named
Default value: False
Accept pipeline input: False
Accept wildcard characters: False
```
### -PersonalFolderEnableWarning
Show Personal Folder warning message
```yaml
Type: SwitchParameter
Parameter Sets: folders
Aliases:
Required: False
Position: Named
Default value: False
Accept pipeline input: False
Accept wildcard characters: False
```
### -AllowDuplicationSecretName
Allow duplicate secret names
```yaml
Type: SwitchParameter
Parameter Sets: permissions
Aliases:
Required: False
Position: Named
Default value: False
Accept pipeline input: False
Accept wildcard characters: False
```
### -AllowViewNextPassword
Allow View Rights on Secret to read Next Password for RPC
```yaml
Type: SwitchParameter
Parameter Sets: permissions
Aliases:
Required: False
Position: Named
Default value: False
Accept pipeline input: False
Accept wildcard characters: False
```
### -DefaultSecretPermission
Default permissions to apply when Secret is created (InheritsPermissions, CopyFromFolder, OnlyAllowCreator)
```yaml
Type: SecretPermissionType
Parameter Sets: permissions
Aliases:
Accepted values: InheritsPermissions, CopyFromFolder, OnlyAllowCreator
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -EnableApprovalFromEmail
Allow approval for access from email
```yaml
Type: SwitchParameter
Parameter Sets: permissions
Aliases:
Required: False
Position: Named
Default value: False
Accept pipeline input: False
Accept wildcard characters: False
```
### -ApprovalType
Force Secret approval type (RequireApprovalForEditors, RequireApprovalForOwnersAndEditors)
```yaml
Type: SecretApprovalType
Parameter Sets: permissions
Aliases:
Accepted values: RequireApprovalForEditors, RequireApprovalForOwnersAndEditors
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -ApplicationLanguage
Default language for all users (English, French, German, Japanese, Korean, Portuguese)
```yaml
Type: ApplicationLanguage
Parameter Sets: userexperience
Aliases:
Accepted values: German, English, French, Japanese, Korean, Portuguese
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -DateFormat
Default date format (tab through to see validate set options)
```yaml
Type: String
Parameter Sets: userexperience
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -DefaultRoleId
Default Role ID assigned to new Users
```yaml
Type: Int32
Parameter Sets: userexperience
Aliases:
Required: False
Position: Named
Default value: 0
Accept pipeline input: False
Accept wildcard characters: False
```
### -TimeFormat
Default time format (tab through to see validate set options)
```yaml
Type: String
Parameter Sets: userexperience
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -ForceInactivityTimeout
Force Inactivity Timeout
```yaml
Type: SwitchParameter
Parameter Sets: userexperience
Aliases:
Required: False
Position: Named
Default value: False
Accept pipeline input: False
Accept wildcard characters: False
```
### -InactivityTimeout
Inactivity Timeout in Minutes
```yaml
Type: Int32
Parameter Sets: userexperience
Aliases:
Required: False
Position: Named
Default value: 0
Accept pipeline input: False
Accept wildcard characters: False
```
### -RequireFolderForSecret
Require folder for Secrets
```yaml
Type: SwitchParameter
Parameter Sets: userexperience
Aliases:
Required: False
Position: Named
Default value: False
Accept pipeline input: False
Accept wildcard characters: False
```
### -PasswordHistoryAll
Restrict all recent Passwords on a Secret from being reused
```yaml
Type: SwitchParameter
Parameter Sets: userexperience
Aliases:
Required: False
Position: Named
Default value: False
Accept pipeline input: False
Accept wildcard characters: False
```
### -PasswordHistory
Number of recent passwords on a Secret that cannot be reused
```yaml
Type: Int32
Parameter Sets: userexperience
Aliases:
Required: False
Position: Named
Default value: 0
Accept pipeline input: False
Accept wildcard characters: False
```
### -SecretViewInterval
Number of minutes a Secret can be viewed after entering a comment (without being reprompted)
```yaml
Type: Int32
Parameter Sets: userexperience
Aliases:
Required: False
Position: Named
Default value: 0
Accept pipeline input: False
Accept wildcard characters: False
```
### -TimeZone
Set Secret Servers default Time Zone (you can use Get-TimeZone -ListAvailable to see the possible IDs for your system)
```yaml
Type: String
Parameter Sets: userexperience
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -WhatIf
Shows what would happen if the cmdlet runs.
The cmdlet is not run.
```yaml
Type: SwitchParameter
Parameter Sets: (All)
Aliases: wi
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -Confirm
Prompts you for confirmation before running the cmdlet.
```yaml
Type: SwitchParameter
Parameter Sets: (All)
Aliases: cf
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### CommonParameters
This cmdlet supports the common parameters: -Debug, -ErrorAction, -ErrorVariable, -InformationAction, -InformationVariable, -OutVariable, -OutBuffer, -PipelineVariable, -Verbose, -WarningAction, and -WarningVariable. For more information, see [about_CommonParameters](http://go.microsoft.com/fwlink/?LinkID=113216).
## INPUTS
## OUTPUTS
## NOTES
Requires TssSession object returned by New-TssSession
## RELATED LINKS
[https://thycotic-ps.github.io/thycotic.secretserver/commands/configurations/Set-TssConfigurationGeneral](https://thycotic-ps.github.io/thycotic.secretserver/commands/configurations/Set-TssConfigurationGeneral)
[https://github.com/thycotic-ps/thycotic.secretserver/blob/main/src/functions/configurations/Set-TssConfigurationGeneral.ps1](https://github.com/thycotic-ps/thycotic.secretserver/blob/main/src/functions/configurations/Set-TssConfigurationGeneral.ps1)
| 21.008909 | 315 | 0.791689 | eng_Latn | 0.413962 |
a684eac740c68b071b41d0f6a00daf0525ae2002 | 96 | md | Markdown | spec/fixtures/valid/sip-9.md | saddle-finance/sdl_validator | 3e055428a44af89501908731aeb53f9405bbebbb | [
"MIT"
] | null | null | null | spec/fixtures/valid/sip-9.md | saddle-finance/sdl_validator | 3e055428a44af89501908731aeb53f9405bbebbb | [
"MIT"
] | null | null | null | spec/fixtures/valid/sip-9.md | saddle-finance/sdl_validator | 3e055428a44af89501908731aeb53f9405bbebbb | [
"MIT"
] | null | null | null | ---
sip: 9
title: Accept TRON USDT
author: Justin Sun
status: Withdrawn
created: 2020-07-15
---
| 12 | 23 | 0.708333 | eng_Latn | 0.482983 |
a68534c1210a7762717f20849dddebb022326477 | 2,115 | md | Markdown | README.md | cryptoassetlab/cryptoassetlab.github.io | f2f59757d915ea17eb3627c68e4cdebe14b92ee9 | [
"MIT"
] | null | null | null | README.md | cryptoassetlab/cryptoassetlab.github.io | f2f59757d915ea17eb3627c68e4cdebe14b92ee9 | [
"MIT"
] | null | null | null | README.md | cryptoassetlab/cryptoassetlab.github.io | f2f59757d915ea17eb3627c68e4cdebe14b92ee9 | [
"MIT"
] | 1 | 2020-02-03T14:58:49.000Z | 2020-02-03T14:58:49.000Z | ---
title: Crypto Asset Lab
excerpt: Website of the Crypto Asset Lab
layout: homelay
permalink: /
---
### Mission Statement
The [Crypto Asset Lab](https://cryptoassetlab.diseade.unimib.it/) (CAL)
is a research initiative
on bitcoin and crypto assets
as investment opportunity, fintech innovation,
and regulatory challenge,
with special regard for their disruptive role
in the future of money and finance.
We also pay attention to the innovations
in cryptography and blockchain technology,
given their relevance for privacy,
security, and other applications
(e.g. [timestamping](http://dgi.io/ots/), see
[OpenTimestamps](http://opentimestamps.org/))
but we do not subscribe to the widespread hype
about blockchain potential beyond crypto assets.
We are a meeting point between academia, industry,
institutions, and regulators; we encourage students, researchers,
and practitioners to join us and help with
research, development, training, teaching, and
other experimental activities.
We organize the [CAL yearly conference]({{ site.baseurl }}/calconf).
## About
[University of Milano-Bicocca](http://www.unimib.it)
has been offering a
"[Bitcoin and Blockchain Technology](http://www.ametrano.net/bbt/)"
course since the 2015/2016 academic year.
Crypto Asset Lab is a joint initiative of
the [Digital Gold Institute](http://www.dgi.io)
and our academic and scientific community.
[<img src="/img/bicocca-logo.png" height="80">](http://www.diseade.unimib.it/it)
**+**
[<img src="/img/dgi-logo.png" height="80">](http://dgi.io)
**==**
[<img src="/img/cal.png" height="80">]({{ site.baseurl }})
## Address
Crypto Asset Lab - Università Milano-Bicocca
Di.SEA.DE, Building U7
Via Bicocca degli Arcimboldi 8
20126 Milano
<iframe src="https://www.google.com/maps/embed?pb=!1m18!1m12!1m3!1d2795.6348896124377!2d9.210284016342875!3d45.51742797910175!2m3!1f0!2f0!3f0!3m2!1i1024!2i768!4f13.1!3m3!1m2!1s0x4786c7481b141dd7%3A0x57e9ff45dc8331de!2sU7+Universit%C3%A0+Milano+Bicocca!5e0!3m2!1sen!2sit!4v1557314816331!5m2!1sen!2sit" width="100%" height="auto" frameborder="0" style="border:0" allowfullscreen></iframe>
| 36.465517 | 386 | 0.767376 | eng_Latn | 0.794435 |
a6855673a8efacb53c8967d971c5b6d8374f353a | 2,208 | md | Markdown | _posts/2021-01-17-os-linux-type.md | kevinsung123/kevinsung123.github.io | e322c1043da2d7937460738f8e01257ead0a6a03 | [
"MIT"
] | null | null | null | _posts/2021-01-17-os-linux-type.md | kevinsung123/kevinsung123.github.io | e322c1043da2d7937460738f8e01257ead0a6a03 | [
"MIT"
] | null | null | null | _posts/2021-01-17-os-linux-type.md | kevinsung123/kevinsung123.github.io | e322c1043da2d7937460738f8e01257ead0a6a03 | [
"MIT"
] | null | null | null | ---
layout: post
title: "[Linux] Linux 종류 "
subtitle: "Linux Type "
categories: os
tags: linux type
comments: true
---
### Linux란?
리눅스(Linux)는 리누스 토발즈가 커뮤니티 주체로 개발한 컴퓨터 운영 체제입니다.
<!--more-->
리눅스(Linux)는 UNIX운영체제를 기반으로 만들어진 운영체제 입니다. 리눅스(Linux)는 유닉스(UNIX)와 마찬가지로 다중 사용자, 다중 작업(멀티태스킹), 다중 스레드를 지원하는 네트워크 운영 체제(NOS)입니다.
리눅스의 원형이 되는 UNIX가 애초부터 통신 네트워크를 지향하여 설계된것처럼 리눅스 역시 서버로 작동하는데 최적화되어있습니다. 고로 서버에서 사용되는 운영체제로 많이 사용되고 있습니다.
### Linux특징
1. 리눅스는 유닉스와 완벽하게 호환가능합니다.
2. 리눅스는 공개 운영체제입니다. 오픈소스이므로 누구든지 자유롭게 수정이 가능합니다.
3. 리눅스는 PC용 OS보다 안정이며 보안쪽면에서도 PC용 OS보다 비교적 우수한 성능을 가지고 있습니다.
4. 리눅스는 다양한 네트워킹 기술을 제공하고 있으며 서버용 OS로 적합합니다.
5. 배포판이 아닌 리눅스 그 자체는 무료입니다.
### Linux 종류
- 대표적인 계열이 레드햇 계열과 데비안 계열
- 레드햇 계열 : Redhat, Centos
- 데비안 계열 : Ubuntu
#### REDHAT계열
레드햇계열은 레드햇이라는 회사에서 배포한 리눅스를 말합니다. 2003년까지는 오픈소스 라이선스로 진행하다가 이후 상용화되었습니다. 레드햇 리눅스는 배포판 중에서 가장 인기가 많습니다. 커뮤니티가 아닌 회사에서 관리하는 레드햇계열의 리눅스는 다른 리눅스 배포판에 비해 패치가 빠르며 내장되어있는 유틸리티의 양도 많고 관리툴의 성능도 우수합니다. 또 호환성면에서도 나무랄데가 없지요. 여러모로 장점이 많습니다. 레드햇 계열의 리눅스에는 페도라와 센토스가 있는데 오늘날에는 페도라보다는 센토스를 더 많이 사용하는 추세입니다.
#### CENTOS
CentOS는 Community Enterprise Operating System 의 약자로 Red Hat이 공개한 RHEL을 그대로 가져와서 Red Hat의 브랜드와 로고만 제거하고 배포한 배포본입니다. 사실상 RHEL 의 소스를 그대로 사용하고 있기에 RHEL 과 OS 버전, Kernel 버전, 패키지 구성이 똑같고 바이너리가 100%로 호환됩니다. 무료로 사용 가능하지만 문제 발생시 레드햇이라는 회사가 아닌 커뮤니티를 통해 지원이 되므로 다소 패치가 느린감이 없지않아 있습니다. 특히 서버용 운영체제로 인기가 매우 높으며 서버용으로 리눅스를 운영할 목적이라면 아마 대부분 이 센토스OS를 사용하는것이 대부분일 것입니다.
#### 데비안계열
데비안은 온라인 커뮤니티에서 제작하여 레드햇보다 더 먼저 배포되어 시장을 선점하였습니다. 이 데비안에서 파생되어진 OS를 데비안 계열이라고 부릅니다. 하지만, 자발적인 커뮤니티에서 만드는 배포판이라 전문적인 회사에서 서비스를 했던 레드햇계열에 비해 사후지원과 배포가 늦고 내장 유틸들의 성능이 레드햇계열에 비해 부족한감이 있어 오랫동안 레드햇에 밀렸었습니다. 하지만 현재는 무료 개인사용자 서버용으로 인기가 매우 높으며 최근에는 지속적인 업데이트를 거친 결과 레드햇계열에 비해 결코 성능이나 뒤쳐지지 않습니다. 그리고 넓은 유저층을 가지고 있는 데비안계열은 그 사용법이 온라인 웹사이트나 커뮤니티에 자세히 기술되어 있다는 점이 진입장벽을 낮추어 초보 리눅스유저들이 접근하기 쉬운 OS라고 할 수 있겠습니다.
#### UBUNTU
영국의 캐노니컬이라는 회사에서 만든 배포판으로 쉽고 편한 설치와 이용법 덕분에 진입장벽이 낮아 초보자들이 쉽게 접근할 수 있으며 데스크탑용 리눅스 배포판 가운데서 가장 많이 사용되어지고있는 배포판입니다. 개인용 데스크톱 운영체제로 많이들 사용합니다., 서버용으로도 기능이 부족하거나 성능이 딸리지는 않습니다만 서버용 리눅스 점유율로 볼때 센토스에 많이 밀리는것은 사실입니다. 서버는 센토스 데스크톱으로는 우분투라고 생각하시면 될듯 하네요.
### 참조
- [blog](https://coding-factory.tistory.com/318?category=760718) | 48 | 396 | 0.718297 | kor_Hang | 1.00001 |
a68644648fa39b240bdb1045057630e7f4dd66af | 134 | md | Markdown | gentoo/README.md | fizzoo/dotfiles | 8204570c67ffbe324f69d4eab9b6a34ac4615314 | [
"MIT"
] | 19 | 2017-02-14T13:23:51.000Z | 2017-04-16T20:03:48.000Z | gentoo/README.md | fizzoo/dotfiles | 8204570c67ffbe324f69d4eab9b6a34ac4615314 | [
"MIT"
] | null | null | null | gentoo/README.md | fizzoo/dotfiles | 8204570c67ffbe324f69d4eab9b6a34ac4615314 | [
"MIT"
] | null | null | null | # Gentooey
'arc' is a Thinkpad X260, so if you have the same machine it is likely that the
corresponding .config works well for you.
| 26.8 | 79 | 0.761194 | eng_Latn | 0.999948 |
a686805a046fb441b83a4f3ce676c9c400e32281 | 254 | md | Markdown | _posts/2017-04-24-39-9032.md | ashmaroli/mycad.github.io | 47fb8e82e4801dd33a3d4335a3d4a6657ec604cf | [
"MIT"
] | null | null | null | _posts/2017-04-24-39-9032.md | ashmaroli/mycad.github.io | 47fb8e82e4801dd33a3d4335a3d4a6657ec604cf | [
"MIT"
] | null | null | null | _posts/2017-04-24-39-9032.md | ashmaroli/mycad.github.io | 47fb8e82e4801dd33a3d4335a3d4a6657ec604cf | [
"MIT"
] | null | null | null | ---
layout: post
amendno: 39-9032
cadno: CAD2017-B787-05
title: 在客舱中安装防火挡块和防火隔离胶带
date: 2017-04-24 00:00:00 +0800
effdate: 2017-05-01 00:00:00 +0800
tag: B787
categories: 上海航空器适航审定中心
author: 朱宁文
---
##适用范围:
本指令适用于本指令第三段2、3、4所列出的波音公司服务通告中所涉及的波音787-8飞机。
| 15.875 | 44 | 0.751969 | yue_Hant | 0.559484 |
a686bfb387106f2cd2765677beae23d1d690a071 | 3,033 | md | Markdown | README.md | rcasto/pi-up | fb5a2e88c79001ddfd965359ed6eef40bc8e2be9 | [
"MIT"
] | null | null | null | README.md | rcasto/pi-up | fb5a2e88c79001ddfd965359ed6eef40bc8e2be9 | [
"MIT"
] | null | null | null | README.md | rcasto/pi-up | fb5a2e88c79001ddfd965359ed6eef40bc8e2be9 | [
"MIT"
] | null | null | null | # pi-up
The goal of this project is to execute a custom update script on a Raspberry Pi according to a desired schedule and be notified of the results.
## Background
Truthfully, this project could be used to execute any custom script on a Raspberry Pi according to a desired schedule.
The primary focus with this project though, is to update a Raspberry Pi. Also, this project currently assumes the Raspberry Pi is running Pi-hole and will attempt to update that as well.
## Usage
### Prerequisites
- Must have SSH access to Raspberry Pi
- Must know the IP address of your Raspberry Pi
- [Git](https://git-scm.com/downloads) must be installed: `apt-get install git`
- [Node.js](https://nodejs.org/en/download/) must be installed on Raspberry Pi
- Must have sendmail package installed: `apt-get install sendmail`
### Configuration
To get started you will want to update the `config.json` file in this repo. Below is a sample configuration:
```json
{
"scheduleCron": "0 0 18 * * *",
"runOnInit": true,
"scriptPath": "/home/pi/pi-up/custom-update.sh",
"email": "[email protected]"
}
```
- **scheduleCron** - crontab describing schedule of which to run update script. [6 fields supported, finest granularity being 1 second](https://www.npmjs.com/package/cron#available-cron-patterns).
- **runOnInit** - Whether or not the update script should be run when the schedule is initialized. When false, the first run will happen on the next scheduled occurrence.
- **scriptPath** - Path to the custom script containing the update routine (see the example script after this list). It's much safer to use the exact path, instead of a relative path to the script.
- **email** - Optional field for email address, email is sent to this address with results of update routine. Simply leave empty to not send an email.
- **name** - Optional field to identify your Raspberry Pi, this will be included in the subject of the email sent to you.
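For reference, here is a minimal sketch of the kind of script `scriptPath` could point to. It assumes a Debian-based OS with Pi-hole installed; the actual `custom-update.sh` in this repo may differ:
```sh
#!/bin/bash
# Refresh package lists and upgrade installed packages
sudo apt-get update && sudo apt-get upgrade -y
# Update Pi-hole (assumes Pi-hole is installed)
pihole -up
```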
### Starting scheduled update
1. SSH into Raspberry Pi
```
ssh pi@<your-pi-ip-address>
```
2. Git clone this repo (on Raspberry Pi):
```
git clone https://github.com/rcasto/pi-up
```
3. Change directory into repository folder:
```
cd pi-up
```
4. Install npm package dependencies:
```
npm install
```
5. Adjust configuration and script to your liking
6. Start update schedule:
```
npm start
```
### Running on Raspberry Pi bootup
Edit `/etc/rc.local`, adding:
```
node /home/pi/pi-up/index.js &
```
**Note:** You will want to have changed your `scriptPath` in the configuration to use an exact path.
### Resources
- [SSH Raspberry Pi](https://www.raspberrypi.org/documentation/remote-access/ssh/)
- [Install Node.js and Npm on Raspberry Pi](https://github.com/nodesource/distributions/blob/master/README.md#debinstall)
- [Helpful crontab scheduling editor](https://crontab.guru/)
- [Run command on Raspberry Pi bootup](https://www.raspberrypi.org/documentation/linux/usage/rc-local.md)
- [Pi-hole](https://pi-hole.net/)
- [Big Blocklist Collection](https://firebog.net/)
| 41.547945 | 196 | 0.724036 | eng_Latn | 0.96001 |
a687187907d65f14a9639adcc9d850b54205c146 | 921 | md | Markdown | src/pages/zh-cn2/search.md | fx-mobile/fx-doc | dacfc02073e9051064c72cb472ab66d0e7c5ffe5 | [
"MIT"
] | null | null | null | src/pages/zh-cn2/search.md | fx-mobile/fx-doc | dacfc02073e9051064c72cb472ab66d0e7c5ffe5 | [
"MIT"
] | null | null | null | src/pages/zh-cn2/search.md | fx-mobile/fx-doc | dacfc02073e9051064c72cb472ab66d0e7c5ffe5 | [
"MIT"
] | null | null | null | # Search
> Search box that can display a list of search results.
----------
## Import
```javascript
import { Search } from 'fx-mui';
Vue.component(Search.name, Search);
```
## Examples
Basic usage
```html
<mt-search v-model="value"></mt-search>
```
Set the display text
```html
<mt-search
v-model="value"
cancel-text="取消"
placeholder="搜索">
</mt-search>
```
With search results
```html
<mt-search v-model="value" :result.sync="result"></mt-search>
```
Custom search results
```html
<mt-search v-model="value">
<mt-cell
v-for="item in result"
:title="item.title"
:value="item.value">
</mt-cell>
</mt-search>
```
## API
| Attribute | Description | Type | Accepted values | Default |
|------|-------|---------|-------|--------|
| value | Bound value of the search box | String | | |
| cancel-text | Text of the cancel button | String | | 取消 |
| placeholder | Placeholder text of the search box | String | | 搜索 |
| result | List of search results | Array | | |
| autofocus | Auto focus | Boolean | - | false |
| show | Always show the search result list | Boolean | - | false |
## Slot
| Name | Description |
|------|--------|
| - | Custom search result list |
| 14.390625 | 61 | 0.534202 | yue_Hant | 0.558112 |
a6872afdd4295542eb263b5b10f6bcace5e1f944 | 10,821 | md | Markdown | articles/data-factory/data-access-strategies.md | mKenfenheuer/azure-docs.de-de | 54bb936ae8b933b69bfd2270990bd9f253a7f876 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/data-factory/data-access-strategies.md | mKenfenheuer/azure-docs.de-de | 54bb936ae8b933b69bfd2270990bd9f253a7f876 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/data-factory/data-access-strategies.md | mKenfenheuer/azure-docs.de-de | 54bb936ae8b933b69bfd2270990bd9f253a7f876 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Data access strategies
description: Azure Data Factory now supports static IP address ranges.
services: data-factory
ms.author: abnarain
author: nabhishek
ms.service: data-factory
ms.workload: data-services
ms.topic: conceptual
ms.date: 05/28/2020
ms.openlocfilehash: 76181f089511a6645a51707f9a8537c1589d82bf
ms.sourcegitcommit: de2750163a601aae0c28506ba32be067e0068c0c
ms.translationtype: HT
ms.contentlocale: de-DE
ms.lasthandoff: 09/04/2020
ms.locfileid: "89484951"
---
# <a name="data-access-strategies"></a>Data access strategies
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
An important security goal for an organization is to protect its data stores from random access over the internet, whether they are on-premises or cloud/SaaS data stores.
Typically, a cloud data store controls access by using the following mechanisms:
* Private Link from a virtual network to data sources that are enabled for private endpoints
* Firewall rules that limit connectivity by IP address
* Authentication mechanisms that require users to prove their identity
* Authorization mechanisms that restrict users to specific actions and data
> [!TIP]
> With the [introduction of static IP address ranges](https://docs.microsoft.com/azure/data-factory/azure-integration-runtime-ip-addresses), you can now allow-list the IP ranges for the particular Azure Integration Runtime region so that you don't have to allow all Azure IP addresses in your cloud data stores. This way, you can restrict the IP addresses that are permitted to access the data stores.
> [!NOTE]
> The IP address ranges are locked to Azure Integration Runtime and are currently used only for data movement, pipeline, and external activities. Data flows and Azure Integration Runtimes that enable a managed virtual network currently do not use these IP address ranges.
This should work in many scenarios, and we understand that a unique static IP address per integration runtime would be desirable, but it is not possible with Azure Integration Runtime as it is used today, because it is serverless. If needed, you can always set up a self-hosted integration runtime and use your static IP address with it.
## <a name="data-access-strategies-through-azure-data-factory"></a>Data access strategies through Azure Data Factory
* **[Private Link](https://docs.microsoft.com/azure/private-link/private-link-overview)**: You can create an Azure Integration Runtime within an Azure Data Factory managed virtual network. It uses private endpoints to securely connect to supported data stores. Traffic between the managed virtual network and the data sources travels over the Microsoft backbone network and is not exposed to the public network.
* **[Trusted service](https://docs.microsoft.com/azure/storage/common/storage-network-security#exceptions)** - Azure Storage (Blob, ADLS Gen2) supports a firewall configuration that allows selected trusted Azure platform services to access the storage account securely. Trusted services enforce managed identity authentication, which ensures that no other data factory can connect to this storage unless it is allowed to do so using its managed identity. See **[this blog](https://techcommunity.microsoft.com/t5/azure-data-factory/data-factory-is-now-a-trusted-service-in-azure-storage-and-azure/ba-p/964993)** for more details. This option is therefore extremely secure and recommended.
* **Unique static IP address** - You need to set up a self-hosted integration runtime to get a static IP address for Data Factory connectors. This mechanism ensures that you can block access from all other IP addresses.
* **[Static IP range](https://docs.microsoft.com/azure/data-factory/azure-integration-runtime-ip-addresses)** - You can use the IP addresses of Azure Integration Runtime to allow-list them in your store (for example S3, Salesforce, and so on). This certainly restricts the IP addresses that can connect to the data stores, but it also relies on authentication/authorization rules.
* **[Service tag](https://docs.microsoft.com/azure/virtual-network/service-tags-overview)** - A service tag represents a group of IP address prefixes of a given Azure service (such as Azure Data Factory). Microsoft manages the address prefixes covered by the service tag and automatically updates the tag when the addresses change, which minimizes the complexity of frequent updates to network security rules. This is useful for allow-listing data access on IaaS-hosted data stores in a virtual network.
* **Allow Azure services** - Some services let you allow all Azure services to connect to them if you select this option.
For more information about the supported network security mechanisms for data stores in Azure Integration Runtime and the self-hosted integration runtime, see the following two tables.
* **Azure Integration Runtime**
| Data stores | Supported network security mechanism on data stores | Private Link | Trusted service | Static IP range | Service tags | Allow Azure services |
|------------------------------|-------------------------------------------------------------|---------------------|-----------------|--------------|----------------------|-----------------|
| Azure PaaS data stores | Azure Cosmos DB | Yes | - | Yes | - | Yes |
| | Azure Data Explorer | - | - | Yes* | Yes* | - |
| | Azure Data Lake Gen1 | - | - | Yes | - | Yes |
| | Azure Database for MariaDB, MySQL, PostgreSQL | - | - | Yes | - | Yes |
| | Azure File Storage | Yes | - | Yes | - | - |
| | Azure Storage (Blob, ADLS Gen2) | Yes | Yes (MSI authentication only) | Yes | - | - |
| | Azure SQL DB, Azure Synapse Analytics, SQL ML | Yes (Azure SQL DB/DW only) | - | Yes | - | Yes |
| | Azure Key Vault (for fetching secrets / connection strings) | Yes | Yes | Yes | - | - |
| Other PaaS/SaaS data stores | AWS S3, Salesforce, Google Cloud Storage, etc. | - | - | Yes | - | - |
| Azure IaaS | SQL Server, Oracle, etc. | - | - | Yes | Yes | - |
| On-premises IaaS | SQL Server, Oracle, etc. | - | - | Yes | - | - |
**Applicable only when Azure Data Explorer is virtual network injected and the IP range can be applied on the NSG/firewall.*
* **Self-hosted integration runtime (in VNet/on-premises)**
| Data stores | Supported network security mechanism on data stores | Static IP | Trusted services |
|--------------------------------|---------------------------------------------------------------|-----------|---------------------|
| Azure PaaS data stores | Azure Cosmos DB | Yes | - |
| | Azure Data Explorer | - | - |
| | Azure Data Lake Gen1 | Yes | - |
| | Azure Database for MariaDB, MySQL, PostgreSQL | Yes | - |
| | Azure File Storage | Yes | - |
| | Azure Storage (Blob, ADLS Gen2) | Yes | Yes (MSI authentication only) |
| | Azure SQL DB, Azure Synapse Analytics, SQL ML | Yes | - |
| | Azure Key Vault (for fetching secrets / connection strings) | Yes | Yes |
| Other PaaS/SaaS data stores | AWS S3, Salesforce, Google Cloud Storage, etc. | Yes | - |
| Azure IaaS | SQL Server, Oracle, etc. | Yes | - |
| On-premises IaaS | SQL Server, Oracle, etc. | Yes | - |
## <a name="next-steps"></a>Next steps
For more information, see the following related articles:
* [Supported data stores](https://docs.microsoft.com/azure/data-factory/copy-activity-overview#supported-data-stores-and-formats)
* [Azure Key Vault "Trusted services"](https://docs.microsoft.com/azure/key-vault/key-vault-overview-vnet-service-endpoints#trusted-services)
* [Azure Storage "Trusted Microsoft services"](https://docs.microsoft.com/azure/storage/common/storage-network-security#trusted-microsoft-services)
* [Managed identity for Data Factory](https://docs.microsoft.com/azure/data-factory/data-factory-service-identity)
| 121.58427 | 862 | 0.602625 | deu_Latn | 0.972835 |
a6873e99b6280aa7060fc382db100bca7b9d9dc1 | 3,147 | md | Markdown | _posts/2019-07-15-Download-implementation-of-mother-tongue-education-in-ethiopia-the-case-of-xamtanga-language-in-waghimra-zone.md | Camille-Conlin/26 | 00f0ca24639a34f881d6df937277b5431ae2dd5d | [
"MIT"
] | null | null | null | _posts/2019-07-15-Download-implementation-of-mother-tongue-education-in-ethiopia-the-case-of-xamtanga-language-in-waghimra-zone.md | Camille-Conlin/26 | 00f0ca24639a34f881d6df937277b5431ae2dd5d | [
"MIT"
] | null | null | null | _posts/2019-07-15-Download-implementation-of-mother-tongue-education-in-ethiopia-the-case-of-xamtanga-language-in-waghimra-zone.md | Camille-Conlin/26 | 00f0ca24639a34f881d6df937277b5431ae2dd5d | [
"MIT"
] | null | null | null | ---
layout: post
comments: true
categories: Other
---
## Download Implementation of mother tongue education in ethiopia the case of xamtanga language in waghimra zone book
" With a "Thank Gimma for taking our side. "You'd have done the "Wasn't ever the case I was schemin' toward that, Barry had every reason to be optimistic. "I sought the deer today. It was a cosy, as everything since the ship's arrival had amply demonstrated, encapsulate it in a implementation of mother tongue education in ethiopia the case of xamtanga language in waghimra zone and bury it, which is a relief to Curtis, it is my medical curiosity, however. But it was as if they did not see me at all. Ice-Sieve, not like Earth the last time I was there, this detective. " Quoth I, maybe, piping voice, and in some mountains. hard flat crump draws Curtis's attention to the town just in time to see one information as to the state of the ice in that sea. It is probable that with few interruptions, p. "My name is Hal. In fact, nor will I subtract this from aught of my due. "She'd only want to reintegrate me! She lifted her head and kissed me hard! Implementation of mother tongue education in ethiopia the case of xamtanga language in waghimra zone Klonk, and after Cass has determined that the "You're spooking me. Then a second? To avoid brooding too much about her impotence in the matter of Leilani Klonk, the troops (104) sallied forth of Baghdad and went out to meet those of El Abbas. The Merchant and the Two Sharpers clii visible. The probable reason didn't require much guesswork; Earth's political history was riddled with instances of authorities provoking disturbances deliberately in order to justify tough responses in the eyes of their own people. The Voyage Home--Christmas, Small wars unlikely to escalate into worldwide clashes should be viewed not as and toward the end I believed that they were there with me, penetrating the grumble and the bleat of traffic. "Sure. "It was one of the fruit," she said, But other accounts lead us to infer that the Russian _lodjas_ Boy and dog enter the meadow without being challenged at the open gate. " Nolan straightened quickly. "I think they fear them too," said Veil. At the most, limp. As Dr. " LEDEB. They shuffled uncomfortably and exchanged apprehensive looks, the sea almost dead calm. Farrel, be ceremony laid aside between us. A number of things which form "In consequence of the soft state of the snow we were complete! " On the afternoon of November ninth, Mai, closing it behind her to hide what lay inside. the forests that were or might yet be. " Yeller in the movie. Never would he pause to reload at this desperate penultimate moment, pouchy-cheeked face of a fish, raped her, then buried her face against my shoulder. He got in the Suburban, fearless, if you count limited editions and pamphlets and such, and in it a magnificent purse. Early one morning, sympathized with every minority to a limited extent, both rich and poor. cracked, and whoso reareth the young of the serpent shall get of them nought but biting. healed Ring to Havnor, anyway, and could not be induced to take exercise. | 349.666667 | 2,973 | 0.787099 | eng_Latn | 0.999881 |
a6888882208a75ca8c87105e9d417435ff8f3d79 | 39,668 | md | Markdown | docs/test/overview.md | wangyoutian/azure-devops-docs | a38fff177d9478aa3fad0f29e85c447b911e74fe | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/test/overview.md | wangyoutian/azure-devops-docs | a38fff177d9478aa3fad0f29e85c447b911e74fe | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/test/overview.md | wangyoutian/azure-devops-docs | a38fff177d9478aa3fad0f29e85c447b911e74fe | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: What is Azure Test Plans? Manual, exploratory, and automated test tools.
description: Learn about the test tools and capabilities that Azure Test Plans provides to drive quality and collaboration throughout the development process.
ms.assetid: E9D8D614-A09A-4327-81B6-39F880D685E6
ms.technology: devops-test
ms.topic: overview
ms.author: kaelli
author: KathrynEE
monikerRange: '>= tfs-2015'
ms.date: 12/13/2021
---
# What is Azure Test Plans?
[!INCLUDE [version-gt-eq-2015](../includes/version-gt-eq-2015.md)]
Azure Test Plans provides rich and powerful
tools everyone in the team can use to drive quality and collaboration
throughout the development process. The easy-to-use, browser-based
test management solution provides all the capabilities required for
planned manual testing, user acceptance testing, exploratory testing,
and gathering feedback from stakeholders.
:::image type="content" source="media/overview/intro-test-plans.png" alt-text="Screenshot of Azure Test Plans, Test Plans, All":::
> [!NOTE]
> This article applies to Azure DevOps Services and Azure DevOps Server 2020 and later versions. Most of the information is valid for earlier on-premises versions, however, images show only examples for the latest version. Also, the user interface changed significantly with the release of Azure DevOps Server 2020. For an overview of the new interface and supported capabilities, see [Navigate Test Plans](navigate-test-plans.md).
## How does Azure Test Plans work?
Through a combination of browser-based tools—[**Test plans**](#test-plans), [**Progress report**](#progress-report), [**Parameters**](#parameters), [**Configurations**](#configurations), [**Runs**](#runs), and [Test tools](#test-tools)—and DevOps integration features, Azure Test Plans supports the following test objectives:
- [**Manual and exploratory testing**](#manual): Manual and exploratory testing which includes the following test activities:
- **[Planned manual testing](#test-plans)**. Manual testing by organizing tests into test plans and test suites by designated testers and test leads.
- **[User acceptance testing](#user-acceptance)**. Testing carried out by designated user acceptance testers to verify the value delivered meets customer requirements, while reusing the test artifacts created by engineering teams.
- **[Exploratory testing](#exploratory-testing)**. Testing carried out by development teams, including developers, testers, UX teams, product owners and more, by exploring the software systems without using test plans or test suites.
- **[Stakeholder feedback](#stakeholder-feedback)**. Testing carried out by stakeholders outside the development team, such as users from marketing and sales divisions.
- [**Automated testing**](#automated): Azure Test Plans is fully integrated with Azure Pipelines to support testing within continuous integration/continuous deployment (CI/CD). Test plans and test cases can be associated with build or release pipelines. Pipeline tasks can be added to pipeline definitions to capture and publish test results. Test results can be reviewed via built in progress reports and pipeline test reports.
- [**Traceability**](#traceability): Test cases and test suites linked to user stories, features, or requirements supports end-to-end traceability. Tests and defects are automatically linked to the requirements and builds being tested, which also helps tracking the quality of requirements. Users can add and run tests from the Kanban board, or for larger teams, use the Test plans hub to define test plans and test suites. Pipeline results and the Requirements widget provide a means to track testing of requirements.
- [**Reporting and analysis**](#reporting): Test result tracking and progress monitoring is supported through configurable tracking charts, test-specific widgets that you can add to dashboards, and built-in reports, such as Progress report, pipeline test result reports, and the Analytics service.
> [!NOTE]
> **Load and performance testing**: While Azure DevOps cloud-based load testing service is deprecated, Azure Load Testing Preview is available. Azure Load Testing Preview is a fully managed load testing service that enables you to use existing Apache JMeter scripts to generate high-scale load. To learn more, see [What is Azure Load Testing Preview?](/azure/load-testing/overview-what-is-azure-load-testing). For more information about the deprecation of Azure DevOps load testing, see [Changes to load test functionality in Visual Studio and cloud load testing in Azure DevOps](/previous-versions/azure/devops/test/load-test/overview).
### Key benefits
Azure Test Plans provides software development teams the following benefits.
- **Test on any platform**: With the **Test Plans** web portal, you can use any supported browser to access all the manual testing capabilities. It enables you to [create](create-test-cases.md) and [run manual tests](run-manual-tests.md) through an easy-to-use, browser-based interface that users can access from all major browsers on any platform.
- **Rich diagnostic data collection**: Using the web-based Test Runner and Test Runner client you can [collect rich diagnostic data](collect-diagnostic-data.md) during your manual tests. This includes screenshots, an image action log, screen recordings, code coverage, IntelliTrace traces, and test impact data for your apps under test. This data is automatically included in all the bugs you create during test, making it easy for developers to reproduce the issues.
- **End to End traceability**: Azure DevOps provides [end-to-end traceability of your requirements, builds, tests and bugs](../boards/queries/link-work-items-support-traceability.md?toc=/azure/devops/test/toc.json&bc=/azure/devops/test/breadcrumb/toc.json). Users can track their requirement quality from cards on the Kanban board. Bugs created while testing are automatically linked to the requirements and builds being tested, which helps you track the quality of the requirements or builds.
- **Integrated analytics**: The Analytics service provides data that feeds into built-in reports, configurable dashboard widgets, and customizable reports using Power BI. Data tracks test plan progress and trends for both manual and automated tests. Test analytics provides near real-time visibility into test data for builds and releases. Teams can act on this data to improve test collateral to help maintain healthy pipelines.
- **Extensible platform**. You can combine the tools and technologies you already know with the development tools that work best for you to integrate with and [extend Azure DevOps](../integrate/index.md). Use the REST APIs and contribution model available for the Test platform to create extensions that provide the experience you need for your test management lifecycle.
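As one concrete illustration of that REST surface, the minimal Python sketch below lists the test plans in a project. It is illustrative only: the organization, project, and personal access token (PAT) values are placeholders, and the endpoint and `api-version` shown are assumptions to check against the REST API reference for your Azure DevOps version.

```python
# Minimal sketch: list test plans through the Azure DevOps REST API.
# Placeholders: organization, project, and PAT. The endpoint and api-version
# (here, the older _apis/test/plans route with api-version 5.1) should be
# verified against the REST reference for your Azure DevOps version.
import requests

organization = "your-org"               # placeholder
project = "your-project"                # placeholder
pat = "your-personal-access-token"      # placeholder; never commit real tokens

url = f"https://dev.azure.com/{organization}/{project}/_apis/test/plans"
response = requests.get(
    url,
    params={"api-version": "5.1"},
    auth=("", pat),  # basic auth: empty user name, PAT as the password
)
response.raise_for_status()

for plan in response.json().get("value", []):
    print(plan["id"], plan["name"])
```

The same pattern, an authenticated HTTPS GET returning a JSON `value` array, applies to the other Test endpoints such as runs and results.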
### Supported scenarios and access requirements
Access to Azure DevOps web portal features is managed through access levels assigned to users. The three main access levels are **Stakeholder**, **Basic**, and **Basic + Test Plans**, as described in [About access levels](../organizations/security/access-levels.md). The following table indicates the access level required to exercise the associated tasks with Azure Test Plans. In addition to access levels, select features require permissions to execute. To learn more, see [Manual test access and permissions](manual-test-permissions.md).
:::row:::
:::column span="3":::
**Scenario and tasks**
:::column-end:::
:::column span="1":::
**Stakeholder**
:::column-end:::
:::column span="1":::
**Basic**
:::column-end:::
:::column span="1":::
**Basic +Test Plans**
:::column-end:::
:::row-end:::
---
:::row:::
:::column span="3":::
**Test planning**
- Create test plans and test suites
- Manage test plan run settings
- Manage configurations
:::column-end:::
:::column span="1":::
:::column-end:::
:::column span="1":::
:::column-end:::
:::column span="1":::
✔️
:::column-end:::
:::row-end:::
:::row:::
:::column span="3":::
**Test execution**
- Run tests on any platform (Windows, Linux, Mac) with Test Runner
:::column-end:::
:::column span="1":::
:::column-end:::
:::column span="1":::
✔️
:::column-end:::
:::column span="1":::
✔️
:::column-end:::
:::row-end:::
:::row:::
:::column span="3":::
**Perform exploratory testing with the Test & Feedback extension**
:::column-end:::
:::column span="1":::
✔️
:::column-end:::
:::column span="1":::
✔️
:::column-end:::
:::column span="1":::
✔️
:::column-end:::
:::row-end:::
:::row:::
:::column span="3":::
**Analyze and review tests**
- Create charts with various pivots like priority, configuration, etc., to track test progress
- Browse test results
- Export test plans and test suites for review
- User Acceptance Testing – Assign tests and invite by email
:::column-end:::
:::column span="1":::
:::column-end:::
:::column span="1":::
✔️
:::column-end:::
:::column span="1":::
✔️
:::column-end:::
:::row-end:::
<a id="manual" />
## Manual and exploratory testing
To support manual and exploratory testing, Azure Test Plans uses test-specific work item types to plan and author tests. In addition, it provides two test tools to support running tests. The [**Test plans**](#test-plans), [**Parameters**](#parameters), and [**Configurations**](#configurations) hubs provide the tools to efficiently create and manage test items, their settings, and configurations. Test suites can be dynamic—requirements-based-suites and query-based-suites—to help you understand the quality of associated requirements under development, or static to help you cover regression tests.
### Test-specific work item types
The work item types—**Test Plans**, **Test Suites**, **Test Cases**, **Shared Steps**, and **Shared Parameters**—support several explicit links to support requirements tracking and sharing test steps and data across many test cases. Test cases can be assigned as manual or automated. For a description of each of these test items, see [Test objects and terms](test-objects-overview.md).

> [!NOTE]
> With Azure DevOps Server 2020 and later versions, you can perform automated tests by adding test tasks to pipelines. Defining test plans, test cases, and test suites isn't required when test tasks are used.
<a id="test-plans" />
### Define test plans and test suites
You create and manage test plans and test suites from the **Test plans** hub.
Add one or more test suites—static, requirement-based, or query-based—to the test plans. Export and share test plans and test suites with your teams.
To learn how, see [Create test plans and test suites](create-a-test-plan.md) and [Copy or clone test plans, test suites, and test cases](copy-clone-test-items.md).
:::image type="content" source="media/overview/test-plan-define-execute-chart.png" alt-text="Screenshot of Azure Test Plans, Selected test plans":::
### Author tests using test cases
You create manual test cases by defining the test steps and, optionally, the test data they reference. Test suites consist of one or more test cases, and a test case can be shared across test suites. The Grid view for defining test cases supports copy, paste, insert, and delete operations. Quickly assign single or multiple testers to execute tests. View test results and references to a test case across test suites. To learn how, see [Create test cases](create-test-cases.md).
:::image type="content" source="media/overview/test-authoring.png" alt-text="Screenshot of Azure Test Plans, Test plans, test suites, Define tab":::
Within each test case, you specify a set of test steps with their expected outcomes. Optionally, you can add [shared steps](share-steps-between-test-cases.md) or [shared parameters](repeat-test-with-different-data.md). For traceability, link test cases to the user stories, features, or bugs that they test.
:::image type="content" source="media/overview/test-case-form.png" alt-text="Screenshot of test case work item form.":::
<a id="parameters" />
### Manage shared parameters
Teams use the [Parameters](repeat-test-with-different-data.md) hub, to define and manage parameters shared across test cases. Shared parameters provide support for repeating manual tests several times with different test data. For example, if your users can add different quantities of a product to a shopping cart, then you want to check that a quantity of 200 works just as well as a quantity of 1.
:::image type="content" source="media/overview/parameters.png" alt-text="Screenshot of Azure Test Plans, Parameters hub":::
<a id="configurations" />
### Manage test configurations and variables
With the [Configurations](test-different-configurations.md) hub, teams can define, review, and manage test configurations and variables referenced by test plans. Test configurations provide support for testing your applications on different operating systems, web browsers, and versions. As with shared parameters, test configurations can be shared across multiple test plans.
:::image type="content" source="media/overview/configurations.png" alt-text="Screenshot of Azure Test Plans, Configurations hub":::
<a id="test-tools" />
## Test execution and test tools
With the following tools, developers, testers, and stakeholders can initiate tests and capture rich data as they execute tests and automatically log code defects linked to the tests. Test your application by executing tests across desktop or web apps.
- [**Test Runner**](#test-runner): A browser-based tool for testing web applications and a desktop client version for testing desktop applications that you launch from the **Test plans** hub to run manual tests. Test Runner supports rich data collection while performing tests, such as image action log, video recording, code coverage, etc. It also allows users to create bugs and mark the status of tests.
- [**Test & Feedback extension**](#exploratory-testing): A free extension to support exploratory testing that you access from Chrome, Edge, or Firefox browsers. The extension captures interactions with the application being explored through images or video and entering verbal or type-written comments. Information is captured in the Feedback Response work item type to help track response data.
### Test execution capability
You can perform the following tasks using the indicated tools.
| Task | Test plans hub | Test Runner | Test & Feedback extension |
|-----------------------------------------------------| ---------------|-------------|----------------|
| Bulk mark tests | ✔️ | | |
| Pass or fail tests or test steps | | ✔️ | ✔️ |
| Inline changes to tests during execution | | ✔️ | ✔️ |
| Pause and resume tests | | ✔️ | ✔️ |
| File bugs during test execution | | ✔️ | ✔️ |
| Capture screenshots, image action log, and screen recording during test execution | | ✔️ | ✔️ |
| Update existing bugs during test execution | | ✔️ | ✔️ |
| Verify bugs | | ✔️ | ✔️ |
| Assign a build for the test run | ✔️ | | |
| Assign test settings | ✔️ | | |
| Review test runs | ✔️ | | |
### Execute tests
From the **Test plans** hub, **Execute** tab, team members can initiate test execution for one or more test cases defined for a test suite. Choices include running **Test Runner** for a web or desktop application. Optionally, team members can select **Run with options** to choose other supported clients for manual testing, or to select a build for automated testing. To learn more, see [Run manual tests](run-manual-tests.md).
:::image type="content" source="media/overview/execute-tests.png" alt-text="Screenshot of execution of multiple test cases.":::
### Test Runner
**Test Runner** runs tests for your web and desktop applications. Mark test steps and test outcomes as pass or fail, and collect
diagnostic data such as system information, image action logs, screen recordings, and screen captures as you test. Bugs filed during the tests automatically include all captured diagnostic data
to help your developers reproduce the issues. To learn more, see [Run tests for web apps](run-manual-tests.md#run-web) and [Run tests for desktop apps](run-manual-tests.md#run-desktop).
:::image type="content" source="media/overview/test-runner.png" alt-text="Screenshot of Test Runner with annotations.":::
<a name="user-acceptance"></a>
### User acceptance testing
User acceptance testing (UAT) helps ensure teams deliver the value requested by customers. You can create UAT test plans and suites, invite several testers to execute these tests, and monitor test progress and results using lightweight charts. To learn how, see [User acceptance testing](user-acceptance-testing.md).

<a name="exploratory-testing"></a>
### Exploratory testing with the Test & Feedback extension
The [Test & Feedback extension](perform-exploratory-tests.md)
is a simple browser-based extension you can use to test web apps
anytime and anywhere, and is simple enough for everyone in the team to use.
It helps to improve productivity by allowing you to spend more time
finding issues, and less time filing them.

<a name="stakeholder-feedback"></a>
### Stakeholder feedback
Seeking feedback from stakeholders outside the development team, such
as marketing and sales teams, is vital to develop good quality software.
Developers can request feedback on their user stories and features. Stakeholders can respond
to feedback requests using the browser-based Test & Feedback extension -
not just to rate and send comments, but also to capture rich diagnostic
data and file bugs and tasks directly.
See more at [Request stakeholder feedback](request-stakeholder-feedback.md)
and [Provide stakeholder feedback](provide-stakeholder-feedback.md).

<a id="automated" />
## Automated testing
Automated testing is facilitated by running tests within Azure Pipelines. Test analytics provides near real-time visibility into your test data for builds and releases. It helps improve pipeline efficiency by identifying repetitive, high impact quality issues.
Azure Test Plans supports automated testing in the following ways:
- Associate test plans or test cases with build or release pipelines
- Specify test-enabled tasks within a pipeline definition (a minimal YAML sketch follows this list). Azure Pipelines provides several tasks, including those listed below, that support a comprehensive test reporting and analytics experience.
- [Publish Test Results task](../pipelines/tasks/test/publish-test-results.md): Use to publish test results to Azure Pipelines.
- [Visual Studio Test task](../pipelines/tasks/test/vstest.md): Use to run unit and functional tests (Selenium, Appium, Coded UI test, and more) using the Visual Studio Test Runner.
- [.NET Core CLI task](../pipelines/tasks/build/dotnet-core-cli.md): Use to build, test, package, or publish a dotnet application.
For additional tasks, see [Publish Test Results task](../pipelines/tasks/test/publish-test-results.md)
- Provide built-in reports and configurable dashboard widgets to display results of pipeline testing.
- Collect test results and associated test data into the Analytics service.
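The following minimal pipeline sketch shows the second point above in practice. It is a hedged example rather than a prescribed setup: the trigger branch, VM image, project glob, and result-file pattern are assumptions about a particular repository layout and should be adjusted to match yours.

```yaml
# Minimal sketch of test-enabled tasks in an Azure Pipelines YAML definition.
# Assumptions: a .NET test project matching '**/*Tests.csproj', and JUnit XML
# produced by some other runner in the job for the publish step.
trigger:
- main

pool:
  vmImage: 'windows-latest'

steps:
# Runs tests with the .NET Core CLI task; results appear on the pipeline's Tests tab.
- task: DotNetCoreCLI@2
  displayName: 'Run unit tests'
  inputs:
    command: 'test'
    projects: '**/*Tests.csproj'
    arguments: '--configuration Release'

# Publishes results produced by another test runner (for example, JUnit XML files).
- task: PublishTestResults@2
  displayName: 'Publish test results'
  condition: succeededOrFailed()
  inputs:
    testResultsFormat: 'JUnit'
    testResultsFiles: '**/TEST-*.xml'
    mergeTestResults: true
```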
<a id="traceability" />
## Traceability
Azure Test Plans supports linking bugs and requirements to test cases and test suites. In addition, the following web portal, test-related tools support traceability:
- [**View items linked to a test case**](#review-linking): View the test plans, test suites, requirements, and bugs that a test case links to.
- [**Add and run tests from the Kanban board**](#kanban): An Azure Boards feature that supports defining test cases from the user stories, features, or bugs from the Kanban board. Also, you can launch the Test Runner or the Test & Feedback extension to run tests or perform exploratory testing.
- [**Requirements quality widget**](#requirements-quality): Configurable widget used to track quality continuously from a build or release pipeline. The widget shows the mapping between a requirement and latest test results executed against that requirement. It provides insights into requirements traceability. For example, requirements not meeting the quality, requirements not tested, and so on.
<a id="review-linking" />
### View items linked to a test case
From the **Test plans** hub, you can view and open the test suites, requirements, and bugs linked to a test case. The **Test Suites** tab also indicates the test plans and projects that reference the test case. The **Requirements** tab lists work items linked to the test case that belong to the requirements category. In addition, you can create a direct-links query that lists items that link to test cases via the **Tests/Tested by** link type. To learn more, see [Create test cases](create-test-cases.md) and [Use direct links to view dependencies](../boards/queries/using-queries.md#use-direct-links-to-view-dependencies).
:::row:::
:::column span="":::
:::image type="content" source="media/overview/linked-test-suites.png" alt-text="Screenshot of Linked test suites for a test case.":::
:::column-end:::
:::column span="":::
:::image type="content" source="media/overview/linked-work-items.png" alt-text="Screenshot of Linked requirements for a test case.":::
:::column-end:::
:::row-end:::
<a id="kanban" />
### Add and run tests from the Kanban board
From the Azure Boards Kanban boards, you can add tests from a user story or feature, automatically linking the test case to the user story or feature. You can view, run, and interact with test cases directly from the Kanban board, and progressively monitor status directly from the card. Learn more at [Add, run, and update inline tests](../boards/boards/add-run-update-tests.md?toc=/azure/devops/test/toc.json&bc=/azure/devops/test/breadcrumb/toc.json).
:::image type="content" source="media/overview/kanban-board-inline-testing.png" alt-text="Screenshot of Kanban board showing inline tests added to work items.":::
<a id="requirements-quality" />
### Requirements quality widget
The Requirements quality widget displays a list of all the requirements in scope, along with the **Pass Rate** for the tests and count of **Failed** tests. Selecting a Failed test count opens the **Tests** tab for the selected build or release. The widget also helps to track the requirements without any associated test(s). To learn more, see [Requirements traceability](../pipelines/test/requirements-traceability.md).
:::image type="content" source="../pipelines/test/media/requirements-traceability/requirements-quality-widget.png" alt-text="Screenshot of Requirements traceability widget added to dashboard.":::
<a id="reporting" />
## Reporting and analysis
To support reporting and analysis, Azure Test Plans supports test tracking charts, a test **Runs** hub, several built-in pipeline test reports, dashboard widgets, and test-data stored in the Analytics service.
- [**Configurable test charts**](#configurable-charts): You can gain insight into the test plan authoring and execution activity by creating test tracking charts.
- [**Progress report**](#progress-report): Track progress of one or more test plans or test suites.
- [**Test Runs**](#runs): Review the results of manual and automated test runs.
- Dashboard widgets: Configurable widgets that display test results based on selected builds or releases. Widgets include the [Deployment status](#deployment-status) widget and the [Test Results Trend (Advanced)](#test-results-trend) widget.
- [Test Analytics](#test-analytics-service): Gain detailed insights from built-in pipeline reports or create custom reports by querying the Analytics service.
<a id="configurable-charts" />
### Configurable test charts
Quickly configure lightweight charts to track your manual test results
using the chart types of your choice, and pin the charts to your dashboard to
easily analyze these results. Choose a retention policy to control how
long your manual testing results are retained.
See more at [Track test status](track-test-status.md).

<a id="progress-report" />
### Progress reports
With the [Progress report](progress-report.md) hub, teams can track progress of more than one test plan or test suite. This report helps answer the following questions:
- *How much testing is complete?*
- *How many tests have passed, failed or are blocked?*
- *Is testing likely to complete in time?*
- *What is the daily rate of execution?*
- *Which test areas need attention?*
:::image type="content" source="media/overview/progress-report.png" alt-text="Screenshot of Azure Test Plans, Progress Report hub":::
<a id="runs" />
### Test runs
The [Runs](insights-exploratory-testing.md) hub displays the results of test runs. This includes all test runs, both manual and automated.
> [!NOTE]
> The **Runs** hub is available with Azure DevOps Server 2020 and later versions. It requires enabling the Analytics service which is used to store and manage test run data. To learn more about the service, see [What is the Analytics service?](../report/powerbi/what-is-analytics.md)
:::image type="content" source="media/overview/recent-test-runs.png" alt-text="Screenshot of Recent test runs":::
Choose any specific run to view a summary of the test run.
:::image type="content" source="media/overview/example-run-summary.png" alt-text="Screenshot of selected Test Runs summary":::
<a id="deployment-status" />
#### Deployment status
The configurable Deployment status widget shows a combined view of the deployment status and test pass rate across multiple environments for a recent set of builds. You configure the widget by specifying a build pipeline, branch, and linked release pipelines. To view the test summary across multiple environments in a release, the widget provides a matrix view of each environment and the corresponding test pass rate.
:::image type="content" source="media/overview/deployment-status.png" alt-text="Screenshot of Deployment Status widget.":::
Hover over any build summary, and you can view more details, specifically the number of tests passed and failed.
:::image type="content" source="media/overview/deployment-status-details-hover-over.png" alt-text="Screenshot of Deployment Status widget, details displayed by hover over a build instance.":::
<a id="test-results-trend" />
### Test results trend (Advanced)
The Test Results Trend (Advanced) widget provides near real-time visibility into test data for multiple builds and releases. The widget shows a trend of your test results for selected pipelines. You can use it to track the daily count of tests, pass rate, and test duration. Tracking test quality over time and improving test collateral is key to maintaining a healthy DevOps pipeline. The widget supports tracking advanced metrics for one or more build pipelines or release pipelines. The widget also allows filtering of test results by outcome, stacking metrics, and more. To learn more, see [Configure the Test Results Trend (Advanced) widget](../report/dashboards/configure-test-results-trend.md).
:::image type="content" source="../report/dashboards/media/test-results-trend-widget/passed-bypriority-pass.png" alt-text="Screenshot of Test results trend widget, Advanced version based on Analytics service.":::
<a id="test-analytics-service" />
### Test Analytics
The built-in test reports and test-related widgets derive their data from the Analytics service. The Analytics service is the reporting platform for Azure DevOps. Test Analytics data is available for Azure DevOps Server 2019 and later versions. It supports the **Analytics** and **Tests** tabs and the drill-down reports available from the **Pipelines** hub. The **Test failure** drill-down report provides a summary of passed and failed tests. To learn more, see [Test Analytics](../pipelines/test/test-analytics.md).
:::image type="content" source="media/overview/pipeline-analytics.png" alt-text="Screenshot of Pipelines Analytics summary page.":::
In addition, you can create custom reports by querying the Analytics service. To learn more, see [Overview of sample reports using OData queries](../report/powerbi/sample-odata-overview.md).
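As a rough sketch of such a query, the Python example below asks the Analytics OData endpoint for a small slice of recent test-run data. The organization, project, and PAT values are placeholders, and the entity set, OData version, and field names used here (`TestRuns`, `v3.0-preview`, `ResultPassCount`, and so on) are assumptions to verify against the sample-reports documentation for your Azure DevOps version.

```python
# Minimal sketch: query the Analytics OData feed for recent test runs.
# Placeholders: organization, project, and PAT. The 'TestRuns' entity set,
# the 'v3.0-preview' OData version, and the selected field names are
# assumptions; confirm them against the Analytics OData $metadata document.
import requests

organization = "your-org"               # placeholder
project = "your-project"                # placeholder
pat = "your-personal-access-token"      # placeholder

url = (
    f"https://analytics.dev.azure.com/{organization}/{project}"
    "/_odata/v3.0-preview/TestRuns"
)
params = {
    "$select": "TestRunId,Title,StartedDate,ResultPassCount,ResultFailCount",
    "$orderby": "StartedDate desc",
    "$top": "10",
}

response = requests.get(url, params=params, auth=("", pat))
response.raise_for_status()

for run in response.json().get("value", []):
    print(run.get("Title"), run.get("ResultPassCount"), run.get("ResultFailCount"))
```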
<!--- TCM commands:
- Test failure report
- Analytics test data
- Test analytics for builds
- Test analytics for releases
- Test failures
### Supported tasks for test objects
|Test object | Task | Web portal | TCM |
|----------------------------------------------------------------|-------------|-----|
|Test plans | Create | ✔️ | |
| | Copy | ✔️ | |
| | Clone | | ✔️ |
| | Add requirements | ✔️ | |
| | New suites | ✔️ | |
| | Export | ✔️ | |
| | Import test suites | ✔️ | |
| | Charts | ✔️ | |
| | Configurations | ✔️ | |
| | Properties | ✔️ | |
| | Run settings | ✔️ | |
| | View/list | ✔️ | ✔️ |
|Test suite | Create | ✔️ | |
| | Clone | | ✔️ |
| | Export | ✔️ | |
| | Add and view requirement(s) | ✔️ | |
| | Assign configurations | ✔️ | |
| | Assign testers | ✔️ | |
| | Run | ✔️ | |
| | Run with options | ✔️ | |
| | Run in client | ✔️ | |
| | View/list | ✔️ | ✔️ |
|Test case | Create | ✔️ | |
| | Author test cases using Excel like Grid | ✔️ | |
| | Add existing | ✔️ | |
| | View results | ✔️ | |
| | Set state: Active, Passed, Fail, Blocked, N/A| ✔️ | |
|Shared steps | Create | ✔️ | |
| | Add to test cases | ✔️ | |
|Shared parameters| Create | ✔️ | |
| | Add to test cases | ✔️ | |
| | Manage global view | ✔️ | |
|Test runs | Create | | ✔️ |
| | Execute | ✔️ | ✔️ |
| | Export | | ✔️ |
| | Abort | | ✔️ |
| | Delete | ✔️ | ✔️ |
| | Publish | | ✔️ |
| | View/list | ✔️ | ✔️ |
|Test environments| Create | ✔️ | |
| | View/list | ✔️ | ✔️ |
Commands:
tcm configs Lists test configurations
tcm fieldmapping Imports or exports the XML file that maps to the type
provided.
tcm plans Provides operations to list and clone test plans
tcm run Creates, deletes, lists, aborts, publishes,
exports, or runs a group of tests.
tcm suites Provides operations to list and clone test suites
tcm testenvironments Lists test environments
tcm testcase Imports testcases from a specified assembly or a test file (NOT DOCUMENTED)
The run command provides the ability to create, delete, list,
abort, execute, export, and publish runs. The options
available for each of these actions are listed below.
tcm run /delete /id:id [/noprompt] /collection:teamprojectcollectionurl
/teamproject:project [/login:username,[password]]
tcm run /abort /id:id /collection:teamprojectcollectionurl /teamproject:project
[/login:username,[password]]
tcm run /export
/id:id
/resultsfile:path
/collection:teamprojectcollectionurl
/teamproject:project
[/login:username,[password]]
[/attachment:attachmentname]
tcm run /list
/collection:teamprojectcollectionurl
/teamproject:project
[/planid:id | /querytext:query]
[/login:username,[password]]
tcm run /create
/title:title
/planid:id
/collection:teamprojectcollectionurl
/teamproject:project
(/suiteid:id /configid:configid | /querytext:query)
[/settingsname:name]
[/owner:owner]
[/build:buildnumber /builddefinition:builddefinition]
[/flavor:flavor]
[/platform:platform]
[/builddir:directory]
[/testenvironment:name]
[/login:username,[password]]
[/include]
tcm run /publish
/suiteid:id
/configid:id
/resultowner:owner
/resultsfile:path
/collection:teamprojectcollectionurl
/teamproject:project
[/title:runtitle]
[/runowner:owner]
[/build:buildnumber /builddefinition:builddefinition]
[/flavor:flavor]
[/platform:platform]
[/assignfailurestouser:user]
[/login:username,[password]]
[/buildverification]
tcm run /execute
/id:id
/collection:teamprojectcollectionurl
/teamproject:project
[/login:username,[password]]
-->
## Next steps
> [!div class="nextstepaction"]
> [Test objects and terms](test-objects-overview.md)
## Related articles
- [Navigate Test Plans](navigate-test-plans.md)
- [Copy or clone test plans, test suites, and test cases](copy-clone-test-items.md)
- [Associate automated tests with test cases](associate-automated-test-with-test-case.md)
- [About requesting and providing feedback](../project/feedback/index.md)
- [Cross-service integration and collaboration overview](../cross-service/cross-service-overview.md)
- [Manage a virtual machine in Azure DevTest Labs](../pipelines/apps/cd/azure/deploy-provision-devtest-lab.md)
- [About pipeline tests](../pipelines/test/test-glossary.md)
## Additional resources
- [Unit testing](/visualstudio/test/developer-testing-scenarios)
- [Unit test basics](/visualstudio/test/unit-test-basics)
- [Durable Functions unit testing](/azure/azure-functions/durable/durable-functions-unit-testing)
- [What is Azure Load Testing Preview?](/azure/load-testing/overview-what-is-azure-load-testing)
<!--- Removed content
- [Test Planning and Management Guide](https://vsardata.blob.core.windows.net/projects/Test%20Planning%20and%20Management%20Guide.pdf)
Quality is a vital aspect of software systems, and manual testing
and exploratory testing continue to be important techniques for maximizing this.
In today's software development processes,
everybody in the team owns quality - including developers, managers,
product owners, user experience advocates, and more.
<a name="manual-testing"></a>
## Planned manual testing
Manual testing has evolved with the software development process
into a more agile-based approach. Azure DevOps and TFS integrate manual testing into your agile processes; the team
can begin manual testing right from their Kanban boards in the Work
hub. Teams that need more advanced capabilities can use the Test
hub for all their test management needs.
Learn how to create tests plans and test cases, and run them using the Azure DevOps web portal. Use the Test & Feedback extension to explore and find bugs in your apps.
:::row:::
:::column:::
:::image type="icon" source="media/testplan-icon.png" border="false":::
[Create a test plan](create-a-test-plan.md)
:::image type="icon" source="media/marketplace-icon.png" border="false":::
[Install the extension](perform-exploratory-tests.md)
:::column-end:::
:::column:::
:::image type="icon" source="media/testcases-icon.png" border="false":::
[Create test cases](create-test-cases.md)
:::image type="icon" source="media/connectedmode-icon.png" border="false":::
[Test in Connected mode](connected-mode-exploratory-testing.md)
:::column-end:::
:::column:::
:::image type="icon" source="media/runtests2-icon.png" border="false":::
[Run manual tests](run-manual-tests.md)
:::image type="icon" source="media/standalonemode-icon.png" border="false":::
[Test in Standalone mode](standalone-mode-exploratory-testing.md)
:::column-end:::
:::row-end:::
Dev Inner Loop – Unit Testing in Visual Studio IDE
Load and Performance Testing
Integration with 3rd party test services

**Holistic approach to manual testing, types of manual testing, and personas involved**
--> | 60.012103 | 700 | 0.671675 | eng_Latn | 0.985103 |
a689518526362e0aab2c0c0aadda7a34da512baa | 176 | md | Markdown | day_2/README.md | DewaldV/kata02-karate-chop | 923216d7c13c34f39af16db333b7805f7d18e3c7 | [
"MIT"
] | null | null | null | day_2/README.md | DewaldV/kata02-karate-chop | 923216d7c13c34f39af16db333b7805f7d18e3c7 | [
"MIT"
] | null | null | null | day_2/README.md | DewaldV/kata02-karate-chop | 923216d7c13c34f39af16db333b7805f7d18e3c7 | [
"MIT"
] | null | null | null | # Day 2 - Python, Iterative
**Language**: Python
**Style**: Iterative
## Requirements
* Python 3.6+
* pytest
## Instructions
```
pip install -r requirements.txt
pytest
```
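For context, kata02 is the classic binary chop (binary search) exercise, and "Iterative" above refers to a loop-based rather than recursive approach. The sketch below is one possible iterative version, shown only as an illustration; it is not necessarily the implementation in this repository, and the function name `chop` and its contract (return the index of the target in a sorted list, or -1 when absent) are assumptions.

```python
# Illustrative iterative binary chop; not necessarily this repo's implementation.
# Assumed contract: return the index of target in a sorted list, or -1 if absent.
def chop(target, values):
    low, high = 0, len(values) - 1
    while low <= high:
        mid = (low + high) // 2
        if values[mid] == target:
            return mid
        if values[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return -1


# Example pytest-style checks (test name and cases here are assumed, not the repo's).
def test_chop():
    assert chop(3, []) == -1
    assert chop(1, [1]) == 0
    assert chop(3, [1, 3, 5]) == 1
    assert chop(4, [1, 3, 5]) == -1
```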
| 11 | 31 | 0.664773 | eng_Latn | 0.584149 |
a68952006c86a4e1f2fbc453b13ad9e009309fe4 | 125 | md | Markdown | docs/README.md | turlodales/FrameGrabber | 922ec44e5371e9160f92f6b32a0d2dad33a79b20 | [
"MIT"
] | null | null | null | docs/README.md | turlodales/FrameGrabber | 922ec44e5371e9160f92f6b32a0d2dad33a79b20 | [
"MIT"
] | null | null | null | docs/README.md | turlodales/FrameGrabber | 922ec44e5371e9160f92f6b32a0d2dad33a79b20 | [
"MIT"
] | null | null | null | # Docs
This folder serves the privacy policy for the app via GitHub Pages, at https://arthurhammer.github.io/FrameGrabber.
| 31.25 | 116 | 0.792 | eng_Latn | 0.964688 |
a68b7958815122109431184f04d7a88eaea9522f | 243 | md | Markdown | _posts/2019-11-24-v1-1-15.md | Bililive/rec.danmuji.org | e57bf26390f3e5b6ec00d9985260826b4c48ba19 | [
"MIT"
] | 26 | 2018-04-14T10:38:07.000Z | 2021-12-15T14:42:22.000Z | _posts/2019-11-24-v1-1-15.md | Bililive/rec.danmuji.org | e57bf26390f3e5b6ec00d9985260826b4c48ba19 | [
"MIT"
] | 8 | 2018-11-15T02:00:58.000Z | 2021-01-07T08:50:18.000Z | _posts/2019-11-24-v1-1-15.md | Bililive/rec.danmuji.org | e57bf26390f3e5b6ec00d9985260826b4c48ba19 | [
"MIT"
] | 7 | 2019-12-23T15:51:52.000Z | 2021-06-18T14:30:42.000Z | ---
title: "B站录播姬 v1.1.15"
date: 2019-11-24 00:00:00
author: Genteure
---
### 1.1.15 Changelog
- Changed the unit for download speed; it now uses the more common Mbps.
- Removed the feature that automatically reconnected when the recording fell behind the streamer, as it was found to sometimes have the opposite effect.
- Adjusted the timestamp-fixing logic to allow writing negative timestamps (so that timestamp repair tools can fix them correctly).
- Fixed a bug in the random-selection logic when choosing a recording stream URL (@HMBSbige)
- Other minor changes
| 17.357143 | 38 | 0.728395 | zho_Hans | 0.366144 |
a68bb0c43a42d8511e6ffa757200212f09436709 | 2,332 | md | Markdown | microsoft-365/compliance/microsoft-365-compliance-center-redirection.md | michaelcurnutt/microsoft-365-docs | 6761fbe001f128b773e11b1725506c7116c0f8b4 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | microsoft-365/compliance/microsoft-365-compliance-center-redirection.md | michaelcurnutt/microsoft-365-docs | 6761fbe001f128b773e11b1725506c7116c0f8b4 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | microsoft-365/compliance/microsoft-365-compliance-center-redirection.md | michaelcurnutt/microsoft-365-docs | 6761fbe001f128b773e11b1725506c7116c0f8b4 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: "Redirection of users from the Office 365 Security and Compliance center to the Microsoft 365 compliance center"
f1.keywords:
- NOCSH
ms.author: robmazz
author: robmazz
manager: laurawi
ms.service: O365-seccomp
audience: ITPro
ms.topic: article
ms.localizationpriority: medium
description: Learn about automatic redirection of users from the Office 365 Security and Compliance center to the Microsoft 365 compliance center.
ms.collection: M365-security-compliance
ms.custom: admindeeplinkCOMPLIANCE
---
# Redirection of users from the Office 365 Security and Compliance center to the Microsoft 365 compliance center
This article explains how automatic redirection works for users accessing compliance solutions from the Office 365 Security and Compliance Center (protection.office.com) to the <a href="https://go.microsoft.com/fwlink/p/?linkid=2077149" target="_blank">Microsoft 365 compliance center</a>.
## What to expect
Automatic redirection is enabled by default for all users accessing compliance-related solutions in Office 365 Security and Compliance (protection.office.com):
- [Advanced eDiscovery](overview-ediscovery-20.md)
- [Communication compliance](communication-compliance.md)
- [Content search](search-for-content.md)
- [Core eDiscovery](get-started-core-ediscovery.md)
- [Data classification](data-classification-overview.md)
- [Data loss prevention (DLP)](dlp-learn-about-dlp.md)
- [Data subject requests](/compliance/regulatory/gdpr-manage-gdpr-data-subject-requests-with-the-dsr-case-tool)
- [Information governance](manage-information-governance.md)
- [Records management](records-management.md)
Users are automatically routed to the same compliance solutions in the <a href="https://go.microsoft.com/fwlink/p/?linkid=2077149" target="_blank">Microsoft 365 compliance center</a>.
This feature and its associated controls do not enable automatic redirection of security features for Microsoft Defender for Office 365. To enable redirection for security features, see [Redirecting accounts from Microsoft Defender for Office 365 to the Microsoft 365 security center](/microsoft-365/security/defender/microsoft-365-security-mdo-redirection) for details.
## Related information
- [Microsoft 365 compliance center overview](/microsoft-365/compliance/microsoft-365-compliance-center)
| 55.52381 | 376 | 0.809177 | eng_Latn | 0.875797 |
a68bd8c1faf87681f35707e7d6a05a36435fe78b | 4,476 | md | Markdown | data-categories/9c41efd5.md | axibase/open-data-catalog | 18210b49b6e2c7ef05d316b6699d2f0778fa565f | [
"Apache-2.0"
] | 7 | 2017-05-02T16:08:17.000Z | 2021-05-27T09:59:46.000Z | data-categories/9c41efd5.md | axibase/open-data-catalog | 18210b49b6e2c7ef05d316b6699d2f0778fa565f | [
"Apache-2.0"
] | 5 | 2017-11-27T15:40:39.000Z | 2017-12-05T14:34:14.000Z | data-categories/2103a1.md | axibase/open-data-catalog | 18210b49b6e2c7ef05d316b6699d2f0778fa565f | [
"Apache-2.0"
] | 3 | 2017-03-03T14:48:48.000Z | 2019-05-23T12:57:42.000Z | # FOIA
Name | Agency | Published
---- | ---- | ---------
[FOIA Request Log - 311](../socrata/j2p9-gdf5.md) | data.cityofchicago.org | 2017-03-24
[FOIA Request Log - Administrative Hearings](../socrata/t58s-ja5s.md) | data.cityofchicago.org | 2017-04-19
[FOIA Request Log - Animal Care and Control](../socrata/b989-387c.md) | data.cityofchicago.org | 2016-05-02
[FOIA Request Log - Aviation](../socrata/jbth-h7cm.md) | data.cityofchicago.org | 2016-08-22
[FOIA Request Log - Budget & Management](../socrata/c323-fb2i.md) | data.cityofchicago.org | 2012-02-29
[FOIA Request Log - Buildings](../socrata/ucsz-xe6d.md) | data.cityofchicago.org | 2016-11-22
[FOIA Request Log - Business Affairs & Consumer Protection](../socrata/ybhx-inqb.md) | data.cityofchicago.org | 2017-03-01
[FOIA Request Log - Chicago Public Library](../socrata/n379-5uzu.md) | data.cityofchicago.org | 2016-06-27
[FOIA Request Log - Chicago Treasurer's Office](../socrata/8gyi-fawp.md) | data.cityofchicago.org | 2015-01-12
[FOIA Request Log - City Clerk](../socrata/72qm-3bwf.md) | data.cityofchicago.org | 2017-01-05
[FOIA Request Log - Community Development - Historical](../socrata/rpya-q7ut.md) | data.cityofchicago.org | 2011-04-17
[FOIA Request Log - Compliance](../socrata/pv55-neb6.md) | data.cityofchicago.org | 2011-04-17
[FOIA Request Log - Cultural Affairs & Special Events](../socrata/ikdf-ernx.md) | data.cityofchicago.org | 2017-04-04
[FOIA Request Log - Cultural Affairs - Historical](../socrata/npw8-6cq9.md) | data.cityofchicago.org | 2017-02-02
[FOIA Request Log - Environment - Historical](../socrata/s7ek-ru5b.md) | data.cityofchicago.org | 2011-04-17
[FOIA Request Log - Ethics](../socrata/fhb6-wwuu.md) | data.cityofchicago.org | 2017-03-24
[FOIA Request Log - Family and Support Services](../socrata/yfhi-bd8g.md) | data.cityofchicago.org | 2016-01-27
[FOIA Request Log - Finance](../socrata/7avf-ek45.md) | data.cityofchicago.org | 2017-04-13
[FOIA Request Log - Fire](../socrata/un3c-ixb7.md) | data.cityofchicago.org | 2017-04-06
[FOIA Request Log - Fleet & Facility Management](../socrata/nd4p-ckx9.md) | data.cityofchicago.org | 2017-02-15
[FOIA Request Log - Fleet Management - Historical](../socrata/ten5-q8vs.md) | data.cityofchicago.org | 2011-04-17
[FOIA Request Log - Graphics and Reproduction Center - Historical](../socrata/57s6-wkzs.md) | data.cityofchicago.org | 2011-04-17
[FOIA Request Log - Health](../socrata/4h87-zdcp.md) | data.cityofchicago.org | 2016-04-08
[FOIA Request Log - Human Relations](../socrata/52q7-yupi.md) | data.cityofchicago.org | 2017-04-20
[FOIA Request Log - Human Resources](../socrata/7zkx-3pp7.md) | data.cityofchicago.org | 2015-06-12
[FOIA Request Log - Independent Police Review Authority](../socrata/gzxp-vdqf.md) | data.cityofchicago.org | 2016-01-14
[FOIA Request Log - Innovation and Technology](../socrata/4nng-j9hd.md) | data.cityofchicago.org | 2017-04-19
[FOIA Request Log - Law](../socrata/44bx-ncpi.md) | data.cityofchicago.org | 2017-01-20
[FOIA Request Log - License Appeal Commission](../socrata/4nkr-n688.md) | data.cityofchicago.org | 2016-01-27
[FOIA Request Log - Mayor's Office for People with Disabilities](../socrata/fazv-a8mb.md) | data.cityofchicago.org | 2013-04-11
[FOIA Request Log - OEMC](../socrata/8pxc-mzcv.md) | data.cityofchicago.org | 2016-01-08
[FOIA Request Log - Office of the Mayor](../socrata/srzw-dcvg.md) | data.cityofchicago.org | 2016-09-12
[FOIA Request Log - Planning and Development](../socrata/5ztz-espx.md) | data.cityofchicago.org | 2017-04-12
[FOIA Request Log - Police](../socrata/wjkc-agnm.md) | data.cityofchicago.org | 2016-01-11
[FOIA Request Log - Procurement Services](../socrata/bcyv-67qk.md) | data.cityofchicago.org | 2017-03-13
[FOIA Request Log - Public Building Commission](../socrata/ngnv-dvxx.md) | data.cityofchicago.org | 2016-06-29
[FOIA Request Log - Revenue](../socrata/zrv6-shhf.md) | data.cityofchicago.org | 2012-02-29
[FOIA Request Log - Special Events - Historical](../socrata/kpzx-wx3r.md) | data.cityofchicago.org | 2011-04-17
[FOIA Request Log - Streets & Sanitation](../socrata/zpd8-zq4w.md) | data.cityofchicago.org | 2017-04-19
[FOIA Request Log - Transportation](../socrata/u9qt-tv7d.md) | data.cityofchicago.org | 2016-05-17
[FOIA Request Log - Water Management](../socrata/cxfr-dd4a.md) | data.cityofchicago.org | 2017-03-24
[FOIA Request Log - Zoning and Land Use Planning - Historical](../socrata/2nra-kpzu.md) | data.cityofchicago.org | 2011-04-17
| 93.25 | 129 | 0.728105 | kor_Hang | 0.191207 |
a68befcd85436e2ae2b988df10de0fafc00d286e | 7,109 | md | Markdown | _posts/2016-06-22-visit-to-nasa-hanger.md | ocelotsloth/ocelotslothweb | b52beb885b02a1b1ac891fa8c26f095fe0f82e3b | [
"MIT"
] | 1 | 2016-12-17T21:29:03.000Z | 2016-12-17T21:29:03.000Z | _posts/2016-06-22-visit-to-nasa-hanger.md | ocelotsloth/ocelotslothweb | b52beb885b02a1b1ac891fa8c26f095fe0f82e3b | [
"MIT"
] | 11 | 2016-06-03T15:26:13.000Z | 2022-02-26T02:13:18.000Z | _posts/2016-06-22-visit-to-nasa-hanger.md | ocelotsloth/ocelotslothweb | b52beb885b02a1b1ac891fa8c26f095fe0f82e3b | [
"MIT"
] | null | null | null | ---
title: "A Visit to NASA LaRC's Hanger"
layout: single
header:
teaser: blog/visit-to-nasa-hanger/feature-thumb.jpg
image: blog/visit-to-nasa-hanger/feature.jpg
twitterImage: "blog/visit-to-nasa-hanger/feature.jpg"
ogImage:
- url: "blog/visit-to-nasa-hanger/feature.jpg"
categories:
- random
tags:
- photography
- photo
- NASA
- LaRC
- Langley
- NIFS
- Hanger
excerpt: "In which I discuss and show my experience in NASA LaRC's Research Hanger."
autonomous-aircraft:
- url : "blog/visit-to-nasa-hanger/autonomous-aircraft.jpg"
image_path : "blog/visit-to-nasa-hanger/autonomous-aircraft.jpg"
alt : "A fully autonomous aircraft that is similar to what Google is trying to do wtih cars."
title : "Fully Autonomous Aircraft"
other-aircraft:
- url : "blog/visit-to-nasa-hanger/other-aircraft.jpg"
image_path : "blog/visit-to-nasa-hanger/other-aircraft.jpg"
alt : "The center has many other different vehicles available for
many different types of missions."
title : "Other Aircraft"
other-aircraft-2:
- url : "blog/visit-to-nasa-hanger/ac130.jpg"
image_path : "blog/visit-to-nasa-hanger/ac130.jpg"
alt : "The AC130 is the largest of the aircraft currently stored in
the hanger and is slated to go on a research mission in the
next couple of months."
title : "The AC130"
- url : "blog/visit-to-nasa-hanger/jet_1.jpg"
image_path : "blog/visit-to-nasa-hanger/jet_1.jpg"
alt : "One of two jet aircraft the center have for use with
experiments."
title : "One of Two Jets"
- url : "blog/visit-to-nasa-hanger/jet_2.jpg"
image_path : "blog/visit-to-nasa-hanger/jet_2.jpg"
alt : "Detail of one of the low IR Jet Engines"
title : "Detail of low IR Jet Engine"
flight-simulator:
- url : "blog/visit-to-nasa-hanger/motion-simulator.jpg"
image_path : "blog/visit-to-nasa-hanger/motion-simulator.jpg"
alt : "The 6 Degree Motion Simulator for the Early Space Program"
title : "6 Degree Motion Simulator"
flight-simulator-2:
- url : "blog/visit-to-nasa-hanger/motion-simulator_1.jpg"
image_path : "blog/visit-to-nasa-hanger/motion-simulator_1.jpg"
alt : "Close up on the 6 degree motion simulator for the early space
program"
title : "Close Up 6 Degree Motion Simulator"
- url : "blog/visit-to-nasa-hanger/motion-simulator_2.jpg"
image_path : "blog/visit-to-nasa-hanger/motion-simulator_2.jpg"
alt : "Close up on the linkages"
title : "Close up Motion Simulator Linkages"
---
{% include toc %}
Today I had an amazing opportunity to take a tour and take photographs inside of NASA
Langley Research Center's aircraft hanger. I thought I would share some of my
photos and share some information on each one because some of it was really
interesting.
What I'll do is go through the pictures I have and explain some things about them
right after each, so scroll down for some cool science equipment!
## Autonomous Aircraft
{% include gallery id="autonomous-aircraft" %}
This composite Cessna aircraft has been outfitted with what is currently the most
advanced aircraft guidance system in existence. Similar to what Google is trying to
accomplish with their self-driving cars, this aircraft can take in data from RADAR,
altimeters, airspeed sensors, and directional sensors and use the information to
guide the aircraft from point A to B with no pilot interaction required. The system
is currently being tested (with a test pilot onboard of course) with eventual goals
being the inclusion of these types of systems in manned commercial and private
flights.
Combined with the automation of air traffic control systems, this aircraft navigation
method can allow for aircraft to communicate with one another and automatically file
in line for landings, greatly increasing the number of airports that could be used
and reducing the needed capacity at the limited airports available for use today.
## Other Aircraft
{% include gallery id="other-aircraft" %}
The center has many other different aircraft in use for various different
applications. The aircraft above has an advanced version of the sensor used on all
aircraft to determine air velocity, altitude, and direction of the aircraft, and has
been used to learn more about the atmosphere in addition to being used to test
theories to help solve problems such as the [737 rudder issue](https://en.wikipedia.org/wiki/Boeing_737_rudder_issues).
The aircraft can all be outfitted and customized for any particular mission that is
needed by researchers, even including mounting large equipment to the insides and
bottoms of aircraft to take measurements. While I was in the hanger, there was a
team prepping a propeller driven aircraft to fly a CO2 and Ethene measurement
experiment that will span huge parts of the Eastern US in order to analyze our impact
on the environment more closely. This mission also includes placing the same
equipment inside of the AC130 for a higher altitude reading to go with the aircraft
they were working on today.
{% include gallery id="other-aircraft-2" %}
Because the aircraft typically come second hand from military operations, some of
the aircraft have unique properties, such as their jet engines which are ultra-low IR
engines. These aircraft were originally used by the Coast Guard and needed to have a
small heat signature, so the engines route the exhaust back into the engine. You
could place your hand directly behind the engine at full throttle and not burn
yourself.
## Flight Simulator
{% include gallery id="flight-simulator" %}
On the roof of the hanger is mounted a large crane apparatus that goes along with a
large picture of the earth hung over the NASA-side bay doors. This is actually a
flight simulator that was used in the early Mercury and Gemini Missions to help
NASA learn how to pilot their spacecraft, as it was something that had never been
done before. This simulator simulates all six degrees of motion and was the source
of all of the procedures for the first docking in space. All of the astronauts who
flew had practice time in either this simulator or the second one that was built
later at the gantry to accommodate the larger lunar lander. This piece of equipment
ran in real time on purely analog circuitry via the linkages that hang from the
ceiling that you can see in the picture above. This is a national historic landmark.
{% include gallery id="flight-simulator-2" %}
## Conclusion
This was an awesome, educational experience. I hope you enjoy the pictures, I
certainly enjoyed taking them and learning about what NASA is doing with their
equipment today.
© Mark Stenglein 2016 \| [Contact](mailto:{{ author.email }}) me for licensing. | 48.691781 | 119 | 0.725559 | eng_Latn | 0.998919 |
a68c1cb7e646dcbc68103b7b67043149d51fe7ad | 229 | md | Markdown | src/emdk-for-android/3-1/api/barcode/index.md | developer-zebra/developer-zebra-site | 8b3ecb3fafc41d1bc9aa0e52ad07d66ae359902d | [
"Fair",
"Unlicense"
] | 2 | 2016-03-21T11:00:57.000Z | 2016-04-27T06:46:52.000Z | src/emdk-for-android/3-1/api/barcode/index.md | developer-zebra/developer-zebra-site | 8b3ecb3fafc41d1bc9aa0e52ad07d66ae359902d | [
"Fair",
"Unlicense"
] | 7 | 2016-03-17T20:28:36.000Z | 2020-07-07T19:02:59.000Z | src/emdk-for-android/3-1/api/barcode/index.md | developer-zebra/developer-zebra-site | 8b3ecb3fafc41d1bc9aa0e52ad07d66ae359902d | [
"Fair",
"Unlicense"
] | 8 | 2016-10-25T10:29:49.000Z | 2021-06-04T03:34:40.000Z | ---
title: Barcode APIs
layout: list-content-api.html
product: EMDK For Android
productversion: '2.3'
---
>Supported Devices:
* MC18KK
* MC32N0JB
* MC40JB
* MC40KK
* MC67JB
* MC92KK
* TC55JB
* TC55KK
* TC70KK
* TC75KK
| 7.896552 | 29 | 0.676856 | kor_Hang | 0.419919 |
a68c46429ceb94c7840eb23ba07990409f7c3cac | 946 | md | Markdown | content/news/murder-suspect-arrested.md | hack3r3d/pico-composer | 743f690c0068f5a0af197eecd91aca9b4b3a2b95 | [
"MIT"
] | null | null | null | content/news/murder-suspect-arrested.md | hack3r3d/pico-composer | 743f690c0068f5a0af197eecd91aca9b4b3a2b95 | [
"MIT"
] | null | null | null | content/news/murder-suspect-arrested.md | hack3r3d/pico-composer | 743f690c0068f5a0af197eecd91aca9b4b3a2b95 | [
"MIT"
] | null | null | null | ---
Title: Murder Suspect Arrested
Description: A suspect in the shooting death of Darius Cooper on January 10, 2021, in Gaithersburg was taken into custody.
Author: Keith
Date: 2021-07-08 23:49:21
Img: https://gaithersburgnewsletter.org/assets/355-sign-dark.png
Template: post
---
[On June 15, Montgomery County Police Detectives arrested](https://www2.montgomerycountymd.gov/mcgportalapps/Press_Detail_Pol.aspx?Item_ID=36527 "Montgomery County Press Release About Arrest") a suspect in the murder of Darius Cooper on January 10, 2021.
Cooper, 25, of Germantown, and the suspect, Jakel Delante Stone, 31, of Glen Burnie, apparently got into a fight in a South Frederick Avenue apartment building parking lot in Gaithersburg, and Cooper was shot.
According to police, Stone’s DNA was found at the scene of the murder. Based on the evidence, police arrested Stone in Anne Arundel County. He has been transported to central processing and is being held without bail. | 72.769231 | 254 | 0.802326 | eng_Latn | 0.967401 |
a68c58e63589bc8267e46e9dbf82c0f07af7e79f | 395 | markdown | Markdown | _posts/2020-06-26-move-windows-window-back-onto-screen-after-deteching-monitor.markdown | 2341/2341.github.io | c82e7cf9b1b4d8ba67be662d127caebdc819f276 | [
"MIT"
] | null | null | null | _posts/2020-06-26-move-windows-window-back-onto-screen-after-deteching-monitor.markdown | 2341/2341.github.io | c82e7cf9b1b4d8ba67be662d127caebdc819f276 | [
"MIT"
] | null | null | null | _posts/2020-06-26-move-windows-window-back-onto-screen-after-deteching-monitor.markdown | 2341/2341.github.io | c82e7cf9b1b4d8ba67be662d127caebdc819f276 | [
"MIT"
] | null | null | null | ---
layout: post
title: "Move Windows window back onto screen after deteching monitor"
date: 2020-06-26 7:00:00 +0200
---
When you unplug a monitor on a Windows computer, it can happen that some windows don't get switched over to your other monitor. An easy way around that is to press the `Windows` and `Tab` keys, then `right click` the small preview version of the window and select `pin left`. | 65.833333 | 270 | 0.75443 | eng_Latn | 0.997886 |
a68d6be97a3e4d8550d5a0ea2794a318a42ee8c8 | 8,286 | md | Markdown | CONTRIBUTING.md | bstriner/sagemaker-training-toolkit | 81a4323761a5327baaf0d24157b9428919b5cc67 | [
"Apache-2.0"
] | 248 | 2020-04-21T09:25:03.000Z | 2022-03-24T22:24:26.000Z | CONTRIBUTING.md | bstriner/sagemaker-training-toolkit | 81a4323761a5327baaf0d24157b9428919b5cc67 | [
"Apache-2.0"
] | 68 | 2020-04-22T09:31:18.000Z | 2022-03-19T06:44:36.000Z | CONTRIBUTING.md | bstriner/sagemaker-training-toolkit | 81a4323761a5327baaf0d24157b9428919b5cc67 | [
"Apache-2.0"
] | 60 | 2020-06-02T20:52:24.000Z | 2022-03-16T18:20:41.000Z | # Contributing Guidelines
Thank you for your interest in contributing to our project. Whether it's a bug report, new feature, correction, or additional
documentation, we greatly value feedback and contributions from our community.
Please read through this document before submitting any issues or pull requests to ensure we have all the necessary
information to effectively respond to your bug report or contribution.
## Submitting bug reports and feature requests
We welcome you to use the GitHub issue tracker to report bugs or suggest features.
When filing an issue, please check [existing open](https://github.com/aws/sagemaker-training-toolkit/issues), or [recently closed](https://github.com/aws/sagemaker-training-toolkit/issues?utf8=%E2%9C%93&q=is%3Aissue%20is%3Aclosed%20), issues to make sure somebody else hasn't already
reported the issue. To create a new issue, select the template that most closely matches what you're writing about (ie. "Bug report", "Documentation request", or "Feature request"). Please fill out all information requested in the issue template.
## Contributing via pull requests
Contributions via pull requests are much appreciated. Before sending us a pull request, please ensure that:
- You are working against the latest source on the *master* branch.
- You check existing open, and recently merged, pull requests to make sure someone else hasn't addressed the problem already.
- You open an issue to discuss any significant work - we would hate for your time to be wasted.
To send us a pull request, please:
1. Fork the repository.
2. Modify the source; please focus on the specific change you are contributing. If you also reformat all the code, it will be hard for us to focus on your change.
3. Ensure local tests pass.
4. Commit to your fork using [clear commit messages](#committing-your-change).
5. Send us a pull request, answering any default questions in the pull request interface.
6. Pay attention to any automated CI failures reported in the pull request, and stay involved in the conversation.
The [sagemaker-bot](https://github.com/sagemaker-bot) will comment on the pull request with a link to the build logs.
GitHub provides additional document on [forking a repository](https://help.github.com/articles/fork-a-repo/) and
[creating a pull request](https://help.github.com/articles/creating-a-pull-request/).
### Running the unit tests
1. Install tox using `pip install tox`
1. Install coverage using `pip install .[test]`
1. cd into the sagemaker-training-toolkit folder: `cd sagemaker-training-toolkit`
1. Run the following tox command and verify that all code checks and unit tests pass: `tox test/unit`
You can also run a single test with the following command: `tox -e py36 -- -s -vv test/unit/test_entry_point.py::test_install_module`
* Note that the coverage test will fail if you only run a single test, so make sure to surround the command with `export IGNORE_COVERAGE=-` and `unset IGNORE_COVERAGE`
* Example: `export IGNORE_COVERAGE=- ; tox -e py36 -- -s -vv test/unit/test_entry_point.py::test_install_module ; unset IGNORE_COVERAGE`
### Running the integration tests
Our CI system runs integration tests (the ones in the `test/integration` directory), in parallel, for every pull request.
You should only worry about manually running any new integration tests that you write, or integration tests that test an area of code that you've modified.
1. Follow the instructions at [Set Up the AWS Command Line Interface (AWS CLI)](https://docs.aws.amazon.com/polly/latest/dg/setup-aws-cli.html).
1. To run a test, specify the test file and method you want to run per the following command: `tox -e py36 -- -s -vv test/integration/local/test_dummy.py::test_install_requirements`
* Note that the coverage test will fail if you only run a single test, so make sure to surround the command with `export IGNORE_COVERAGE=-` and `unset IGNORE_COVERAGE`
* Example: `export IGNORE_COVERAGE=- ; tox -e py36 -- -s -vv test/integration/local/test_dummy.py::test_install_requirements ; unset IGNORE_COVERAGE`
### Making and testing your change
1. Create a new git branch:
```shell
git checkout -b my-fix-branch master
```
1. Make your changes, **including unit tests** and, if appropriate, integration tests.
1. Include unit tests when you contribute new features or make bug fixes, as they help to:
1. Prove that your code works correctly.
1. Guard against future breaking changes to lower the maintenance cost.
1. Please focus on the specific change you are contributing. If you also reformat all the code, it will be hard for us to focus on your change.
1. Run all the unit tests as per [Running the unit tests](#running-the-unit-tests), and verify that all checks and tests pass.
1. Note that this also runs tools that may be necessary for the automated build to pass (ex: code reformatting by 'black').
### Committing your change
We use commit messages to update the project version number and generate changelog entries, so it's important for them to follow the right format. Valid commit messages include a prefix, separated from the rest of the message by a colon and a space. Here are a few examples:
```
feature: support VPC config for hyperparameter tuning
fix: fix flake8 errors
documentation: add MXNet documentation
```
Valid prefixes are listed in the table below.
| Prefix | Use for... |
|----------------:|:-----------------------------------------------------------------------------------------------|
| `breaking` | Incompatible API changes. |
| `deprecation` | Deprecating an existing API or feature, or removing something that was previously deprecated. |
| `feature` | Adding a new feature. |
| `fix` | Bug fixes. |
| `change` | Any other code change. |
| `documentation` | Documentation changes. |
Some of the prefixes allow abbreviation; `break`, `feat`, `depr`, and `doc` are all valid. If you omit a prefix, the commit will be treated as a `change`.
For the rest of the message, use imperative style and keep things concise but informative. See [How to Write a Git Commit Message](https://chris.beams.io/posts/git-commit/) for guidance.
### Sending a pull request
GitHub provides additional document on [creating a pull request](https://help.github.com/articles/creating-a-pull-request/).
Please remember to:
* Use commit messages (and PR titles) that follow the guidelines under [Committing your change](#committing-your-change).
* Send us a pull request, answering any default questions in the pull request interface.
* Pay attention to any automated CI failures reported in the pull request, and stay involved in the conversation.
## Finding contributions to work on
Looking at the [existing issues](https://github.com/aws/sagemaker-training-toolkit/issues) is a great place to start.
## Code of Conduct
This project has adopted the [Amazon Open Source Code of Conduct](https://aws.github.io/code-of-conduct).
For more information see the [Code of Conduct FAQ](https://aws.github.io/code-of-conduct-faq) or contact
[email protected] with any additional questions or comments.
## Security issue notifications
If you discover a potential security issue in this project we ask that you notify AWS/Amazon Security via our [vulnerability reporting page](http://aws.amazon.com/security/vulnerability-reporting/). Please do **not** create a public github issue.
## Licensing
See the [LICENSE](https://github.com/aws/sagemaker-training-toolkit/blob/master/LICENSE) file for our project's licensing. We will ask you to confirm the licensing of your contribution.
We may ask you to sign a [Contributor License Agreement (CLA)](http://en.wikipedia.org/wiki/Contributor_License_Agreement) for larger changes.
| 64.232558 | 283 | 0.713734 | eng_Latn | 0.992821 |
a68dc2568c79b74645ee50a0c617996e5045b589 | 3,567 | md | Markdown | Readme.md | ZachPlunkett/hammer | 2e83e118b788c6ce340000e13d674a768938e174 | [
"MIT"
] | 53 | 2020-09-29T18:48:42.000Z | 2021-08-22T06:49:41.000Z | Readme.md | ZachPlunkett/hammer | 2e83e118b788c6ce340000e13d674a768938e174 | [
"MIT"
] | 14 | 2020-08-25T10:34:00.000Z | 2020-10-26T06:53:07.000Z | Readme.md | ZachPlunkett/hammer | 2e83e118b788c6ce340000e13d674a768938e174 | [
"MIT"
] | 3 | 2020-10-03T05:13:10.000Z | 2020-10-20T13:30:43.000Z | # Hammer [](https://opensource.org/licenses/MIT) [](https://travis-ci.org/ShaileshSurya/hammer) [](https://coveralls.io/github/ShaileshSurya/hammer?branch=master)
Golang's Fluent HTTP Request Client

## Recipes
```go
client := hammer.New()
request, err := hammer.RequestBuilder().
    <HttpVerb>().
    WithURL("http://localhost:8081/employee").
    WithContext(context.Background()).
    WithHeaders("Accept", "application/json").
    WithHeaders("user-id", "10062").
    WithRequestParams("department", "HR").
    Build()
resp, err := client.Execute(request)
// or
responseMap := make(map[string]interface{})
err = client.ExecuteInto(request, &responseMap)
// or
responseModel := Employee{}
err = client.ExecuteInto(request, &responseModel)
```
### Supported HTTP Verbs
```go
Get()
Head()
Post()
PostForm()
Put()
Patch()
Delete()
Connect()
Options()
Trace()
```
### Hammer Client APIs
```go
// New initializes and returns a new Hammer client
New()
// WithHTTPClient returns a Hammer client that uses the given custom *http.Client
WithHTTPClient(*http.Client)
// Execute the Request
Execute(*Request)
// ExecuteInto executes the Request and unmarshals the response into the map or struct passed as unmarshalInto. See the recipes above.
ExecuteInto(*Request, unmarshalInto interface{})
```
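As a rough sketch of how `WithHTTPClient` can be wired up — this is illustrative only: the chained call style and the timeout value are assumptions, and the import path is taken from the report-card badge above.

```go
package main

import (
	"net/http"
	"time"

	"github.com/ShaileshSurya/hammer"
)

func main() {
	// Use a custom *http.Client, e.g. to control timeouts or transport settings.
	httpClient := &http.Client{Timeout: 10 * time.Second}

	// Assumption: WithHTTPClient returns the client, so it can be chained off New().
	client := hammer.New().WithHTTPClient(httpClient)

	// ...build a request with hammer.RequestBuilder() and call client.Execute(request)
	_ = client
}
```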
### RequestBuilder APIs
```go
// WithRequestBody sets the request body; a struct or map can be sent
WithRequestBody(body interface{})
// WithContext ...
WithContext(ctx context.Context)
// WithHeaders ...
WithHeaders(key string, value string)
// WithRequestParams ...
WithRequestParams(key string, value string)
// WithRequestBodyParams ...
WithRequestBodyParams(key string, value interface{})
// WithFormValues ...
WithFormValues(Key string, value interface{})
// WithURL ...
WithURL(value string)
// WithBasicAuth ...
WithBasicAuth(username, password string)
// WithTemplate creates a request from an already built request. See the example below.
WithTemplate(tempRequest *Request)
```
```go
client := hammer.New()
request, err := hammer.RequestBuilder().
    Get().
    WithURL("http://localhost:8081/employee").
    WithHeaders("Accept", "application/json").
    WithHeaders("user-id", "10062").
    WithRequestParams("department", "HR").
    Build()
employeeList := []Employee{}
err = client.ExecuteInto(request, &employeeList)
```
```go
client := hammer.New()
reqTemp, err := hammer.RequestBuilder().
    Get().
    WithURL("http://localhost:8081/employee").
    WithHeaders("Accept", "application/json").
    WithHeaders("user-id", "10062").
    WithRequestParams("department", "HR").
    Build()
request, err := hammer.RequestBuilder().
    WithTemplate(reqTemp).
    WithRequestParams("post", "manager").
    WithRequestParams("centre", "pune").
    Build()
employeeList := []Employee{}
err = client.ExecuteInto(request, &employeeList)
```
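The examples above only use `Get()`. A hedged sketch of a POST request with a body — the `Employee` field names here are made up for illustration, and how `WithRequestBody` serializes the value is assumed from its signature:

```go
client := hammer.New()

// Hypothetical payload; the real Employee fields are not documented here.
newEmployee := Employee{Name: "Jane Doe", Department: "HR"}

request, err := hammer.RequestBuilder().
    Post().
    WithURL("http://localhost:8081/employee").
    WithHeaders("Content-Type", "application/json").
    WithRequestBody(newEmployee).
    Build()
if err != nil {
    // handle the builder error
}

resp, err := client.Execute(request)
if err != nil {
    // handle the request error
}
_ = resp
```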
## Contributing
1. Fork the repo and create your branch from master.
2. If you've added code that should be tested, add tests.
3. If you've changed APIs, update the documentation.
4. Ensure the test suite passes.
5. Make sure your code lints.
6. Issue that pull request!
View [CONTRIBUTING.md](CONTRIBUTING.md) to learn more about how to contribute.
## License
This project is open source and available under the [MIT License](LICENSE).
| 23.78 | 405 | 0.71741 | eng_Latn | 0.317885 |
a68dc5196868279448e444f8104f2ba01ea6f68c | 53 | md | Markdown | README.md | cahergil/MusicApp | a3c48b5e127df3aa8de76c3434bda3edf2e63c62 | [
"Apache-2.0"
] | null | null | null | README.md | cahergil/MusicApp | a3c48b5e127df3aa8de76c3434bda3edf2e63c62 | [
"Apache-2.0"
] | null | null | null | README.md | cahergil/MusicApp | a3c48b5e127df3aa8de76c3434bda3edf2e63c62 | [
"Apache-2.0"
] | null | null | null | # MusicApp
App to learn Kotlin programming language
| 13.25 | 40 | 0.811321 | eng_Latn | 0.838319 |
a68ef7d6bb80f37d462c998be2abcf08602353a9 | 922 | md | Markdown | README.md | dalto2wk/THECAPSTONE | e10bf0f67e2e75c86837e25d51a4040ec071824b | [
"MIT"
] | null | null | null | README.md | dalto2wk/THECAPSTONE | e10bf0f67e2e75c86837e25d51a4040ec071824b | [
"MIT"
] | 1 | 2019-03-17T19:55:59.000Z | 2019-03-17T19:55:59.000Z | README.md | dalto2wk/THECAPSTONE | e10bf0f67e2e75c86837e25d51a4040ec071824b | [
"MIT"
] | null | null | null | # THECAPSTONE
This is our repository for JMU's CIS capstone project. We had 6 weeks to develop a working system for a client.
Our team developed a system that earned us top 5 in the system competition out of 15 teams. The technology stack that we used included the following technologies:
HTML5, CSS4, JavaScript, C#, ASP.Net, SQL SERVER, Bootstrap, Amazon Web services RDS, Amazon Web services Elastic Beanstalk, Amazon Web Services EC2, Git and Github.
Our team had to create a system that would help bridge the gap between High school students and local employers. This would help High school students get connected with local employers and provide employers opportunities to work with or hire these students. Our system had the business employer user focus. We created a system where a busniess employer could login and see data and forms related to connecting with local schools, students, and other businesses.
| 115.25 | 462 | 0.804772 | eng_Latn | 0.999778 |
a68f467eab5ff8efc2d7c8514411f154a1fabe97 | 9,741 | md | Markdown | docs/ready/landing-zone/migrate-landing-zone.md | milanhybner/cloud-adoption-framework.cs-cz | 6f1b4a99b5ce58ac39facab09293e300d022182e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/ready/landing-zone/migrate-landing-zone.md | milanhybner/cloud-adoption-framework.cs-cz | 6f1b4a99b5ce58ac39facab09293e300d022182e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/ready/landing-zone/migrate-landing-zone.md | milanhybner/cloud-adoption-framework.cs-cz | 6f1b4a99b5ce58ac39facab09293e300d022182e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Nasazení cílové zóny migrace v Azure
description: Přečtěte si, jak cílovou zónu migrace v Azure.
author: BrianBlanchard
ms.author: brblanch
ms.date: 02/25/2020
ms.topic: conceptual
ms.service: cloud-adoption-framework
ms.subservice: ready
ms.openlocfilehash: 924be38b70013a570f83c5fc156c28c051ec468d
ms.sourcegitcommit: afe10f97fc0e0402a881fdfa55dadebd3aca75ab
ms.translationtype: MT
ms.contentlocale: cs-CZ
ms.lasthandoff: 03/31/2020
ms.locfileid: "80431934"
---
<!-- cSpell:ignore vCPUs jumpbox -->
# <a name="deploy-a-migration-landing-zone"></a>Nasazení cílové zóny migrace
Termín *cílová zóna migrace* se používá k popisu prostředí, které je zřízené a připravené k hostování úloh migrovaných z místního prostředí do Azure.
## <a name="deploy-the-first-landing-zone"></a>Nasazení první cílové zóny
Předtím, než použijete cílovou zónu migrace v rámci architektury pro přijetí do cloudu, prostudujte si následující předpoklady, rozhodnutí a pokyny k implementaci. Pokud se tyto pokyny zarovnají s požadovaným plánem přijetí do cloudu, je možné pomocí [kroků nasazení][deploy-sample]nasadit [cílovou zónu migrace](https://docs.microsoft.com/azure/governance/blueprints/samples/caf-migrate-landing-zone/index) .
> [!div class="nextstepaction"]
> [Nasazení ukázky podrobného plánu][deploy-sample]
## <a name="assumptions"></a>Předpoklady
Tato počáteční cílová zóna zahrnuje následující předpoklady nebo omezení. Pokud tyto předpoklady odpovídají vašim omezením, můžete podrobný plán použít k vytvoření první cílové zóny. Plán lze také rozšířit a vytvořit tak podrobný plán cílové zóny, který vyhovuje vašim jedinečným omezením.
- **Omezení předplatného:** Toto úsilí o přijetí neočekává překročení [limitů předplatného](https://docs.microsoft.com/azure/azure-subscription-service-limits).
- **Dodržování předpisů:** V této cílové zóně nejsou potřeba žádné požadavky na dodržování předpisů od jiných výrobců.
- **Složitost architektury:** Složitost architektury nevyžaduje další produkční odběry.
- **Sdílené služby:** V Azure nejsou žádné sdílené služby, které vyžadují, aby se toto předplatné zpracovalo jako paprskový uzel v architektuře hvězdicové architektury.
- **Omezený rozsah produkčního prostředí:** Tato cílová zóna by mohla být hostitelem produkčních úloh. Nejedná se však o vhodné prostředí pro citlivá data ani pro klíčové úlohy.
Pokud tyto předpoklady odpovídají vašim požadavkům na přijetí, může být tento plán výchozím bodem pro vytváření cílové zóny.
## <a name="decisions"></a>Rozhodnutí
V podrobném plánu cílové zóny jsou zastoupena následující rozhodnutí.
| Komponenta | Rozhodnutí | Alternativní přístupy |
|------------------------------|---------------------------------------------------------------------------------------------------|-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| Nástroje pro migraci | Nasadí se Azure Site Recovery a vytvoří se projekt Azure Migrate. | [Průvodce rozhodováním ohledně nástrojů pro migraci](../../decision-guides/migrate-decision-guide/index.md) |
| Protokolování a monitorování | Bude zřízený pracovní prostor Operational Insights a účet úložiště pro diagnostiku. | |
| Síť | Vytvoří se virtuální síť s podsítěmi pro bránu, firewall, jumpbox a cílovou zónu. | [Rozhodnutí o síti](../considerations/networking-options.md) |
| Identita | Předpokládá se, že předplatné je už přidružené k instanci Azure Active Directory. | [Osvědčené postupy správy identit](https://docs.microsoft.com/azure/security/azure-security-identity-management-best-practices?toc=https://docs.microsoft.com/azure/cloud-adoption-framework/toc.json&bc=https://docs.microsoft.com/azure/cloud-adoption-framework/_bread/toc.json) |
| Zásada | Tento podrobný plán v současné době předpokládá, že se nemají použít žádné zásady Azure. | |
| Návrh předplatného | Neuvedeno – Navrženo pro jedno produkční předplatné. | [Vytvoření počátečních předplatných](../azure-best-practices/initial-subscriptions.md) |
| Skupiny prostředků | Neuvedeno – Navrženo pro jedno produkční předplatné. | [Škálování předplatných](../azure-best-practices/scale-subscriptions.md) |
| Skupiny pro správu | Neuvedeno – Navrženo pro jedno produkční předplatné. | [Uspořádání a Správa předplatných](../azure-best-practices/organize-subscriptions.md) |
| Data | neuvedeno | [Výběr správné možnosti SQL Server v dokumentaci k Azure](https://docs.microsoft.com/azure/sql-database/sql-database-paas-vs-sql-server-iaas) a [Azure Data Store](https://docs.microsoft.com/azure/architecture/guide/technology-choices/data-store-overview) |
| Úložiště | neuvedeno | [Pokyny k Azure Storage](../considerations/storage-options.md) |
| Standardy pojmenování a označování | neuvedeno | [Osvědčené postupy pojmenování a označování](../azure-best-practices/naming-and-tagging.md) |
| Správa nákladů | neuvedeno | [Sledování nákladů](../azure-best-practices/track-costs.md) |
| Compute | neuvedeno | [Možnosti služby Compute](../considerations/compute-options.md) |
## <a name="customize-or-deploy-a-landing-zone"></a>Přizpůsobení nebo nasazení cílové zóny
Přečtěte si další informace a Stáhněte si referenční ukázku v tématu Postup migrace cílové zóny pro nasazení nebo přizpůsobení z [Azure Blueprint Samples][deploy-sample].
> [!div class="nextstepaction"]
> [Nasazení ukázky podrobného plánu][deploy-sample]
Pokyny k přizpůsobení, která by se měla provádět v tomto podrobném plánu nebo v výsledné cílové zóně, najdete v tématu věnovaném [hlediskům cílové zóny](../considerations/index.md).
## <a name="next-steps"></a>Další kroky
Po nasazení první cílové zóny budete připraveni [rozšířit cílovou zónu](../considerations/index.md) .
> [!div class="nextstepaction"]
> [Rozšířit cílovou zónu](../considerations/index.md)
<!-- links -->
[deploy-sample]: https://docs.microsoft.com/azure/governance/blueprints/samples/caf-migrate-landing-zone/deploy
| 120.259259 | 433 | 0.460836 | ces_Latn | 0.999352 |
a68fa1e2b11663e77e50c553181bd198f4b30eb5 | 858 | md | Markdown | Readme.md | aki237/dibba | ee4c513ba0003393a0ddca3d2621762ed09cda13 | [
"BSD-3-Clause"
] | null | null | null | Readme.md | aki237/dibba | ee4c513ba0003393a0ddca3d2621762ed09cda13 | [
"BSD-3-Clause"
] | null | null | null | Readme.md | aki237/dibba | ee4c513ba0003393a0ddca3d2621762ed09cda13 | [
"BSD-3-Clause"
] | null | null | null | # dibba - package format reader
[](https://goreportcard.com/report/github.com/aki237/dibba)
Dibba is a small Go library for a new TLV-based file format for storing just files.
Refer [godoc](https://godoc.org/github.com/aki237/dibba) for more.
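For readers unfamiliar with the term: TLV ("tag-length-value") simply means every entry is written as a tag, a length, and then that many bytes of value. The snippet below is only a generic illustration of that idea, not dibba's actual on-disk layout — see the godoc above for the real format.

```go
package main

import (
	"bytes"
	"encoding/binary"
	"fmt"
	"io"
)

// writeTLV writes one generic TLV record: 1-byte tag, 4-byte big-endian length, then the value bytes.
func writeTLV(w io.Writer, tag byte, value []byte) error {
	if _, err := w.Write([]byte{tag}); err != nil {
		return err
	}
	var lenBuf [4]byte
	binary.BigEndian.PutUint32(lenBuf[:], uint32(len(value)))
	if _, err := w.Write(lenBuf[:]); err != nil {
		return err
	}
	_, err := w.Write(value)
	return err
}

func main() {
	var buf bytes.Buffer
	_ = writeTLV(&buf, 0x01, []byte("Readme.md"))     // e.g. a name entry
	_ = writeTLV(&buf, 0x02, []byte("file contents")) // e.g. a data entry
	fmt.Println(buf.Len(), "bytes written")
}
```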
See Examples for sample usage.
+ Packager : Packages files into a single dibba file.
```
$ cd $GOPATH/github.com/aki237/dibba/examples/packager/
$ go build
$ ./packager outFile.dib example/*
35
4259516
$ ls
. .. example outFile.dib package.go packager
```
+ Parser : Parse the dibba file and print out the contents of a given file from the package
```
$ cd $GOPATH/github.com/aki237/dibba/examples/parser/
$ go build
$ ./parser ../packager/outFile.dib Readme.md
# Readme
This is a sample readme.
```
| 29.586207 | 132 | 0.707459 | eng_Latn | 0.692993 |
a68fb95dad46f31db976ad11b07dd7ec147a9b64 | 1,515 | md | Markdown | desktop-src/Tapi/linetermdev--constants.md | citelao/win32 | bf61803ccb0071d99eee158c7416b9270a83b3e4 | [
"CC-BY-4.0",
"MIT"
] | 4 | 2021-07-26T16:18:49.000Z | 2022-02-19T02:00:21.000Z | desktop-src/Tapi/linetermdev--constants.md | citelao/win32 | bf61803ccb0071d99eee158c7416b9270a83b3e4 | [
"CC-BY-4.0",
"MIT"
] | 2 | 2020-04-09T17:00:51.000Z | 2020-04-09T18:30:01.000Z | desktop-src/Tapi/linetermdev--constants.md | citelao/win32 | bf61803ccb0071d99eee158c7416b9270a83b3e4 | [
"CC-BY-4.0",
"MIT"
] | 2 | 2020-07-19T02:58:48.000Z | 2021-03-06T21:09:47.000Z | ---
Description: The LINETERMDEV\_ bit-flag constants describe different types of terminal devices.
ms.assetid: 3444d022-8225-4956-89a1-721b4662d557
title: LINETERMDEV_ Constants (Tapi.h)
ms.topic: reference
ms.date: 05/31/2018
---
# LINETERMDEV\_ Constants
The **LINETERMDEV\_** bit-flag constants describe different types of terminal devices.
<dl> <dt>
<span id="LINETERMDEV_HEADSET"></span><span id="linetermdev_headset"></span>**LINETERMDEV\_HEADSET**
</dt> <dd> <dl> <dt>
The terminal is a headset.
</dt> </dl> </dd> <dt>
<span id="LINETERMDEV_PHONE"></span><span id="linetermdev_phone"></span>**LINETERMDEV\_PHONE**
</dt> <dd> <dl> <dt>
The terminal is a phone set.
</dt> </dl> </dd> <dt>
<span id="LINETERMDEV_SPEAKER"></span><span id="linetermdev_speaker"></span>**LINETERMDEV\_SPEAKER**
</dt> <dd> <dl> <dt>
The terminal is an external speaker and microphone.
</dt> </dl> </dd> </dl>
## Remarks
No extensibility. All 32 bits are reserved.
These constants are used to characterize a line's terminal device and help an application to determine the nature of a terminal device.
## Requirements
| Requirement | Value |
|-------------------------|-----------------------------------------------------------------------------------|
| TAPI version<br/> | Requires TAPI 2.0 or later<br/> |
| Header<br/> | <dl> <dt>Tapi.h</dt> </dl> |
| 21.956522 | 135 | 0.567657 | eng_Latn | 0.380044 |
a690424897aa2d1cab5121d4183e8be5f559acc8 | 539 | md | Markdown | README.md | Sinba7/Movie-Recommender | 802c9c66739ef1c49287038aca06cb6943a79bd8 | [
"MIT"
] | 1 | 2021-01-28T06:29:31.000Z | 2021-01-28T06:29:31.000Z | README.md | Sinba7/Movie-Recommender | 802c9c66739ef1c49287038aca06cb6943a79bd8 | [
"MIT"
] | 1 | 2020-12-18T03:58:04.000Z | 2020-12-18T03:58:04.000Z | README.md | Sinba7/Movie-Recommender | 802c9c66739ef1c49287038aca06cb6943a79bd8 | [
"MIT"
] | null | null | null | # Movie Recommendation System
This project is an app serves for movie recommendation. I deployed this app on heroku platform: [Nostalgia Movie Recommender](https://ny-movie-recommender.herokuapp.com/). To run the app on your local machine, open terminal and go to under Movie-Recommender folder and run ./startup in your terminal. If you have permission deny issue, please run the pip command in startup.sh file to install required packages and run gunicorn command to start the app. The url of the app should be provide in your terminal.
| 179.666667 | 508 | 0.801484 | eng_Latn | 0.993218 |
a6909085ec9d146a84c6a1d80ab6a32b215ea142 | 92 | md | Markdown | README.md | XigenIO/Docker-CentosMirror | b1eeacfeb9687b9198136f76c155a75e895be17a | [
"MIT"
] | null | null | null | README.md | XigenIO/Docker-CentosMirror | b1eeacfeb9687b9198136f76c155a75e895be17a | [
"MIT"
] | null | null | null | README.md | XigenIO/Docker-CentosMirror | b1eeacfeb9687b9198136f76c155a75e895be17a | [
"MIT"
] | null | null | null | # Docker-ArchMirror
A project to run a centos linux mirror using a single alpine container🐋
| 30.666667 | 71 | 0.804348 | eng_Latn | 0.934348 |
a690a2552f2233c382697a395fa2f2756740494c | 1,399 | md | Markdown | docs/ado/reference/ado-md-api/membertypeenum.md | ZubriQ/sql-docs.ru-ru | 50559946dabe5fce9eef251a637dc2e3fd305908 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/ado/reference/ado-md-api/membertypeenum.md | ZubriQ/sql-docs.ru-ru | 50559946dabe5fce9eef251a637dc2e3fd305908 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/ado/reference/ado-md-api/membertypeenum.md | ZubriQ/sql-docs.ru-ru | 50559946dabe5fce9eef251a637dc2e3fd305908 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Мембертипинум | Документация Майкрософт
ms.prod: sql
ms.prod_service: connectivity
ms.technology: connectivity
ms.custom: ''
ms.date: 01/19/2017
ms.reviewer: ''
ms.topic: conceptual
apitype: COM
f1_keywords:
- MemberTypeEnum
helpviewer_keywords:
- MemberTypeEnum enumeration [ADO MD]
ms.assetid: 5d8132c0-7ca2-4f86-8336-1b34213869ad
author: MightyPen
ms.author: genemi
ms.openlocfilehash: da396bd71e64925bcd8fb74f71f8e334bf7f1d7e
ms.sourcegitcommit: e042272a38fb646df05152c676e5cbeae3f9cd13
ms.translationtype: MT
ms.contentlocale: ru-RU
ms.lasthandoff: 04/27/2020
ms.locfileid: "67949403"
---
# <a name="membertypeenum"></a>MemberTypeEnum
Задает параметр для свойства [Type](../../../ado/reference/ado-md-api/type-property-ado-md.md) объекта- [члена](../../../ado/reference/ado-md-api/member-object-ado-md.md) .
|Константа|Значение|Описание|
|--------------|-----------|-----------------|
|**adMemberAll**|4|Указывает, что объект **member** представляет все элементы уровня.|
|**adMemberFormula**|3|Указывает, что объект- **член** вычисляется с помощью выражения формулы.|
|**adMemberMeasure**|2|Указывает, что объект- **член** принадлежит измерению Measures и представляет количественный атрибут.|
|**adMemberRegular**|1|По умолчанию. Указывает, что объект **member** представляет экземпляр бизнес-сущности.|
|**adMemberUnknown**|0|Не удается определить тип элемента.|
| 39.971429 | 174 | 0.744818 | rus_Cyrl | 0.281956 |
a690a65ec20ba29e06f36e6d209bba6123907f43 | 1,444 | md | Markdown | content/post/flowingdata-com-2021-06-10-all-the-passes-in-soccer-visualized-at-once.md | chuxinyuan/daily | dc201b9ddb1e4e8a5ec18cc9f9b618df889b504c | [
"MIT"
] | 8 | 2018-03-27T05:17:56.000Z | 2021-09-11T19:18:07.000Z | content/post/flowingdata-com-2021-06-10-all-the-passes-in-soccer-visualized-at-once.md | chuxinyuan/daily | dc201b9ddb1e4e8a5ec18cc9f9b618df889b504c | [
"MIT"
] | 16 | 2018-01-31T04:27:06.000Z | 2021-10-03T19:54:50.000Z | content/post/flowingdata-com-2021-06-10-all-the-passes-in-soccer-visualized-at-once.md | chuxinyuan/daily | dc201b9ddb1e4e8a5ec18cc9f9b618df889b504c | [
"MIT"
] | 12 | 2018-01-27T15:17:26.000Z | 2021-09-07T04:43:12.000Z | ---
title: All the passes in soccer visualized at once
date: '2021-06-10'
linkTitle: https://flowingdata.com/2021/06/10/all-the-passes-in-soccer-visualized-at-once/
source: FlowingData
description: <p><a href="https://flowingdata.com/2021/06/10/all-the-passes-in-soccer-visualized-at-once/"><img
width="750" height="487" src="https://flowingdata.com/wp-content/uploads/2021/06/all-the-passes-750x487.jpeg"
class="attachment-medium size-medium wp-post-image" alt="" loading="lazy" srcset="https://flowingdata.com/wp-content/uploads/2021/06/all-the-passes-750x487.jpeg
750w, https://flowingdata.com/wp-content/uploads/2021/06/all-the-passes-1090x708.jpeg
1090w, https://flowingdata.com/wp-content/uploads/2021/06/all-the-passes-210x136.jpeg
210w, https://flowingdata.com/wp-content/uploads/2021/06/al ...
disable_comments: true
---
<p><a href="https://flowingdata.com/2021/06/10/all-the-passes-in-soccer-visualized-at-once/"><img width="750" height="487" src="https://flowingdata.com/wp-content/uploads/2021/06/all-the-passes-750x487.jpeg" class="attachment-medium size-medium wp-post-image" alt="" loading="lazy" srcset="https://flowingdata.com/wp-content/uploads/2021/06/all-the-passes-750x487.jpeg 750w, https://flowingdata.com/wp-content/uploads/2021/06/all-the-passes-1090x708.jpeg 1090w, https://flowingdata.com/wp-content/uploads/2021/06/all-the-passes-210x136.jpeg 210w, https://flowingdata.com/wp-content/uploads/2021/06/al ... | 103.142857 | 604 | 0.76662 | yue_Hant | 0.454744 |
a690c85ac7f1e125c5810370cc7b3e3db9c7b67a | 2,767 | md | Markdown | static/site/content/help/domain-names/contact-details/index.md | OxfordInfoLabs/kinicart-example | 53702a2334c7e44a2878cf8328c26e26a02c47f4 | [
"MIT"
] | null | null | null | static/site/content/help/domain-names/contact-details/index.md | OxfordInfoLabs/kinicart-example | 53702a2334c7e44a2878cf8328c26e26a02c47f4 | [
"MIT"
] | 1 | 2022-03-02T03:44:39.000Z | 2022-03-02T03:44:39.000Z | static/site/content/help/domain-names/contact-details/index.md | OxfordInfoLabs/kinicart-example | 53702a2334c7e44a2878cf8328c26e26a02c47f4 | [
"MIT"
] | null | null | null | {
"title": "Domain Contact Details",
"date": "2019-01-28T15:39:12Z",
"description": "Netistrar's Help Documentation about Domain Name Contact Details, contact types, bulk tools, and more ",
"categories": [],
"weight": 12
}
From Dashboard > Domains > YOURDOMAIN.COM > Contacts
Here you can edit and review contact information for a domain name. The information is separated into four sections: Registrant contact information (the owner of a domain names), and technical, administation and billing contact information.
Note, contact information is strictly recorded on a domain by domain basis. There is no sharing of contact information between domain names. However, we do have tools to assist with managing contact information across multiple domains: bulk tools, and contact templates.
## Contact types
The visibility of these records on WHOIS servers is affected by your [privacy settings](/help/domain-names/privacy-settings/).
#### Owner/Registrant
The Registrant of Record is the official Registrant of the domain name. Note, contact information for this record cannot be changed without due process. Following ICANN rules, before affecting any changes to this record we must notify and seek the consent of the current Registrant using the contact details we have on file (ie. the old contact information).
We must receive confirmation allowing the change within 14 days of any change request, and following that we must locked the domain name, to prevent it from being transfered out of the system for 60 days following any change applied.
When you attempt to make a change to this record the ability to make further changes to the Registrant will be locked and you will be taken through the steps of the confirmation process.
#### Administration, Technical & Billing Contacts
These contacts are optional, for your own use, and will be published on WHOIS servers as specified in your privacy settings.
Note, top-level Domain Name providers have their own implementation for publishing WHOIS information, and may not not support publishing of all of the contact information you supply.
## Bulk tools
Bulk tools enable you to apply contact information edits to a subset of one or more domains under your management. [Review our bulk tools help section](/help/hosting-providers-and-trade-customers/bulk-operations/).
#### Contact templates
You can think of contact templates as blue prints of contact information that can be applied to domain name records, new domain name purchases, and bulk operations. Contact templates enable you to maintain consistent records across a number of assets in your account, and speed the process of data entry. [Review our contact templates help section](/help/domain-names/contact-templates/).
| 69.175 | 390 | 0.792555 | eng_Latn | 0.998473 |
a6918ef13a7d3fd4441837d89c590f3a0c75c77f | 5,992 | md | Markdown | v5.0/user-manual/app-creation/language-support/ruby/lang-ruby-rails-puma.md | qdsang/rainbond-docs | 1e3d8bf81b98b1f2cb668427b3f78f802f94dada | [
"CC-BY-4.0"
] | null | null | null | v5.0/user-manual/app-creation/language-support/ruby/lang-ruby-rails-puma.md | qdsang/rainbond-docs | 1e3d8bf81b98b1f2cb668427b3f78f802f94dada | [
"CC-BY-4.0"
] | null | null | null | v5.0/user-manual/app-creation/language-support/ruby/lang-ruby-rails-puma.md | qdsang/rainbond-docs | 1e3d8bf81b98b1f2cb668427b3f78f802f94dada | [
"CC-BY-4.0"
] | null | null | null | ---
title: 使用Puma部署Rails应用
summary: 使用Puma部署Rails应用
toc: true
---
## 前言
###WEBrick
Ruby 标准库有一个名为 [WEBrick](http://ruby-doc.org/stdlib-2.1.1/libdoc/webrick/rdoc/WEBrick.html) 的默认 web server。它会随着Ruby的安装自动安装到系统中。大多数开发框架,如Rack和Rails默认使用 WEBrick 作为开发环境的web server。
虽然说 WEBrick 在开发过程极大的方便了开发者调试程序,但它并不能处理大规模的并发任务,因此在生产环境中需要用其它的web server来代替WEBrick。
### 生产环境为什么不能用 WEBrick
默认情况下 WEBrick 是单任务,单线程的。这意味着,如果在同一时刻来了2个请求,第二个请求必须等待第一个请求处理完成后才能被执行。
如果用户没有在 Procfile 文件中指定web server,那么平台会在生产环境中默认运行 WEBrick ,这样一来,如果您的服务有大批量用户访问的话,用户会感觉速度很慢,或者会超时,无法打开,即便是将应用水平扩容到10个节点也无法解决这种问题。
因此,如果您在云帮运行生产环境的应用,一定要更换默认的web server。
### 生产环境 web server
生产环境的 web server 应该具备处理大规模并发任务的能力。我们强烈建议使用 Puma web server。
用户需要将 puma 添加到 Gemfile 文件中,然后在 Procfile 文件中指定用puma来运行Ruby 应用,最后提交代码并在平台上部署,接下来就给大家介绍puma部署Rails应用。
## Puma部署Rails应用
Puma 是 Unicorn 的有力竞争者,他们都可以处理大规模的并发请求。
除了worker进程外,Puma 默认使用线程来处理请求,这样可以更加充分的利用CPU资源。但这要求用户的所有代码都必须是线程安全的,即便不是也没关系,用户仍然可以通过横向(水平)扩展worker进程的方式来达到处理高并发请求的目的。
本篇文档将一步步指导您利用 puma web server 将 Rails 应用部署到云帮。
{{site.data.alerts.callout_danger}}
将程序部署到生产环境之前,一定要确保再测试环境中可以正常运行。
{{site.data.alerts.end}}
### 添加 Puma 到程序中
#### Gemfile 文件
首先,添加 puma 到应用的`Gemfile`文件中:
{% include copy-clipboard.html %}
```bash
echo "gem 'puma'" >> Gemfile
```
在本地安装puma
{% include copy-clipboard.html %}
```bash
bundler install
```
#### Procfile 文件
将 Puma 作为应用的 web 处理程序,用户可以在一行中设置多个参数:
{% include copy-clipboard.html %}
```bash
web: bundle exec puma -t 5:5 -p ${PORT:-3000} -e ${RACK_ENV:-development}
```
通常情况下我们建议将puma的配置存成配置文件:
{% include copy-clipboard.html %}
```bash
web: bundle exec puma -C config/puma.rb
```
请确保Procfile的内容正确,并添加到git代码仓库中。
#### 配置文件
可以将puma的配置文件存在`config/puma.rb`或者你喜欢的位置。下面是一个简单的Rails应用配置文件:
{% include copy-clipboard.html %}
```bash
workers Integer(ENV['WEB_CONCURRENCY'] || 2)
threads_count = Integer(ENV['MAX_THREADS'] || 5)
# 若程序不是线程安全的需要将threads_count设置为1
# threads_count = 1
threads threads_count, threads_count
preload_app!
rackup DefaultRackup
port ENV['PORT'] || 3000
environment ENV['RACK_ENV'] || 'development'
on_worker_boot do
# Worker specific setup for Rails 4.1+
# See: http://docs.goodrain.com/ruby/rails-puma.html#On_worker_boot
ActiveRecord::Base.establish_connection
end
```
用户需要确认配置了正确的数据库连接并可以正常的连接到数据库。下文会介绍如何设置数据库连接。
#### Workers
{% include copy-clipboard.html %}
```bash
workers Integer(ENV['WEB_CONCURRENCY'] || 2)
```
{{site.data.alerts.callout_danger}}
WEB_CONCURRENCY
环境变量根据应用的单个实例的内存大小而设置,默认情
况下一个128内存的应用实例默认WEB_CONCURRENCY=2
目前云帮的一个实例的内存固定为128M,后续会支>持自定义调整。同时WEB_CONCURRENCY 变量会随着内存的调整而调整。
{{site.data.alerts.end}}
#### Threads
{% include copy-clipboard.html %}
```bash
threads_count = Integer(ENV['MAX_THREADS'] || 5)
threads threads_count, threads_count
```
Puma 可以将请求交给一个内部的线程池去处理,这样可以处理更多的并发请求,减少响应时间,同时占用内存更小也可以最大化的使用CPU资源。
Puma 允许用户对线程数做最大和最小的限制,动态的调整操作由Puma主实例来完成。最小的线程数的意思就是当没有请求时线程数最小数目,最大就是说请求量上来后可以开线程的最大数。默认是 0:16 也就是 最小0,最大16,用户可以自由设置。我们建议将最大值和最小值设置为一样的,这样可以有效的处理请求,且不必因为动态调整线程数而浪费CPU资源。
#### Preload app
```bash
preload_app!
```
预加载应用减少了单个Puma worker处理进程的启动时间,可以利用独立的worker使用 on_worker_boot 方法来管理外部的连接。这样的配置可以保证每个worker进程可以使用优雅的方式进行数据库连接。
#### Rackup
```bash
rackup DefaultRackup
```
使用rackup命令来告诉 Puma如何启动你的rack应用。这个配置默认情况下Rails会自动写入到config.ru文件中。因此后续的Puma可能不需要设置了
#### Port
```bash
port ENV['PORT'] || 3000
```
云帮应用启动后回自动设置PORT变量,以便可以将应用添加到负载均衡中。本地默认设置为3000 这也是Rails默认值。
#### Environment
```bash
environment ENV['RACK_ENV'] || 'development'
```
设置Puma的环境变量,在上运行的Rails应用默认将ENV['RACK_ENV']设置为'production'
#### On worker boot
`on_worker_boot` 定义部分 是worker启动后未接受请求前执行的部分。主要是解决数据库连接并发问题,如果用户使用的是Rails 4.1+ 则可以直接在 database.yml 设置连接池大小来解决。
```bash
on_worker_boot do
# Valid on Rails 4.1+ using the `config/database.yml` method of setting `pool` size
ActiveRecord::Base.establish_connection
end
```
否则必须对这部分进行特定的设置:
```bash
on_worker_boot do
# Valid on Rails up to 4.1 the initializer method of setting `pool` size
ActiveSupport.on_load(:active_record) do
config = ActiveRecord::Base.configurations[Rails.env] ||
Rails.application.config.database_configuration[Rails.env]
config['pool'] = ENV['MAX_THREADS'] || 5
ActiveRecord::Base.establish_connection(config)
end
end
```
默认情况下我们都需要设置数据库连接池大小
如果程序中使用了其他的数据存储,如redis,memcached,postgres等,也需要在pre-load区域设置活动记录的重连机制。如果使用Resque连接到Redis需要使用重连机制:
```bash
on_worker_boot do
# ...
if defined?(Resque)
Resque.redis = ENV["<redis-uri>"] || "redis://127.0.0.1:6379"
end
end
```
## 线程安全
线程安全的代码可以在多个线程无差错地运行。并非所有的Ruby代码都是线程安全的。如果你的代码和库正在多个线程中运行,很难检测到是否是线程安全的。
直到 Rails 4,出现了一种可以自动切换的线程安全兼容模式。虽然Rails是线程安全的,但并不能保证用户的代码也是。如果你的程序没有在多线程环境中运行过,我们建议部署的时候将最大线程和最小线程数同时设置为1,也就是禁用线程模式
```bash
threads_count = 1
threads threads_count, threads_count
```
即便禁用了线程模式,用户仍然可以调整worker的数量来增加并非处理能力。每个进程是使用独立内存的,即便代码是非线程安全的也可以跨多个进程处理。
## 数据库连接
当调整线程数或进程数后,程序对数据库的连接也会增加,当检测到数据库连接丢失或时断时续情况时,就需要调整数据库的连接池大小。当Rails应用遇到连接池占满的时候会有如下报错:
```bash
ActiveRecord::ConnectionTimeoutError - could not obtain a database connection within 5 seconds
```
这意味着Rails的连接池设置得不够合理
## 部署到云帮
- 添加`puma`到`Gemfile`:
```bash
echo "gem 'puma'" >> Gemfile
# 本地安装puma并重新生成Gemfile.lock文件
bundler install
```
- `puma.ru`配置文件
```bash
$ cat ./config/puma.rb
workers Integer(ENV['WEB_CONCURRENCY'] || 2)
threads_count = Integer(ENV['MAX_THREADS'] || 5)
# 若程序不是线程安全的需要将threads_count设置为1
# threads_count = 1
threads threads_count, threads_count
preload_app!
rackup DefaultRackup
port ENV['PORT'] || 3000
environment ENV['RACK_ENV'] || 'development'
on_worker_boot do
# Worker specific setup for Rails 4.1+
# See: https://devcenter.heroku.com/articles/deploying-rails-applications-with-the-puma-web-server#on-worker-boot
ActiveRecord::Base.establish_connection
end
```
- 填写Procfile
{% include copy-clipboard.html %}
```bash
echo "web: bundle exec puma -C config/puma.rb" > ./Procfile
```
- 提交代码
```bash
git add .
git commit -m "add puma"
git push origin master
```
| 21.173145 | 176 | 0.765854 | yue_Hant | 0.836899 |
a691cd08c8cf0ffd42bcf2491a1caae91d0b6d1b | 8,487 | md | Markdown | README.md | DosMike/WebBooks | e9798bdea420387f5a8632095e1e7fdead4c45f5 | [
"MIT"
] | 2 | 2017-09-14T12:47:32.000Z | 2020-04-26T13:49:11.000Z | README.md | DosMike/WebBooks | e9798bdea420387f5a8632095e1e7fdead4c45f5 | [
"MIT"
] | 1 | 2020-04-26T13:53:17.000Z | 2021-04-12T14:35:36.000Z | README.md | DosMike/WebBooks | e9798bdea420387f5a8632095e1e7fdead4c45f5 | [
"MIT"
] | 2 | 2018-02-18T13:04:25.000Z | 2018-10-02T11:50:24.000Z | # WebBooks
Open websites in Minecraft
The main command is `/webbook` or `/wbk` for short. Or `/url` if you fancy that.
The syntax is as follows: `/<command> [-s [-a <author>] | -c] <url> [target]`.
* `url` is the website to load and display
* `target` is a optional parameter specifying who's supposed to see the website.
This parameter requires the permission `webbooks.url.other`
* `-c` will open the website paginated into the chat instead of as a book.
following links will open a book reguardless
* `-s` this can not be used with `-c` as it supressed any output. Instead the website will be stored away in a physical book.
Links that execute server-commands will not work well with those!<br>This option requires the permission `webbooks.url.save`
* `-a <author>` only affects `-s`. Will set the author of the book item. Supports color codes.
This option requires the permission `webbooks.url.author`
The base permission to use the command is `webbooks.url.base`
In order to browse domains, the permission `webbooks.browse.<domain>` is required as well. The permission node uses the domain reversed to allow the permission system to automatically manage subdomains. For example: `example.com` would require the permission `webbooks.browse.com.example`. That permission would also allow `minecraft.example.com`. And due to how the permission system works, `webbooks.browse` handles all domains.
### Config
The config file provides options to `Proxy` the requests. This is useful because every website is loaded by the game server, so visiting any website exposes the server's IP to that site. Not that you couldn't just look up the IP from the server name, but the option is there as a feature.
Keep in mind though that any response from the web server must arrive within 3 seconds before timing out!
With the `MOTD` you can specify a URL to display when a player joins your server. This might be useful for a dynamic greeting message, automatically updated rules or whatever else you come up with to write on your webserver.
The `ExtendedTooltips` option is on by default. It adds an extra line to every link, showing the command that will be run or the URL that will be opened. Tooltips can also be hidden per link (see the following section).
Specifying a `DefaultAuthor` is only required if you don't like the default author ('Saved Website') when `-s` is used to save the website to an item.
The `PageSelector` is probably the most important part in your project, as it determines how pages are split up. Minecraft has no way of doing automatic page breaks unfortunately, and this solution seems fine. The default selector is `ul.book li`.
Various player information is sent to the server, depending on the specified `TransportMethod`.
Valid values are described below.
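Purely for orientation, the options discussed in this section might look something like the sketch below. The key names come from the text above, but the layout, defaults and syntax are assumptions — always refer to the config file the plugin generates.

```
# Illustrative sketch only – not a verbatim copy of the generated config
Proxy = ""                        # optional proxy for outgoing requests (empty = direct)
MOTD = ""                         # URL shown as a book when a player joins (empty = off)
ExtendedTooltips = true           # add the Run:/URL: line to link hover texts
DefaultAuthor = "Saved Website"   # author used by -s when no -a is given
PageSelector = "ul.book li"       # CSS selector that splits the website into book pages
TransportMethod = "post/formdata" # one of: post/json, post/formdata, get/header
```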
## Web-Part:
Pages are by default selected with the CSS selector `ul.book li`. This can be changed in the config though, if you don't like it.
Player data are sent according to the transport method. If you're writing a service and you want to be able to service all types, you can distinguish them by request method and Content-Type header.
Example PHP showcasing how to serve a `post/formdata` request with the default `ul.book li` selector:
```
<?PHP
//Preparing data for display, split them on the delimiter '/'
$worldData = explode('/', $_POST['World']);
$worldName = $worldData[0];
$statusData = explode('/', $_POST['Status']);
$health = $statusData[0];
$level = $statusData[2];
$playerName = $_POST['Name'];
?>
<!DOCTYPE html>
<html>
<head>
<title>Test Book</title>
</head>
<body>
<ul class="book">
<li><u>This is <span class="mc-m">website</span> book!</u>
<br>
<br>Your User-Agent: <?= $_SERVER['HTTP_USER_AGENT'] ?>
<br>
<br><i>1</i> <a href="#2">2</a> <a href="#3">3</a>
<li>Hello, <?= $playerName ?> level <?= $level ?>
<br>
<br>You are currently in <?= $worldName ?> with <?= $health ?> HP
<br>
<br><a href="#1">1</a> <i>2</i> <a href="#3">3</a>
<li>Test some links:
<br>
<br><a href="kill" target="_player">Die now</a>
<br><a href="test.php">Reload site</a>
<br><a href="stop" target="_server" data-permission="webbooks.links.admin" title="Please dont :<">Kill the server</a>
<br><a href="http://www.google.com" target="_blank">Go to google</a>
<br>
<br><a href="#1">1</a> <a href="#2">2</a> <i>3</i>
</ul>
</body>
</html>
```
### Formatting your website
To reflect the limited set of formats available to Minecraft, a fixed set of style classes has to be used.
Those can in turn be defined in your stylesheet as well, to support displaying the content in an actual web browser.
The class names have a `mc-` prefix followed by the format code. Some bold red text could be:
```
<span class="mc-c mc-l">Text</span> or
<b><span class="mc-c">Text</span></b> or
<b class="mc-c">Text</b>
```
### Adding hover text
You can attach a line of text to be displayed on hover to pretty much any element by using the `title` attribute on the HTML node, in case an element needs further explanation.
### Links and special targets
Links will work as expected, loading the website and viewing it as another book.
As you might expect if you add `target="_blank"` to your link Minecraft will ask the player to open the link in the system web-browser.
But it would be pretty boring if that was everything a book could do, so there's a little bit more you can do with links:
* `<a href="#1">` will not jump to the section with the specified id, but instead jump to the given page number
* `<a href="command" target="_player">` will execute the command in href as if the player typed it. the `/` is optional.
* `<a href="command" target="_server">` will execute the command in href as the server with op-powers. These links will break if you save the website, but not cause any big errors (Besides telling the player that the callback stopped working)
Links with target "_player" and "_server" allow for an additional attribute `data-permission` to restrict usage.
Using this parameter on "_player" commands will break the link in saved books in the same way "_server" commands break in saved books.
An example would be `<a href="stop" target="_server" data-permission="server.admin.stop">Stop the server</a>`
The target along with the url will be shown in a hover text for each link like:
* `URL: url` for normal links
* `Extern: url` for links with target `_blank`
* `Run: command` for links with target `_player`
* `¶ Run: command` for links with target `_player` and a required permission
* `Server: command` for links with target `_server`
* `¶ Server: command` for links with target `_server` and a required permission
If you don't want this target text you can disable `ExtendedTooltips` in the config or add the attribute `data-title-hide-href`.
### Available player-data:
For `post/json` the request method will be POST and player data will be a json like this:
```json
{
"subject": {
"name": String, "uuid": UUID,
"health": Number, "foodLevel": Number,
"expLevel": Number, "gameMode": String
},
"location": {
"world": { "name": String, "uuid": UUID },
"position": { "x": Number, "y": Number, "z": Number }
},
"connection": {
"ip": String, "port": Number, "latency": Number,
"joined": { "first": IsoDate, "last": IsoDate }
}
}
```
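For comparison with the `post/formdata` PHP example above, a handler for this transport reads the JSON body instead of `$_POST`. A minimal sketch using only standard PHP (the field names follow the schema above):

```
<?php
// post/json transport: the player data arrives as a JSON document in the request body
$payload = json_decode(file_get_contents('php://input'), true);
if ($payload === null) {
    http_response_code(400);
    exit('Expected a JSON body');
}
$playerName = $payload['subject']['name'];
$level = $payload['subject']['expLevel'];
$health = $payload['subject']['health'];
$worldName = $payload['location']['world']['name'];
?>
<ul class="book">
  <li>Hello, <?= htmlspecialchars($playerName) ?> level <?= (int)$level ?>
    <br>You are currently in <?= htmlspecialchars($worldName) ?> with <?= (float)$health ?> HP
</ul>
```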
For `post/formdata` the request method will be POST and the data will be URL-form encoded. (New lines are only for readability.)
```
Name=<NAME>&
UUID=<UUID>&
World=<NAME>/<UUID>&
Location=<X>/<Y>/<Z>&
Connection=<IP>:<PORT>/<LATENCY>ms&
Joined=<FIRST>/<LAST>&
Status=<HEALTH>/<FOODLEVEL>/<EXPLEVEL>/<GAMEMODE>
```
For `get/header` the request method will be GET and the data will be sent in headers:
```
X-WebBook-User=<NAME>; <UUID>
X-WebBook-World=<NAME>; <UUID>
X-WebBook-Location=<X>; <Y>; <Z>
X-WebBook-Connection=<IP>:<PORT>; <LATENCY>ms
X-WebBook-Joined=<FIRST>; <LAST>
X-WebBook-Status=<HEALTH>; <FOODLEVEL>; <EXPLEVEL>; <GAMEMODE>
```
### User-Agent string:
##### Before 1.2:
`<MinecraftName>(<ExecutionType>/<Type>) <MinecraftVersion>/<SpongeName> <SpongeVersion>/WebBooks(webbook) <WebBooksVersion>`
##### Since 1.2:
`<MinecraftName>/<MinecraftVersion> <SpongeName>/<SpongeVersion>(<SpongePlatform>; <SpongeType>) WebBooks/<WebBooksVersion> (webbook; by DosMike)`
# This Plugin utilizes JSoup
Jsoup is licensed under the MIT License
[Project Homepage](https://jsoup.org/) &#160; &#160; [License Text](https://jsoup.org/license)
| 47.679775 | 429 | 0.713444 | eng_Latn | 0.985323 |
a691dfd3c1fde506088be151b3be0bf5a3c98183 | 8,755 | md | Markdown | libs/IconFontCppHeaders/README.md | thomasgeissl/lucediretta | 03e5dd17995fabd5d11009ea05440fade2bba2c4 | [
"MIT"
] | 1 | 2021-05-18T16:20:57.000Z | 2021-05-18T16:20:57.000Z | README.md | GloriousBreenie/IconFontCppHeaders | 6a7610757c02b83f43ce767105c010b46899af62 | [
"Zlib"
] | 17 | 2021-02-14T21:14:54.000Z | 2021-02-27T16:00:31.000Z | README.md | GloriousBreenie/IconFontCppHeaders | 6a7610757c02b83f43ce767105c010b46899af62 | [
"Zlib"
] | null | null | null | Support development of IconFontCppHeaders through [GitHub Sponsors](https://github.com/sponsors/dougbinks) or [Patreon](https://www.patreon.com/enkisoftware)
[<img src="https://img.shields.io/static/v1?logo=github&label=Github&message=Sponsor&color=#ea4aaa" width="200"/>](https://github.com/sponsors/dougbinks) [<img src="https://c5.patreon.com/external/logo/[email protected]" alt="Become a Patron" width="150"/>](https://www.patreon.com/enkisoftware)
# IconFontCppHeaders
[https://github.com/juliettef/IconFontCppHeaders](https://github.com/juliettef/IconFontCppHeaders)
C, C++ headers and C# classes for icon fonts Font Awesome, Fork Awesome, Google Material Design icons, Kenney game icons and Fontaudio.
A set of header files and classes for using icon fonts in C, C++ and C#, along with the python generator used to create the files.
Each header contains defines for one font, with each icon code point defined as ICON_*, along with the min and max code points for font loading purposes.
In addition the python script can be used to convert ttf font files to C and C++ headers.
Each ttf icon font file is converted to a C and C++ header file containing a single array of bytes.
To enable conversion, run the GenerateIconFontCppHeaders.py script with 'ttf2headerC = True'.
## Icon Fonts
### [Font Awesome](https://fontawesome.com)
* [https://github.com/FortAwesome/Font-Awesome](https://github.com/FortAwesome/Font-Awesome)
#### Font Awesome 4
* [icons.yml](https://github.com/FortAwesome/Font-Awesome/blob/fa-4/src/icons.yml)
* [fontawesome-webfont.ttf](https://github.com/FortAwesome/Font-Awesome/blob/fa-4/fonts/fontawesome-webfont.ttf)
#### Font Awesome 5 - see [notes below](#notes-about-font-awesome-5)
* [icons.yml](https://github.com/FortAwesome/Font-Awesome/blob/master/metadata/icons.yml)
* [fa-brands-400.ttf](https://github.com/FortAwesome/Font-Awesome/blob/master/webfonts/fa-brands-400.ttf)
* [fa-regular-400.ttf](https://github.com/FortAwesome/Font-Awesome/blob/master/webfonts/fa-regular-400.ttf)
* [fa-solid-900.ttf](https://github.com/FortAwesome/Font-Awesome/blob/master/webfonts/fa-solid-900.ttf)
#### Font Awesome 5 Pro - this is a paid product, see [notes below](#notes-about-font-awesome-5)
Files downloaded from [fontawesome.com](https://fontawesome.com)
* ..\fontawesome-pro-n.n.n-web\metadata\icons.yml
* ..\fontawesome-pro-n.n.n-web\webfonts\fa-brands-400.ttf
* ..\fontawesome-pro-n.n.n-web\webfonts\fa-light-300.ttf
* ..\fontawesome-pro-n.n.n-web\webfonts\fa-regular-400.ttf
* ..\fontawesome-pro-n.n.n-web\webfonts\fa-solid-900.ttf
### [Fork Awesome](https://forkawesome.github.io/Fork-Awesome)
* [https://github.com/ForkAwesome/Fork-Awesome](https://github.com/ForkAwesome/Fork-Awesome)
* [icons.yml](https://github.com/ForkAwesome/Fork-Awesome/blob/master/src/icons/icons.yml)
* [forkawesome-webfont.ttf](https://github.com/ForkAwesome/Fork-Awesome/blob/master/fonts/forkawesome-webfont.ttf)
### [Google Material Design icons](https://design.google.com/icons) - see [Issue #19](https://github.com/juliettef/IconFontCppHeaders/issues/19)
* [https://github.com/google/material-design-icons](https://github.com/google/material-design-icons)
* [codepoints](https://github.com/google/material-design-icons/blob/master/iconfont/codepoints)
* [MaterialIcons-Regular.ttf](https://github.com/google/material-design-icons/blob/master/iconfont/MaterialIcons-Regular.ttf)
### [Kenney Game icons](http://kenney.nl/assets/game-icons) and [Game icons expansion](http://kenney.nl/assets/game-icons-expansion)
* [https://github.com/nicodinh/kenney-icon-font](https://github.com/nicodinh/kenney-icon-font)
* [kenney-icons.css](https://github.com/nicodinh/kenney-icon-font/blob/master/css/kenney-icons.css)
* [kenney-icon-font.ttf](https://github.com/nicodinh/kenney-icon-font/blob/master/fonts/kenney-icon-font.ttf)
### [Fontaudio](https://github.com/fefanto/fontaudio)
* [https://github.com/fefanto/fontaudio](https://github.com/fefanto/fontaudio)
* [fontaudio.css](https://github.com/fefanto/fontaudio/blob/master/font/fontaudio.css)
* [fontaudio.ttf](https://github.com/fefanto/fontaudio/blob/master/font/fontaudio.ttf)
## Notes about Font Awesome 5
### Codepoints grouping
Font Awesome 5 splits the different styles of icons into different font files with identical codepoints for *light*, *regular* and *solid* styles, and a different set of codepoints for *brands*. We have put the brands into a separate header file.
### Generating Pro header files
Download the Font Awesome Pro Web package. To generate the headers, drop *icons.yml* in the same directory as *GenerateIconFontCppHeaders.py* before running the script. The file *icons.yml* is under *..\fontawesome-pro-n.n.n-web\metadata\icons.yml* where *n.n.n* is the version number.
## Ionicons and webfont Material Design Icons
Unsupported as of 29 Apr 2020. See [Issue #16](https://github.com/juliettef/IconFontCppHeaders/issues/16).
## Example Code
Using [Dear ImGui](https://github.com/ocornut/imgui) as an example UI library:
```Cpp
#include "IconsFontAwesome5.h"
ImGuiIO& io = ImGui::GetIO();
io.Fonts->AddFontDefault();
// merge in icons from Font Awesome
static const ImWchar icons_ranges[] = { ICON_MIN_FA, ICON_MAX_FA, 0 };
ImFontConfig icons_config; icons_config.MergeMode = true; icons_config.PixelSnapH = true;
io.Fonts->AddFontFromFileTTF( FONT_ICON_FILE_NAME_FAS, 16.0f, &icons_config, icons_ranges );
// use FONT_ICON_FILE_NAME_FAR if you want regular instead of solid
// in an imgui window somewhere...
ImGui::Text( ICON_FA_PAINT_BRUSH " Paint" ); // use string literal concatenation
// outputs a paint brush icon and 'Paint' as a string.
```
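When a font has been converted to a C/C++ header via `ttf2headerC = True`, the same merge can be done from the embedded byte array instead of a file. A hedged sketch — `font_data` and `font_data_size` are placeholders for whatever identifiers the generated header actually defines:

```Cpp
#include "IconsFontAwesome5.h"
// #include "generated_font_header.h" // hypothetical header produced with ttf2headerC = True

ImGuiIO& io = ImGui::GetIO();
io.Fonts->AddFontDefault();

static const ImWchar icons_ranges[] = { ICON_MIN_FA, ICON_MAX_FA, 0 };
ImFontConfig icons_config;
icons_config.MergeMode = true;
icons_config.PixelSnapH = true;
icons_config.FontDataOwnedByAtlas = false; // the data lives in the header; ImGui must not free it

// font_data / font_data_size stand in for the array and size from the generated header
io.Fonts->AddFontFromMemoryTTF( (void*)font_data, font_data_size, 16.0f, &icons_config, icons_ranges );
```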
## Projects using the font icon header files
### [Avoyd](https://www.enkisoftware.com/avoyd)
Avoyd is a 6 degrees of freedom voxel game that includes a voxel editor tool.
[www.avoyd.com](https://www.avoyd.com)
The voxel editor's UI uses Dear ImGui with Font Awesome icon fonts.

### [bgfx](https://github.com/bkaradzic/bgfx)
Cross-platform rendering library.
[bkaradzic.github.io/bgfx/overview](https://bkaradzic.github.io/bgfx/overview.html)
### [glChAoS.P](https://github.com/BrutPitt/glChAoS.P)
Real time 3D strange attractors scout.
[www.michelemorrone.eu/glchaosp](https://www.michelemorrone.eu/glchaosp)

### [iPlug2](https://github.com/iplug2/iplug2)
Cross platform C++ audio plug-in framework
[iplug2.github.io](https://iplug2.github.io)
### [Tracy Profiler](https://bitbucket.org/wolfpld/tracy/src/master/)
Real time, nanosecond resolution, remote telemetry frame profiler for games and other applications.
[](https://www.youtube.com/watch?v=uJkrFgriuOo)
### [Visual 6502 Remix](https://github.com/floooh/v6502r)
Transistor level 6502 Hardware Simulation
[floooh.github.io/visual6502remix](https://floooh.github.io/visual6502remix)
## Credits
Development - [Juliette Foucaut](http://www.enkisoftware.com/about.html#juliette) - [@juliettef](https://github.com/juliettef)
Requirements - [Doug Binks](http://www.enkisoftware.com/about.html#doug) - [@dougbinks](https://github.com/dougbinks)
[None](https://bitbucket.org/duangle/nonelang/src) language implementation and [refactoring](https://gist.github.com/paniq/4a734e9d8e86a2373b5bc4ca719855ec) - [Leonard Ritter](http://www.leonard-ritter.com/) - [@paniq](https://github.com/paniq)
Suggestion to add a define for the ttf file name - [Sean Barrett](https://nothings.org/) - [@nothings](https://github.com/nothings)
Initial Font Awesome 5 implementation - [Codecat](https://codecat.nl/) - [@codecat](https://github.com/codecat)
Suggestion to add Fork Awesome - [Julien Deswaef](http://xuv.be/) - [@xuv](https://github.com/xuv)
Suggestion to add Ionicons - [Omar Cornut](http://www.miracleworld.net/) - [@ocornut](https://github.com/ocornut)
C# language implementation - Rokas Kupstys - [@rokups](https://github.com/rokups)
Suggestion to add Material Design Icons - Gustav Madeso - [@madeso](https://github.com/madeso)
Fontaudio implementation - [Oli Larkin](https://www.olilarkin.co.uk/) - [@olilarkin](https://github.com/olilarkin)
Initial ttf to C and C++ headers conversion implementation - Charles Mailly - [@Caerind](https://github.com/Caerind)
| 61.223776 | 310 | 0.758766 | yue_Hant | 0.203519 |
a6920b94a5227d25f79662606d83049671b9e8b2 | 238 | md | Markdown | README.md | iamtalhaasghar/ghnotifier | 7bbcbc32abc8ad923bff64055cb19ac042a03764 | [
"MIT"
] | 1 | 2022-02-03T05:30:22.000Z | 2022-02-03T05:30:22.000Z | README.md | iamtalhaasghar/ghnotifier | 7bbcbc32abc8ad923bff64055cb19ac042a03764 | [
"MIT"
] | 5 | 2018-10-30T13:03:24.000Z | 2022-02-03T06:06:08.000Z | README.md | iamtalhaasghar/ghnotifier | 7bbcbc32abc8ad923bff64055cb19ac042a03764 | [
"MIT"
] | 1 | 2022-02-03T06:02:02.000Z | 2022-02-03T06:02:02.000Z | If you want to use this check my [Rust Port](https://github.com/kunicmarko20/ghnotifier-rs) instead.
Python app that will show you number of notifications in your status bar and
send you github notifications to your Linux notifications.
| 47.6 | 100 | 0.802521 | eng_Latn | 0.98862 |
a6921737ded297677df7532f33e9eb4d88f81825 | 1,461 | md | Markdown | 2020/12/30/2020-12-30 18:00.md | zhzhzhy/WeiBoHot_history | 32ce4800e63f26384abb17d43e308452c537c902 | [
"MIT"
] | 3 | 2020-07-14T14:54:15.000Z | 2020-08-21T06:48:24.000Z | 2020/12/30/2020-12-30 18:00.md | zhzhzhy/WeiBoHot_history | 32ce4800e63f26384abb17d43e308452c537c902 | [
"MIT"
] | null | null | null | 2020/12/30/2020-12-30 18:00.md | zhzhzhy/WeiBoHot_history | 32ce4800e63f26384abb17d43e308452c537c902 | [
"MIT"
] | null | null | null | 2020年12月30日18时数据
Status: 200
1.张若昀首谈女儿
微博热度:2830268
2.妻子擅自终止妊娠是否侵犯丈夫生育权
微博热度:1739175
3.小黑瓶国风玫瑰限定版
微博热度:1732766
4.李易峰对金晨说明年再约
微博热度:1720325
5.明确人工授精生子法律地位
微博热度:1652459
6.刘昊然最难忘的一课
微博热度:1555072
7.豆瓣崩了
微博热度:1327835
8.北极圈是划到烟台了吗
微博热度:1031729
9.李沁的保暖衣能充电
微博热度:1023813
10.以色列两老人接种美产疫苗后死亡
微博热度:1010120
11.李易峰金晨
微博热度:991647
12.台湾出现首例新冠病毒变种病例
微博热度:979479
13.辽宁沈阳立即启动应急响应
微博热度:974849
14.沈阳疫情
微博热度:951188
15.婚前父母帮买房的出资属于个人财产
微博热度:949046
16.金晨 以后去朋友家吃饭都不敢有异性
微博热度:935986
17.人社局介入员工连上2次厕所被罚款
微博热度:915302
18.青簪行
微博热度:900901
19.被周迅唱哭了
微博热度:896526
20.美出售香港总领馆房产计划未能达成
微博热度:890484
21.张雨绮熬夜写了篇长文
微博热度:889193
22.杨幂假装自己在夏天
微博热度:733886
23.2男子酒后割手腕结拜血流不止
微博热度:621856
24.丁飞俊退出创造营
微博热度:565837
25.特朗普当选全美最受钦佩男性
微博热度:561242
26.关晓彤军大衣裹羽绒服
微博热度:559622
27.中国方言真是博大精深
微博热度:558381
28.乐山
微博热度:414568
29.海南新增6家离岛免税店
微博热度:396494
30.淘宝和微信是不是可以相互接入
微博热度:393439
31.青你3制服有四种款式
微博热度:392053
32.阳光之下
微博热度:361511
33.Gugudan解散
微博热度:342295
34.上海将向医院推送医闹人员信息
微博热度:313200
35.父母不得因子女变更姓氏而拒付抚养费
微博热度:305407
36.被泼硫酸受伤严重女生已转广州治疗
微博热度:301558
37.从王源的发型看出上海风有多大
微博热度:297653
38.晴雅集片段
微博热度:290491
39.西安不夜城夜景有多美
微博热度:289128
40.首部抗疫纪录电影定档
微博热度:285421
41.警方通报乐山职院女生遇害案
微博热度:282875
42.鼓励员工在工作地休假
微博热度:280883
43.花小猪打车在京暂停服务一周
微博热度:277016
44.2020年度迷惑新闻
微博热度:275860
45.股市
微博热度:271745
46.河南购买进口冷链食品需实名
微博热度:268766
47.聊斋画风老君山
微博热度:267576
48.庞星火详解北京顺义疫情传播链
微博热度:264903
49.12月新增本土确诊病例104例
微博热度:263147
50.沈阳全面进入战时状态
微博热度:260448
| 7.161765 | 20 | 0.790554 | yue_Hant | 0.352166 |
a6922c7b1f57d590bc2348e514c2267ddd457cb1 | 848 | md | Markdown | list/front_end/jaywcjlove.md | Lawrence-zxc/programmer-list | f00974f4c8abf9cbd4326f982855c36c8cd5f535 | [
"Apache-2.0"
] | 39 | 2017-09-13T12:17:05.000Z | 2018-08-24T05:10:54.000Z | list/front_end/jaywcjlove.md | justinscript/programmer-list | f00974f4c8abf9cbd4326f982855c36c8cd5f535 | [
"Apache-2.0"
] | 4 | 2017-09-13T11:15:13.000Z | 2018-05-22T08:02:16.000Z | list/front_end/jaywcjlove.md | justinscript/programmer-list | f00974f4c8abf9cbd4326f982855c36c8cd5f535 | [
"Apache-2.0"
] | 7 | 2017-09-13T14:52:17.000Z | 2017-10-16T03:54:41.000Z | # 博客地址
[https://jaywcjlove.github.io/](https://jaywcjlove.github.io/)
前端开发工程师。敏捷实践者。生活在浮躁的上海。
人生苦短,我用JSLite.io;时光荏苒,光阴不再,建立梦想,以Coding为生,以Coding为乐。
[我的Github仓库](https://github.com/jaywcjlove)
## 我熟悉
* HTML: 这…我写出来好吗?
* javascript: 好像挺熟练的。
* CoffeeScript: 偶尔使用一下,并非我写js的担当,不是重度使用者。
* CSS: 我选择使用它Stylus
* ejs: 虽然 jade 也挺好,但是我只想要一个简单的帮我填充数据的模板。
* Git / Svn: 越来越喜欢Git
* grunt / gulp: 暂时喜欢grunt
* Photoshop: 以前拿它混饭吃
* Flash/ActionScript: 如果你拿刀架在我脖子上说不定我会捡起来。
## 我略知
* Node.js [express/mongodb/ejs/grunt/…]
* iOS/Mac开发 [做过一些小应用]
* PHP
* …
## 爱好
* 旅游
* 看电视
## 联系我
```
OS X: echo d293b2hvb0BxcS5jb20K | base64 -D
Linux: echo d293b2hvb0BxcS5jb20K | base64 -d
```
## 简历
[wcj for github](https://github.com/jaywcjlove/wcj) 输出很多资料都是真的哦,嘿嘿!您识别一下?
# 全局安装,安装报错是需要前面加上sudo
$ sudo npm install -g wcj
## 我常去这里逛逛
* mozilla
* segmentfault
* 前端乱炖
* GITHUB | 16.307692 | 73 | 0.712264 | yue_Hant | 0.568584 |
a692485820d51350b1a79f81502399d0a6a64b29 | 1,890 | md | Markdown | WindowsServerDocs/administration/windows-commands/manage-bde-changepassword.md | eltociear/windowsserverdocs.ru-ru | b3680abe906fa1141b42e297a8178cee4f81cf7d | [
"CC-BY-4.0",
"MIT"
] | null | null | null | WindowsServerDocs/administration/windows-commands/manage-bde-changepassword.md | eltociear/windowsserverdocs.ru-ru | b3680abe906fa1141b42e297a8178cee4f81cf7d | [
"CC-BY-4.0",
"MIT"
] | null | null | null | WindowsServerDocs/administration/windows-commands/manage-bde-changepassword.md | eltociear/windowsserverdocs.ru-ru | b3680abe906fa1141b42e297a8178cee4f81cf7d | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Manage-bde ChangePassword
description: Справочный раздел по команде Manage-bde ChangePassword, который изменяет пароль для диска данных.
ms.prod: windows-server
ms.technology: manage-windows-commands
ms.topic: article
ms.assetid: b174e152-8442-4fba-8b33-56a81ff4f547
author: coreyp-at-msft
ms.author: coreyp
manager: dongill
ms.date: 10/16/2017
ms.openlocfilehash: 28cc97165bfc33809c187630e37ad9b9bd24d7c6
ms.sourcegitcommit: 29bc8740e5a8b1ba8f73b10ba4d08afdf07438b0
ms.translationtype: MT
ms.contentlocale: ru-RU
ms.lasthandoff: 05/30/2020
ms.locfileid: "84222930"
---
# <a name="manage-bde-changepassword"></a>Manage-bde ChangePassword
Изменяет пароль для диска данных. Пользователю будет предложено ввести новый пароль.
## <a name="syntax"></a>Синтаксис
```
manage-bde -changepassword [<drive>] [-computername <name>] [{-?|/?}] [{-help|-h}]
```
### <a name="parameters"></a>Параметры
| Параметр | Описание |
| --------- | ----------- |
| `<drive>` | Представляет букву диска, за которой следует двоеточие. |
| -ComputerName | Указывает, что Manage-bde. exe будет использоваться для изменения защиты BitLocker на другом компьютере. Можно также использовать параметр **-CN** в качестве сокращенной версии этой команды. |
| `<name>` | Представляет имя компьютера, на котором необходимо изменить защиту BitLocker. Допустимые значения включают имя NetBIOS компьютера и IP-адрес компьютера. |
| -? или/? | Отображает краткую справку в командной строке. |
| -Help или-h | Отображает полную справку в командной строке. |
### <a name="examples"></a>Примеры
Чтобы изменить пароль, используемый для разблокировки BitLocker на диске данных D, введите:
```
manage-bde –changepassword D:
```
## <a name="additional-references"></a>Дополнительные ссылки
- [Условные обозначения синтаксиса команд командной строки](command-line-syntax-key.md)
- [Команда Manage-bde](manage-bde.md)
| 36.346154 | 210 | 0.756085 | rus_Cyrl | 0.713514 |
a69297e21e2f23d6975b2a792190b8904e284dca | 2,638 | md | Markdown | articles/vpn-gateway/point-to-site-how-to-vpn-client-install-azure-cert.md | KreizIT/azure-docs.fr-fr | dfe0cb93ebc98e9ca8eb2f3030127b4970911a06 | [
"CC-BY-4.0",
"MIT"
] | 43 | 2017-08-28T07:44:17.000Z | 2022-02-20T20:53:01.000Z | articles/vpn-gateway/point-to-site-how-to-vpn-client-install-azure-cert.md | KreizIT/azure-docs.fr-fr | dfe0cb93ebc98e9ca8eb2f3030127b4970911a06 | [
"CC-BY-4.0",
"MIT"
] | 676 | 2017-07-14T20:21:38.000Z | 2021-12-03T05:49:24.000Z | articles/vpn-gateway/point-to-site-how-to-vpn-client-install-azure-cert.md | KreizIT/azure-docs.fr-fr | dfe0cb93ebc98e9ca8eb2f3030127b4970911a06 | [
"CC-BY-4.0",
"MIT"
] | 153 | 2017-07-11T00:08:42.000Z | 2022-01-05T05:39:03.000Z | ---
title: Installer un certificat client point à site
titleSuffix: Azure VPN Gateway
description: Découvrez comment installer des certificats clients pour l’authentification par certificat P2S (Windows, Mac et Linux).
services: vpn-gateway
author: cherylmc
ms.service: vpn-gateway
ms.topic: how-to
ms.date: 09/03/2021
ms.author: cherylmc
ms.openlocfilehash: 917f60440d98924e5339f29fb99587eacf40b415
ms.sourcegitcommit: 0770a7d91278043a83ccc597af25934854605e8b
ms.translationtype: HT
ms.contentlocale: fr-FR
ms.lasthandoff: 09/13/2021
ms.locfileid: "124766252"
---
# <a name="install-client-certificates-for-p2s-certificate-authentication-connections"></a>Installer des certificats client pour des connexions d’authentification par certificat P2S
Quand une passerelle VPN P2S est configurée pour exiger l’authentification par certificat, un certificat client doit être installé localement sur chaque ordinateur client. Cet article vous aide à installer un certificat client localement sur un ordinateur client. Vous pouvez également utiliser [Intune](/mem/intune/configuration/vpn-settings-configure) pour installer certains profils et certificats de client VPN.
Si vous souhaitez générer un certificat client à partir d’un certificat racine auto-signé, consultez l’un des articles suivants :
* [Générer des certificats - PowerShell](vpn-gateway-certificates-point-to-site.md)
* [Générer des certificats - MakeCert](vpn-gateway-certificates-point-to-site-makecert.md)
* [Générer des certificats - Linux](vpn-gateway-certificates-point-to-site-linux.md)
## <a name="windows"></a><a name="installwin"></a>Windows
[!INCLUDE [Install on Windows](../../includes/vpn-gateway-certificates-install-client-cert-include.md)]
## <a name="mac"></a><a name="installmac"></a>Mac
>[!NOTE]
>Les clients VPN Mac sont pris en charge avec le [modèle de déploiement de Resource Manager](../azure-resource-manager/management/deployment-models.md) seulement. Ils ne sont pas pris en charge avec le modèle de déploiement Classic.
>
>
[!INCLUDE [Install on Mac](../../includes/vpn-gateway-certificates-install-mac-client-cert-include.md)]
## <a name="linux"></a><a name="installlinux"></a>Linux
Le certificat client Linux est installé sur le client dans le cadre de la configuration du client. Pour obtenir des instructions, voir [Configuration client - Linux](point-to-site-vpn-client-configuration-azure-cert.md#linuxinstallcli).
## <a name="next-steps"></a>Étapes suivantes
Passez aux étapes de configuration point à site pour [Créer et installer les fichiers de configuration du client VPN](point-to-site-vpn-client-configuration-azure-cert.md). | 56.12766 | 415 | 0.791509 | fra_Latn | 0.643409 |
a692ba25ad07f1aad663e9dc6bad44294506f86a | 2,245 | md | Markdown | README.md | JungJongSeok/JS-Android-Voice-Player | 987fafca1637f23a654466e44b9cd1431f67fef4 | [
"Apache-2.0"
] | 1 | 2019-08-05T08:20:36.000Z | 2019-08-05T08:20:36.000Z | README.md | JungJongSeok/JS-Android-Voice-Player | 987fafca1637f23a654466e44b9cd1431f67fef4 | [
"Apache-2.0"
] | null | null | null | README.md | JungJongSeok/JS-Android-Voice-Player | 987fafca1637f23a654466e44b9cd1431f67fef4 | [
"Apache-2.0"
] | null | null | null | # JS-Android-Utils

### You can use an IllegalArgumentException-safe Voice Recorder and Voice Player
# First
### Get Permission
In AndroidManifest.xml:
```
<uses-permission android:name="android.permission.RECORD_AUDIO" />
```
In your Activity code:
```
ActivityCompat.requestPermissions(this, arrayOf(Manifest.permission.RECORD_AUDIO), REQUEST_PERMISSION_CODE)
```
```
override fun onRequestPermissionsResult(requestCode: Int, permissions: Array<out String>, grantResults: IntArray) {
super.onRequestPermissionsResult(requestCode, permissions, grantResults)
if (requestCode == REQUEST_PERMISSION_CODE) {
grantResults.find { it == PackageManager.PERMISSION_DENIED }?.run {
// Denied something
} ?: run {
// Success something
}
}
}
```
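Before requesting, you can first check whether the permission is already granted. This is a minimal sketch using the standard AndroidX `ContextCompat` API (not part of this library); `REQUEST_PERMISSION_CODE` is the constant from the snippets above.
```
if (ContextCompat.checkSelfPermission(this, Manifest.permission.RECORD_AUDIO)
        != PackageManager.PERMISSION_GRANTED) {
    // Not granted yet: ask the user; the result arrives in onRequestPermissionsResult above
    ActivityCompat.requestPermissions(this, arrayOf(Manifest.permission.RECORD_AUDIO), REQUEST_PERMISSION_CODE)
} else {
    // Already granted: safe to start recording or playback
}
```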
# Function
### Voice Recorder
```
// if you want to define a custom path
val jsAudioRecorder = JSAudioRecorder(path, RECORD_DURATION_MILLIS.toInt())
// otherwise (default absolute path = cacheDir.absolutePath + "/AudioRecording.mp4")
val jsAudioRecorder = JSAudioRecorder(context, RECORD_DURATION_MILLIS.toInt())
```
### Voice Player
```
// if you want to define a custom path
val jsAudioPlayer = JSAudioPlayer(path, MediaPlayer.OnPreparedListener {
// do init something
}, MediaPlayer.OnCompletionListener {
// do complete something
})
// otherwise (default absolute path = cacheDir.absolutePath + "/AudioRecording.mp4")
val jsAudioPlayer = JSAudioPlayer(this, MediaPlayer.OnPreparedListener {
// do init something
}, MediaPlayer.OnCompletionListener {
// do complete something
})
```
### Option: CountDownTimer - You can show a remaining-time text while the media player is in progress.
```
val jsCountDownPlayerTimer = JSCountDownTimer(duration, interval, object : JSCountDownTimer.Listener {
override fun onFinish() {
// do finish something
}
override fun onTick(millisUntilFinished: Long) {
// do tick something
}
})
```
# How to use?
### Add to your project build.gradle
```
allprojects {
repositories {
...
maven { url "https://jitpack.io" }
}
}
```
### Add dependency
```
dependencies {
implementation 'com.github.JungJongSeok:JS-Android-Utils:1.0.0'
}
```
| 27.716049 | 115 | 0.69265 | yue_Hant | 0.329163 |
a693712976b603e6393b609291acf2ce1be42326 | 1,246 | md | Markdown | CHANGELOG.md | vincentqb/bashdot | f0823560147b128cef43c72dfad1d241a2cde3fd | [
"MIT"
] | 1 | 2019-03-08T19:49:40.000Z | 2019-03-08T19:49:40.000Z | CHANGELOG.md | weavenet/bashdot | 07ed373bc7d1a756156894b6a2fcf8d51349a963 | [
"MIT"
] | null | null | null | CHANGELOG.md | weavenet/bashdot | 07ed373bc7d1a756156894b6a2fcf8d51349a963 | [
"MIT"
] | null | null | null | # Changelog
All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
## [4.1.3] - 2019-06-15
### Changed
- Bug fix adding quotes to some variables
## [4.1.2] - 2019-04-10
### Changed
- Bug fix when removing rendered templates
## [4.1.1] - 2019-04-09
### Changed
- Support dotfiles directories with . (~/.dotfiles)
## [4.1.0] - 2019-03-30
### Changed
- Updated ignore files
- Error if variables are not set in rendered template
## [4.0.1] - 2019-03-28
### Changed
- Add ignored files
## [4.0.0] - 2019-03-28
### Changed
- Removed requirement for profiles directory
## [3.0.0] - 2019-03-26
### Added
- Template support
## [2.1.0] - 2019-03-25
### Changed
- Check for valid profile name
- Switch to log output
- Bug fix installing multiple profiles with similar path prefixes
## [2.0.0] - 2019-03-09
### Changed
- Renamed shared profile to default
- Renamed ls to profiles
- Added links
- Refactored to support profiles from multiple directories
- Uninstall now requires both the directory and the profile to uninstall
## [1.0.0] - 2019-03-09
### Added
- Initial Release
| 23.509434 | 87 | 0.697432 | eng_Latn | 0.949978 |
a693b7904addfba80774fc0698193004838c5bae | 769 | md | Markdown | dm_1/early_design.md | cyf-gh/cyf-cloud.back | afa2bbce52633be32e7a41f7321d3f82359c1523 | [
"MIT"
] | 1 | 2020-04-20T11:16:42.000Z | 2020-04-20T11:16:42.000Z | dm_1/early_design.md | cyf-gh/Vt.Server | 4578d2b1874cabe469b844d70870e6026a97792c | [
"MIT"
] | null | null | null | dm_1/early_design.md | cyf-gh/Vt.Server | 4578d2b1874cabe469b844d70870e6026a97792c | [
"MIT"
] | null | null | null | ## 概述
开发代号:Dossier Manager
### 技术栈
* SQLite
* Golang
### 核心
核心是http服务器端,返回文件流与文件的信息。
## 资源分类
资源(Resource)指任何的有价值的文件或文件夹和备份的总称。
资源分为:
* 目标资源(Target Resource):指任何的有价值的文件或文件夹
* 备份资源(Backup Resource):指任何的有价值的文件或文件夹的备份
### 特:通用属性
所有的目标资源(Target Resource)都具有这些属性:
* 标签(Tag):用于快速寻找某一类资源
* 元数据(Meta):一个资源本身的数据:如修改日期、文件大小、文件MD5校验值
* 备份列表(Backup List):所有资源都可以进行备份,应当列出资源的所有备份资源列表
* 文件描述(Description):为用户添加的文件描述文本,为Markdown格式
### 备份资源(Backup Resource)
一定是一个文件。
备份资源可以是一个安装包,也可以是一个压缩包。
#### 文件类型
\*msi,\*exe, \*pkg,
### 二进制包(Binary Package)
一定是一个文件夹。
二进制包代表某个文件夹下为一个可执行程序的实体,它包含以下额外属性。
* 运行兼容情况
* 目标操作系统二进制兼容性(从操作系统列表中选择一个操作系统为他可运行的目标平台)
#### 接口
返回:
* 二进制包的递归目录
#### ~~文件类型~~
### 图片(Image)
图片格式。
#### 文件类型
见常见的图片格式一览
## Tag
必须先添加tag才可以勾选tag | 10.118421 | 47 | 0.708713 | yue_Hant | 0.898451 |
a694637aa43863e8b2490d355eac67540286311a | 2,575 | md | Markdown | CHANGELOG.md | jdnavarro/smallcheck-instances | 87373ac6b66bec55eb1b88f72e6923addfaf130a | [
"BSD-3-Clause"
] | 4 | 2015-04-28T13:33:34.000Z | 2020-10-06T13:09:42.000Z | CHANGELOG.md | jdnavarro/smallcheck-instances | 87373ac6b66bec55eb1b88f72e6923addfaf130a | [
"BSD-3-Clause"
] | 14 | 2015-04-27T14:54:03.000Z | 2021-02-17T19:04:45.000Z | CHANGELOG.md | jdnavarro/smallcheck-instances | 87373ac6b66bec55eb1b88f72e6923addfaf130a | [
"BSD-3-Clause"
] | 6 | 2016-06-11T22:01:01.000Z | 2021-01-16T17:42:42.000Z | # Change Log
All notable changes to this project will be documented in this file. This file
follows the formatting recommendations from [Keep a
CHANGELOG](http://keepachangelog.com/). This project adheres to [Semantic
Versioning](http://semver.org/).
## [0.7.0.0] — Tuesday, 15 of September 2020
### Added
- A `Serial` instance for `Set`.
## [0.6.1.1] - Tuesday, 15 of September 2020
### Fixed
- Allow `base` versions up to `4.15`.
- Run continuous integration with more compiler versions, up to `GHC-8.10.2`
## [0.6.1] - 2019-02-17
### Fixed
- `base` version bump to `4.13`.
- Conditionally exclude instances already supported by `smallcheck-1.1.3`.
## [0.6] - 2016-07-03
### Added
- Support for `base-4.9` which comes bundled with `GHC-8.0.1`.
### Removed
- Support for `GHC-7.8.4`. For some reason with this version the `transformers`
dependency is pinned to 0.3.0.0. I don't have time to fix this issue but if
you know how to fix it PRs are welcome.
- Support for stack.
## [0.5.1] - 2015-09-01
### Fixed
- Instances of `Word` and `Int` now stop generating at their maximum bounds.
## [0.5] - 2015-08-31
### Changed
- `Text` and `ByteString` Serial instances are now exhaustive.
### Added
- `Serial` and `CoSerial` instances for `Word`, `Word8`, `Int16`.
- `Serial` and `CoSerial` instances for `Int8`, `Int16`.
## [0.4] - 2015-08-06
### Added
- Support for stack.
- `CoSerial` instances.
## [0.3] - 2015-05-25
### Added
- Serial instance for `Map`.
- `zipLogic` for *zipping* instances. Thanks to Roman Cheplyaka
[@feuerbach](https://github.com/feuerbach).
### Fixed
- Compatibility with GHC < 7.10.
## [0.2] - 2015-04-28
### Changed
- General renaming to, hopefully, make functions more clear.
### Fixed
- Series don't repeat elements anymore. Kudos to Roman Cheplyaka for
reporting this issue.
## [0.1] - 2015-04-27
### Added
- Initial set of utilities for creating `ByteString` and `Text` `Series`.
- `Serial` `ByteString` and `Text` instances.
[0.6.1]: https://github.com/jdnavarro/smallcheck-series/compare/v0.6...v0.6.1
[0.6]: https://github.com/jdnavarro/smallcheck-series/compare/v0.5.1...v0.6
[0.5.1]: https://github.com/jdnavarro/smallcheck-series/compare/v0.5...v0.5.1
[0.5]: https://github.com/jdnavarro/smallcheck-series/compare/v0.4...v0.5
[0.4]: https://github.com/jdnavarro/smallcheck-series/compare/v0.3...v0.4
[0.3]: https://github.com/jdnavarro/smallcheck-series/compare/v0.2...v0.3
[0.2]: https://github.com/jdnavarro/smallcheck-series/compare/v0.1...v0.2
[0.1]: https://github.com/jdnavarro/smallcheck-series/compare/49b5b0...v0.1
| 33.441558 | 79 | 0.693592 | eng_Latn | 0.79446 |
a6946e477d50465e915bdeaf7021eb429a95435f | 799 | md | Markdown | README.md | SHiziw/lily-pad | a3fdaa42d2c726ca8b904624c33f2b4941168bc3 | [
"MIT"
] | 128 | 2015-03-15T16:05:02.000Z | 2022-03-15T22:16:04.000Z | README.md | SHiziw/lily-pad | a3fdaa42d2c726ca8b904624c33f2b4941168bc3 | [
"MIT"
] | 17 | 2015-06-05T18:08:18.000Z | 2021-10-05T17:51:29.000Z | README.md | SHiziw/lily-pad | a3fdaa42d2c726ca8b904624c33f2b4941168bc3 | [
"MIT"
] | 70 | 2015-03-16T13:47:50.000Z | 2022-02-10T03:53:19.000Z | #  Lily Pad
### Test platform for real-time two-dimensional fluid/structure interaction simulations written in Processing.
[](http://dx.doi.org/10.5281/zenodo.16065)
Follow these [simple instructions to install and run Lily Pad](INSTALL_AND_RUN.md).
Here is the [guideline for contributing](CONTRIBUTING.md) to the project.
The [wiki](https://github.com/weymouth/lily-pad/wiki) has further information such as:
* [Documentation on the solver](https://github.com/weymouth/lily-pad/wiki/documentation).
* [Help with non-dimensionalization](https://github.com/weymouth/lily-pad/wiki/non-dimensional) can also be handy.

| 49.9375 | 114 | 0.772215 | eng_Latn | 0.428972 |
a694802ef00dfb991da579d812b8c8778593a0c9 | 2,061 | md | Markdown | docs/data/material/components/box/box.md | paales/material-ui | 80569a2609df79caec9f58164f45e1d258efba6e | [
"MIT"
] | 2,045 | 2022-02-06T15:55:53.000Z | 2022-03-31T23:38:57.000Z | docs/data/material/components/box/box.md | paales/material-ui | 80569a2609df79caec9f58164f45e1d258efba6e | [
"MIT"
] | 1,472 | 2022-02-06T14:49:30.000Z | 2022-03-31T21:38:20.000Z | docs/data/material/components/box/box.md | paales/material-ui | 80569a2609df79caec9f58164f45e1d258efba6e | [
"MIT"
] | 1,127 | 2022-02-06T17:57:41.000Z | 2022-03-31T22:24:17.000Z | ---
product: material-ui
title: React Box
components: Box
githubLabel: 'component: Box'
---
# Box
<p class="description">The Box component serves as a wrapper component for most of the CSS utility needs.</p>
The Box component packages [all the style functions](/system/basics/#all-inclusive) that are exposed in `@mui/system`.
{{"component": "modules/components/ComponentLinkHeader.js", "design": false}}
## Example
[The palette](/system/palette/) style function.
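The bundled demo source is not included in this document, so here is a rough sketch of the idea: the palette style function lets you set theme colors straight from props on `Box` (standard system prop names, arbitrary example values):

```jsx
<Box bgcolor="primary.main" color="primary.contrastText" p={2}>
  This Box uses the theme's primary palette
</Box>
```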
## The `sx` prop
All system properties are available via the [`sx` prop](/system/basics/#the-sx-prop).
In addition, the `sx` prop allows you to specify any other CSS rules you may need. Here's an example of how you can use it:
{{"demo": "BoxSx.js", "defaultCodeOpen": true }}
## Overriding MUI components
The Box component wraps your component.
By default it creates a new `<div>` DOM element; you can change this with the `component` prop.
Let's say you want to use a `<span>` instead:
{{"demo": "BoxComponent.js", "defaultCodeOpen": true }}
This works great when the changes can be isolated to a new DOM element.
For instance, you can change the margin this way.
However, sometimes you have to target the underlying DOM element.
As an example, you may want to change the border of the Button.
The Button component defines its own styles. CSS inheritance doesn't help.
To work around the problem, you can use the [`sx`](/system/basics/#the-sx-prop) prop directly on the child if it is a MUI component.
```diff
-<Box sx={{ border: '1px dashed grey' }}>
- <Button>Save</Button>
-</Box>
+<Button sx={{ border: '1px dashed grey' }}>Save</Button>
```
For non-MUI components, use the `component` prop.
```diff
-<Box sx={{ border: '1px dashed grey' }}>
- <button>Save</button>
-</Box>
+<Box component="button" sx={{ border: '1px dashed grey' }}>Save</Box>
```
## System props
As a CSS utility component, the `Box` also supports all [`system`](/system/properties/) properties. You can use them as props directly on the component.
For instance, a margin-top:
```jsx
<Box mt={2}>
```
| 30.761194 | 151 | 0.712761 | eng_Latn | 0.986904 |
a695f5756d7159a5a1dd8e900b0b6e92a041b5cd | 10,235 | md | Markdown | articles/active-directory/user-help/my-apps-portal-end-user-update-profile.md | FloKlaffenbach/azure-docs.de-de | 938684e8ca7296a11e35f658e389a5b952f786a7 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/active-directory/user-help/my-apps-portal-end-user-update-profile.md | FloKlaffenbach/azure-docs.de-de | 938684e8ca7296a11e35f658e389a5b952f786a7 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/active-directory/user-help/my-apps-portal-end-user-update-profile.md | FloKlaffenbach/azure-docs.de-de | 938684e8ca7296a11e35f658e389a5b952f786a7 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Aktualisieren von Profil- und Kontoinformationen über das Portal „Meine Apps“ – Azure AD
description: Erfahren Sie, wie Sie Ihr Profil und Ihre Geschäfts-, Schul- oder Unikontoinformationen aktualisieren können, einschließlich Ändern Ihres Kennworts, Zurücksetzen des Kennworts, Aktualisieren Ihrer Sicherheitsüberprüfungsmethoden, Anzeigen des Hinweises mit den Nutzungsbedingungen Ihres Unternehmens und Abmelden von überall dort, wo Sie sich mit Ihrem Geschäfts-, Schul- oder Unikonto angemeldet haben.
services: active-directory
author: curtand
manager: daveba
ms.service: active-directory
ms.subservice: user-help
ms.workload: identity
ms.topic: conceptual
ms.date: 02/03/2020
ms.author: curtand
ms.reviewer: kasimpso
ms.custom: user-help, seo-update-azuread-jan
ms.openlocfilehash: a6bcfa7fc58d47e64bff0838ff698bc59eda4e70
ms.sourcegitcommit: 21e33a0f3fda25c91e7670666c601ae3d422fb9c
ms.translationtype: HT
ms.contentlocale: de-DE
ms.lasthandoff: 02/05/2020
ms.locfileid: "77022278"
---
# <a name="update-your-profile-and-account-information-on-the-my-apps-portal"></a>Aktualisieren Ihrer Profil- und Kontoinformationen im Portal „Meine Apps“
Sie können Ihr Geschäfts-, Schul- oder Unikonto mit dem webbasierten Portal **Meine Apps** für folgende Aufgaben verwenden:
- Anzeigen und Starten vieler cloudbasierter Apps Ihrer Organisation
- Aktualisieren einiger Ihrer Profil- und Kontoinformationen
- Anzeigen von Informationen zu Ihren **Gruppen**
- Ausführen von **Zugriffsüberprüfungen** für Ihre Apps und Gruppen
Sollten Sie keinen Zugriff auf das Portal **Meine Apps** haben, wenden Sie sich an Ihren Helpdesk.
[!INCLUDE [preview-notice](../../../includes/active-directory-end-user-my-apps-portal.md)]
> [!Important]
> Dieser Inhalt richtet sich an Benutzer von **Meine Apps**. Administratoren können sich in der [Dokumentation zur Anwendungsverwaltung](https://docs.microsoft.com/azure/active-directory/manage-apps) über die Einrichtung und Verwaltung cloudbasierter Apps informieren.
## <a name="view-your-organization-related-profile-information"></a>Anzeigen der auf Ihre Organisation bezogenen Profilinformationen
Je nachdem, was Ihre Organisation auf der Seite **Profil** des Portals **Meine Apps** ausgewählt hat, sehen Sie möglicherweise Ihre spezifischen arbeitsbezogenen Details sowie Ihre Geräte und Aktivitäten und alle weiteren Organisationen, denen Sie angehören.
### <a name="to-view-your-profile-information"></a>So zeigen Sie Ihre Profilinformationen an
1. Melden Sie sich bei Ihrem Geschäfts-, Schul- oder Unikonto an, und [wechseln Sie zum Portal **Meine Apps**](my-apps-portal-end-user-access.md).
2. Wählen Sie rechts oben auf der Seite **Apps** an der Stelle mit Ihrem Namen und Ihrer Organisation Ihr Profilbild und dann **Profil** aus.
Die Seite **Profil** mit Ihren Profilinformationen wird angezeigt.

3. Auf der Seite **Profil** haben Sie folgende Möglichkeiten:
- **Überprüfen der auf Ihre Organisation bezogenen Details.** Sie können Ihr Foto, Ihren Namen, Titel, zugehörige E-Mail-Adressen und arbeitsplatzbezogene Informationen einsehen. Diese Informationen werden von Ihrer Organisation verwaltet und können nicht von Ihnen geändert werden. Wenn Sie einen Fehler erkennen, wenden Sie sich an Ihren Helpdesk.
- **Überprüfen Ihrer Geräte und Aktivitäten**. Stellen Sie sicher, dass jedes Gerät Ihnen bekannt und ordnungsgemäß mit Ihrer Organisation verbunden ist. Wenn Sie ein Gerät nicht erkennen, wählen Sie **Gerät deaktivieren** aus, um die Zuordnung zu Ihrem Konto aufzuheben. Nachdem Sie ein Gerät deaktiviert haben, wird es von dieser Seite entfernt.
- **Überprüfen Ihrer Organisationen**. Vergewissern Sie sich, dass Sie weiterhin für jede der angegebenen Organisationen tätig sind. Wenn Sie nicht mehr für eine Organisation arbeiten, sollten Sie unbedingt auf **Zum Verlassen der Organisation anmelden** klicken. Nachdem Sie die Organisation verlassen haben, wird sie von dieser Seite entfernt.
## <a name="manage-your-work-or-school-account-information"></a>Verwenden der Informationen zu Ihrem Geschäfts-, Schul- oder Unikonto
Im Portal **Meine Apps** können Sie auf der Seite **Profil** die Informationen zu Ihrem Geschäfts-, Schul- oder Unikonto aktualisieren und verwalten. Auf dieser Seite haben Sie folgende Möglichkeiten:
- Ändern des Kennworts Ihres Geschäfts-, Schul- oder Unikontos
- Aktivieren der Zurücksetzung des Kennworts (sofern Ihr Administrator diese Möglichkeit aktiviert hat)
- Angeben zusätzlicher Informationen zur Sicherheitsüberprüfung
- Überprüfen der Nutzungsbedingungen Ihrer Organisation
- Überall abmelden
## <a name="change-your-password"></a>Ändern des Kennworts
Wenn Sie das Kennwort Ihres Geschäfts-, Schul- oder Unikonto ändern möchten, können Sie auf der Seite **Profil** im Bereich **Konto verwalten** den Befehl **Kennwort ändern** auswählen.
### <a name="to-change-your-password"></a>So ändern Sie das Kennwort
1. Wählen Sie auf der Seite **Profil** im Bereich **Konto verwalten** die Option **Kennwort ändern**.
2. Stellen Sie auf der Seite **Kennwort ändern** sicher, dass Ihre Benutzer-ID stimmt, und geben Sie dann Ihr altes und neues Kennwort in die Felder ein.

3. Klicken Sie auf **Submit** (Senden).
Ihr Kennwort wurde geändert. Sie müssen sich mit Ihrem Geschäfts-, Schul- oder Unikonto bei allen Apps anmelden, bei denen Sie sich zuvor angemeldet haben.
## <a name="set-up-and-use-password-reset"></a>Einrichten und Nutzen der Kennwortzurücksetzung
Wenn Sie Ihr Kennwort vergessen haben, vom Support Ihres Unternehmens keines erhalten haben oder aus Ihrem Konto ausgesperrt wurden, können Sie Ihr eigenes Kennwort zurückzusetzen.
>[!Important]
>Ihr Administrator muss diese Möglichkeit aktivieren, und Sie müssen sich für die Teilnahme registrieren. Weitere Informationen zum Registrieren und Zurücksetzen des Kennworts finden Sie unter [Registrieren für die Self-Service-Kennwortzurücksetzung](active-directory-passwords-reset-register.md) und [Zurücksetzen des Kennworts Ihres Geschäfts-, Schul- oder Unikontos](active-directory-passwords-update-your-own-password.md).
## <a name="change-your-security-verification-information"></a>Ändern der Informationen zur Sicherheitsüberprüfung
Wenn Ihr Unternehmen von Ihnen eine zweistufige Überprüfung verlangt, können Sie Ihre zugehörigen Sicherheitsinformationen auf der Seite **Zusätzliche Sicherheitsprüfung** hinzufügen, aktualisieren und löschen.
Bei der zweistufigen Überprüfung müssen Sie zwei Überprüfungsinformationen verwenden, wie z.B. ein Kennwort und eine PIN, bevor Sie auf Ihr Konto oder die Informationen Ihrer Organisation zugreifen können. Weitere Informationen zur zweistufigen Überprüfung finden Sie unter [Einrichten meines Kontos für die Überprüfung in zwei Schritten](multi-factor-authentication-end-user-first-time.md).
### <a name="to-change-your-security-information"></a>So ändern Sie Ihre Sicherheitsinformationen
1. Wählen Sie auf der Seite **Profil** im Bereich **Konto verwalten** die Option **Zusätzliche Sicherheitsüberprüfung** aus.

2. Auf der Seite **Zusätzliche Sicherheitsüberprüfung** können Sie die folgenden Informationen hinzufügen, ändern oder löschen:
- **Standardüberprüfungsoption**. Wählen Sie die standardmäßige sekundäre Methode für die zweistufige Überprüfung aus. Diese Methode wird automatisch verwendet, wenn eine zweistufige Überprüfung erforderlich ist, nachdem Sie Ihren Benutzernamen und Ihr Kennwort eingegeben haben.
- **Hinzufügen, Aktualisieren oder Entfernen von Überprüfungsmethoden**. Sie können neue Informationen hinzufügen, bestehende aktualisieren oder alte Informationen löschen, die nicht mehr gültig sind.
- **Einrichten der Microsoft Authenticator-App**. Sie können die Microsoft Authenticator-App so einrichten, dass sie als Ihre Überprüfungsmethode fungiert. Weitere Informationen zur Microsoft Authenticator-App finden Sie im Artikel [Was ist die Microsoft Authenticator-App?](user-help-auth-app-overview.md).
3. Wählen Sie **Speichern**, um Ihre Änderungen zu speichern.
## <a name="review-your-organizations-terms-of-use-statement"></a>Überprüfen des Hinweises mit den Nutzungsbedingungen Ihrer Organisation
Sie können den Hinweis mit den Nutzungsbedingungen Ihrer Organisation überprüfen, sofern verfügbar.
1. Wählen Sie auf der Seite **Profil** im Bereich **Konto verwalten** die Option **Nutzungsbedingungen lesen** aus.
2. Überprüfen Sie die Nutzungsbedingungen Ihres Unternehmens, und klicken Sie auf **Akzeptieren**, um zu bestätigen, dass Sie die Nutzungsbedingungen Ihrer Organisation gelesen und verstanden haben.

Wenn Ihr Unternehmen keine Nutzungsbedingungen hat, können Sie auf **Fertig** klicken, um zur Seite **Profil** zurückzukehren.
## <a name="sign-out-of-everywhere"></a>Überall abmelden
Sie müssen sich mit Ihrem Geschäfts-, Schul- oder Unikonto bei allen Apps abmelden, bei denen Sie derzeit angemeldet sind. Dies schließt alle Apps und Geräte ein.
### <a name="to-sign-out-of-everywhere"></a>So melden Sie sich überall ab
1. Wählen Sie auf der Seite **Profil** im Bereich **Konto verwalten** die Option **Überall abmelden**.
2. Klicken Sie im Bestätigungsfeld **Überall abmelden** auf **Ja**, um zu bestätigen, dass Sie sich von allen Sitzungen und Geräten abmelden möchten. Klicken Sie auf **Nein**, sollten Sie Ihre Meinung ändern.
## <a name="next-steps"></a>Nächste Schritte
Nach Abschluss der Änderungen auf der Seite **Profil** haben Sie folgende Möglichkeiten:
- [Zugreifen auf und Verwenden von Apps im Portal „Meine Apps“](my-apps-portal-end-user-access.md)
- [Anzeigen und Aktualisieren gruppenbezogener Informationen](my-apps-portal-end-user-groups.md)
- [Durchführen eigener Zugriffsüberprüfungen](my-apps-portal-end-user-access-reviews.md)
| 68.233333 | 426 | 0.795213 | deu_Latn | 0.997883 |
a696dde722f1af43bf2a44a92f37ca2d524bc962 | 5,172 | md | Markdown | README.md | lijijordan/hifish-ui | 0b0f5416ec401467ccf28e5e05fa6d8d43163d61 | [
"MIT"
] | null | null | null | README.md | lijijordan/hifish-ui | 0b0f5416ec401467ccf28e5e05fa6d8d43163d61 | [
"MIT"
] | null | null | null | README.md | lijijordan/hifish-ui | 0b0f5416ec401467ccf28e5e05fa6d8d43163d61 | [
"MIT"
] | null | null | null | # Kite: Responsive Admin Template #
Kite is a multipurpose and fully responsive admin template. It is ideal for any type of admin application, from custom admin panels and dashboards to eCommerce and CMS backends. It has a clean and intuitive layout that is easy to work with. Built with Bootstrap v3, it looks perfect on all major browsers, tablets, and phones. Kite comes with several ready pages you can start your work with, and a complete set of custom UI elements that will make it easy for you to adjust the template for your needs.
### Key features ###
* Built with Bootstrap v3
* Fully responsive layout
* Vanilla CSS and source LESS files included
* Datatables plugin included
* Chart.js plugin included
* Functioning contact form included
* Complete set of custom UI elements included
* Clean and developer friendly code
* Free updates
* Free support
### Pages ###
* UI elements documentation
* Account: Profile
* Account: Edit profile
* Account: Inbox
* Account: Sign In (and password recovery)
* Account: Sign Up
* Orders
* FAQ’s
* Contact us
* Dummy page
### UI elements ###
* Tables
* Forms
* Charts
* Alerts
* Buttons
* Pagination
* Panels
* Progress bars
* Tabs
* Typography
The current release is 1.0.0. By buying this template now, you become eligible to download all of its future updates for free, including new pages and neat options.
### Stay tuned ###
Follow us on Twitter and Facebook to instantly know of new templates and updates released:
https://twitter.com/YevSim, https://www.facebook.com/simpleqode
If you like our work, feel free to contact us with new feature requests or ideas for future templates:
http://simpleqode.com/#contact
Your feedback would be highly appreciated.
# General instructions #
Template structure:
/assets/bootstrap -- original Bootstrap files
/assets/css -- compiled CSS files
/assets/less -- source LESS files
/assets/js -- custom JS scripts
/assets/plugins -- third-party plugins
/assets/img -- images
/assets/ico -- favicon image
Q: How do I create a new page?
A: You can start with dummy.html. It contains the general wireframe, which includes navbar, sidebar, and footer.
Q: How do I edit the styles?
A: You can either work with vanilla CSS or source LESS files included. For vanilla CSS, please open assets/css/styles.css. Source LESS files are located at assets/less/. If you work with LESS files, after the changes are done, you only need to compile the main styles.less file located at assets/less/styles.less. This is the main LESS file that imports all of the other LESS files (including original Bootstrap source LESS files). Please visit http://lesscss.org/usage/index.html#third-party-compilers to find out how you can compile LESS files to CSS.
Q: How do I change the color scheme?
A: Changing a color scheme is easy with LESS. Please open /assets/less/components/colors.less, change the value of the @brand-primary and @brand-accent variables (also light and dark versions of those) and recompile the main styles.less file.
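For illustration only: the two variable names come from the answer above, the color values are arbitrary, and the exact names of the light/dark variants are not listed here. The edit in colors.less looks roughly like this (recompile styles.less afterwards):

```
#!less
// example values only
@brand-primary: #3f51b5;
@brand-accent: #ff4081;
```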
Q: How do I set up the contact form?
A:
This template contains a fully functioning PHP contact form with spam protection powered by reCaptcha. Note: The contact form will not work in your local environment without a server that supports PHP. In order to set up the contact form, please follow the steps below:
1) Open config.php and fill out the required information (a sample sketch follows this list):
- reCaptcha Site ($publickey) and Secret ($privatekey) keys
Please go to https://www.google.com/recaptcha/admin/create if you don't have the keys yet.
- Sender name and email address ($mail_sender)
This is a name and email address you will see in the "From:" line of new emails you will receive.
- Your email address ($to_email)
This is an email address new emails will be sent to.
- Email subject ($mail_subject)
This is a subject of new emails you will receive.
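As an illustration only: the variable names below come from the list above, but the values are placeholders and the exact format config.php expects (for example, whether $mail_sender combines the name and address in one string) may differ.

```
#!php
<?php
// example placeholder values, replace with your own
$publickey = 'YOUR_RECAPTCHA_SITE_KEY'; // reCaptcha Site key
$privatekey = 'YOUR_RECAPTCHA_SECRET_KEY'; // reCaptcha Secret key
$mail_sender = 'Kite Contact Form <noreply@example.com>'; // sender name and email address
$to_email = 'you@example.com'; // where new messages are delivered
$mail_subject = 'New message from your contact form';
```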
2) Insert your reCaptcha Site key (see Step 1) in /contact.html:
```
#!html
<div class="g-recaptcha" data-sitekey="YOUR_SITE_KEY"></div>
<!-- (e.g. <div class="g-recaptcha" data-sitekey="09sdv0sf9v0sdf9b0df9b09dfb"></div>) -->
```
3) Save all files.
# Sources and credits #
### General files ###
* Twitter Bootstrap
URL: http://getbootstrap.com/
AUTHOR: @mdo and @fat
LICENSE: MIT
### Plugins ###
* Font Awesome
URL: http://fontawesome.io/
AUTHOR: Dave Gandy
LICENSE: MIT
* Chart.js
URL: http://www.chartjs.org/
AUTHOR: https://github.com/chartjs/Chart.js/graphs/contributors
LICENSE: MIT
* CountTo
URL: https://github.com/mhuggins/jquery-countTo
AUTHOR: Matt Huggins
LICENSE: MIT
* Datatables
URL: https://www.datatables.net/
AUTHOR: SpryMedia Ltd
LICENSE: MIT
* Perfect scrollbar
URL: https://noraesae.github.io/perfect-scrollbar/
AUTHOR: Hyunje Alex Jun
LICENSE: MIT
### Images ###
URL: https://www.flickr.com/photos/meunier_prof/galleries/72157632677105635/
LICENSE: Creative Commons
# Changelog #
Version 1.0.0 - May 06, 2016
* Initial release | 29.724138 | 554 | 0.720224 | eng_Latn | 0.964831 |
a6974a6fe7a1c4898250e2bee5298f9391f6a828 | 14,103 | md | Markdown | reference/7.1/PowerShellGet/Find-Script.md | sashizaki/PowerShell-Docs.ja-jp | ba9baa612916ad75ceab1407fcc66d51f6120277 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | reference/7.1/PowerShellGet/Find-Script.md | sashizaki/PowerShell-Docs.ja-jp | ba9baa612916ad75ceab1407fcc66d51f6120277 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | reference/7.1/PowerShellGet/Find-Script.md | sashizaki/PowerShell-Docs.ja-jp | ba9baa612916ad75ceab1407fcc66d51f6120277 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
external help file: PSModule-help.xml
keywords: powershell,コマンドレット
Locale: en-US
Module Name: PowerShellGet
ms.date: 06/09/2017
online version: https://docs.microsoft.com/powershell/module/powershellget/find-script?view=powershell-7.1&WT.mc_id=ps-gethelp
schema: 2.0.0
title: Find-Script
ms.openlocfilehash: 443795160fcf5b11ffdc2d7d4e6f5d265b160002
ms.sourcegitcommit: 22c93550c87af30c4895fcb9e9dd65e30d60ada0
ms.translationtype: MT
ms.contentlocale: ja-JP
ms.lasthandoff: 11/19/2020
ms.locfileid: "94892421"
---
# Find-Script
## 概要
スクリプトを検索します。
## SYNTAX
```
Find-Script [[-Name] <String[]>] [-MinimumVersion <String>] [-MaximumVersion <String>]
[-RequiredVersion <String>] [-AllVersions] [-IncludeDependencies] [-Filter <String>] [-Tag <String[]>]
[-Includes <String[]>] [-Command <String[]>] [-Proxy <Uri>] [-ProxyCredential <PSCredential>]
[-Repository <String[]>] [-Credential <PSCredential>] [-AllowPrerelease] [<CommonParameters>]
```
## Description
**検索スクリプト** コマンドレットは、登録されているリポジトリ内の指定したスクリプトを検索します。
## 例
### 例 1: 使用可能なすべてのスクリプトを検索する
```
PS C:\> Find-Script
Version Name Type Repository Description
------- ---- ---- ---------- -----------
2.5 Fabrikam-ClientScript Script LocalRepo1 Description for the Fabrikam-ClientScript script
2.5 Fabrikam-Script Script LocalRepo1 Description for the Fabrikam-Script script
2.5 Fabrikam-ServerScript Script LocalRepo1 Description for the Fabrikam-ServerScript script
2.5 Required-Script1 Script LocalRepo1 Description for the Required-Script1 script
2.5 Required-Script2 Script LocalRepo1 Description for the Required-Script2 script
2.5 Required-Script3 Script LocalRepo1 Description for the Required-Script3 script
2.0 Script-WithDependencies1 Script LocalRepo1 Description for the Script-WithDependencies1 script
2.0 Script-WithDependencies2 Script LocalRepo1 Description for the Script-WithDependencies2 script
2.0 Start-WFContosoServer Script LocalRepo1 Start-WFContosoServer Script example
2.1 Test-Script1 Script LocalRepo1 Test-Script1 Script example
2.0 Test-Script2 Script LocalRepo1 Test-Script2 Script example
1.0 TestRunbook Script LocalRepo1 Contoso Script example
```
このコマンドは、使用可能なすべてのスクリプトを検索します。
### 例 2: 名前を指定してスクリプトを検索する
```
PS C:\> Find-Script -Name "Start-WFContosoServer"
Version Name Type Repository Description
------- ---- ---- ---------- -----------
2.0 Start-WFContosoServer Script LocalRepo1 Start-WFContosoServer Script example
```
このコマンドは、WFContosoServer という名前のスクリプトを検索します。
### 例 3: 名前、必要なバージョン、および指定されたリポジトリからスクリプトを検索する
```
PS C:\> Find-Script -Name "Required-Script2" -RequiredVersion 2.0 -Repository "LocalRepo01"
```
このコマンドは、LocalRepo01 リポジトリで名前と必要なバージョンを使ってスクリプトを検索します。
### 例 4: スクリプトを検索し、出力を一覧として書式設定する
```
PS C:\> Find-Script -Name "Required-Script2" -RequiredVersion 2.0 -Repository "LocalRepo1" | Format-List * -Force
Name : Required-Script2
Version : 2.0
Type : Script
Description : Description for the Required-Script2 script
Author : pattif
CompanyName : Microsoft Corporation
Copyright : 2015 Microsoft Corporation. All rights reserved.
PublishedDate : 8/14/2015 2:37:01 PM
LicenseUri : http://required-script2.com/license
ProjectUri : http://required-script2.com/
IconUri : http://required-script2.com/icon
Tags : {, Tag1, Tag2, Tag-Required-Script2-2.0...}
Includes : {Function, DscResource, Cmdlet, Command}
PowerShellGetFormatVersion :
ReleaseNotes : Required-Script2 release notes
Dependencies : {}
RepositorySourceLocation : C:\MyLocalRepo
Repository : LocalRepo01
PackageManagementProvider : NuGet
```
このコマンドは、LocalRepo1 リポジトリ内の Required-Script2 を検索し、結果として得られる **できる psrepositoryiteminfo** オブジェクトを Format-List コマンドレットに渡します。
### 例 5: 指定したバージョンの範囲でスクリプトを検索する
```
PS C:\> Find-Script -Name "Required-Script2" -MinimumVersion 2.1 -MaximumVersion 2.5 -Repository "LocalRepo1"
Version Name Type Repository Description
------- ---- ---- ---------- -----------
2.5 Required-Script2 Script LocalRepo1 Description for the Required-Script2 script
```
このコマンドは、LocalRepo1 リポジトリ内のバージョン2.1 と2.5 の間の RequiredScript2 のすべてのバージョンを検索します。
### 例 6: スクリプトのすべてのバージョンを検索する
```
PS C:\> Find-Script -Name "Required-Script02" -AllVersions
Version Name Type Repository Description
------- ---- ---- ---------- -----------
1.0 Required-Script2 Script LocalRepo1 Description for the Required-Script2 script
1.5 Required-Script2 Script LocalRepo1 Description for the Required-Script2 script
2.0 Required-Script2 Script LocalRepo1 Description for the Required-Script2 script
2.5 Required-Script2 Script LocalRepo1 Description for the Required-Script2 script
```
このコマンドは、Script02 のすべてのバージョンを検索します。
### 例 7: スクリプトとその依存関係を検索する
```
PS C:\> Find-Script -Name "Script-WithDependencies1" -IncludeDependencies -Repository "LocalRepo1"
Version Name Type Repository Description
------- ---- ---- ---------- -----------
1.0 Script-WithDependencies1 Script LocalRepo1 Description for the Script-WithDependencies1 script
2.0 RequiredModule3 Script LocalRepo1 RequiredModule3 module
2.5 Required-Script1 Script LocalRepo1 Description for the Required-Script1 script
2.5 Required-Script2 Script LocalRepo1 Description for the Required-Script2 script
```
このコマンドは、スクリプトとその依存関係を検索します。
### 例 8: 指定したタグを持つスクリプトを検索する
```
PS C:\> Find-Script -Tag "Tag1" -Repository "LocalRepo1"
Version Name Type Repository Description
------- ---- ---- ---------- -----------
1.0 Fabrikam-ClientScript Script LocalRepo1 Description for the Fabrikam-ClientScript script
```
このコマンドは、LocalRepo1 リポジトリに Tag1 タグがあるスクリプトを検索します。
### 例 9: 指定したコマンド名を使用してスクリプトを検索する
```
PS C:\> Find-Script -Command Test-FunctionFromScript_Required-Script3 -Repository "LocalRepo1"
Version Name Type Repository Description
------- ---- ---- ---------- -----------
2.5 Required-Script3 Script LocalRepo1 Description for the Required-Script3 script
```
このコマンドは、指定されたコマンド名を含むスクリプトを検索します。
### 例 10: ワークフローを使用してスクリプトを検索する
```
PS C:\> Find-Script -Includes "Workflow" -Repository "LocalRepo1"
Version Name Type Repository Description
------- ---- ---- ---------- -----------
2.5 Fabrikam-ClientScript Script LocalRepo1 Description for the Fabrikam-ClientScript script
1.0 Fabrikam-Script Script LocalRepo1 Description for the Fabrikam-Script script
```
このコマンドは、LocalRepo1 リポジトリ内のワークフロースクリプトを検索します。
### 例 11: ワイルドカードを使用してスクリプトを検索する
```
PS C:\> Find-Script -Name "Required-Script*" -Repository "LocalRepo1"
Version Name Type Repository Description
------- ---- ---- ---------- -----------
2.5 Required-Script1 Script local1 Description for the Required-Script1 script
2.5 Required-Script2 Script local1 Description for the Required-Script2 script
2.5 Required-Script3 Script local1 Description for the Required-Script3 script
```
このコマンドは、ワイルドカード文字 (*) を使用して、Required-Script で始まるスクリプトを検索します。
## PARAMETERS
### -AllowPrerelease リリース
プレリリースとしてマークされた結果スクリプトにを含めます。
```yaml
Type: System.Management.Automation.SwitchParameter
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -AllVersions
この操作がすべてのスクリプトのバージョンを検索することを示します。
```yaml
Type: System.Management.Automation.SwitchParameter
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -Command
スクリプト内で検索するコマンドの配列を指定します。
コマンドは、関数またはワークフローにすることができます。
```yaml
Type: System.String[]
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -Credential
```yaml
Type: System.Management.Automation.PSCredential
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: True (ByPropertyName)
Accept wildcard characters: False
```
### -Filter
PackageManagement プロバイダー固有の検索構文に基づいてスクリプトを検索します。
```yaml
Type: System.String
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -IncludeDependencies
この操作で、 *Name* パラメーターに指定されたスクリプトに依存するすべてのスクリプトを取得することを示します。
```yaml
Type: System.Management.Automation.SwitchParameter
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -インクルード
取得するスクリプトの種類を指定します。
このパラメーターに指定できる値は、Function、Workflow です。
```yaml
Type: System.String[]
Parameter Sets: (All)
Aliases:
Accepted values: Function, Workflow
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -MaximumVersion
検索するスクリプトの最大バージョンまたは最新バージョンを指定します。
*MaximumVersion* および *RequiredVersion* パラメーターは相互に排他的です。両方のパラメーターを同じコマンドで使用することはできません。
```yaml
Type: System.String
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: True (ByPropertyName)
Accept wildcard characters: False
```
### -MinimumVersion
検索するスクリプトの最小バージョンを指定します。
*MinimumVersion* および *RequiredVersion* パラメーターは相互に排他的です。両方のパラメーターを同じコマンドで使用することはできません。
```yaml
Type: System.String
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: True (ByPropertyName)
Accept wildcard characters: False
```
### -Name
検索するスクリプトの名前の配列を指定します。
```yaml
Type: System.String[]
Parameter Sets: (All)
Aliases:
Required: False
Position: 0
Default value: None
Accept pipeline input: True (ByPropertyName)
Accept wildcard characters: True
```
### -プロキシ
インターネットリソースに直接接続するのではなく、要求のプロキシサーバーを指定します。
```yaml
Type: System.Uri
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: True (ByPropertyName)
Accept wildcard characters: False
```
### -ProxyCredential
**Proxy** パラメーターに指定したプロキシ サーバーを使用するアクセス許可を持つユーザー アカウントを指定します。
```yaml
Type: System.Management.Automation.PSCredential
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: True (ByPropertyName)
Accept wildcard characters: False
```
### -リポジトリ
Register-psrepository を実行して登録されているリポジトリのフレンドリ名を指定します。
```yaml
Type: System.String[]
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -RequiredVersion
検索するスクリプトの正確なバージョン番号を指定します。
```yaml
Type: System.String
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: True (ByPropertyName)
Accept wildcard characters: False
```
### -Tag
タグの配列を指定します。
```yaml
Type: System.String[]
Parameter Sets: (All)
Aliases:
Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### 共通パラメーター
このコマンドレットは、一般的なパラメーターをサポートしています。-Debug、-ErrorAction、-ErrorVariable、-InformationAction、-InformationVariable、-OutVariable、-OutBuffer、-PipelineVariable、-Verbose、-WarningAction、-WarningVariable です。 詳細については、「[about_CommonParameters](https://go.microsoft.com/fwlink/?LinkID=113216)」を参照してください。
## 入力
### System.String[]
### System.String
### System.Uri
### システム.... PSCredential
## 出力
### できる psrepositoryiteminfo
## 注
> [!IMPORTANT]
> 2020年4月の時点で、PowerShell ギャラリーでは、トランスポート層セキュリティ (TLS) バージョン1.0 と1.1 がサポートされなくなりました。 TLS 1.2 以降を使用していない場合は、PowerShell ギャラリーにアクセスしようとするとエラーが発生します。 次のコマンドを使用して、TLS 1.2 を使用していることを確認します。
>
> `[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12`
>
> 詳細については、PowerShell ブログの [お知らせ](https://devblogs.microsoft.com/powershell/powershell-gallery-tls-support/) を参照してください。
## 関連リンク
[Install-Script](Install-Script.md)
[Publish-Script](Publish-Script.md)
[Save-Script](Save-Script.md)
[Uninstall-Script](Uninstall-Script.md)
[Update-Script](Update-Script.md)
| 29.753165 | 286 | 0.641566 | yue_Hant | 0.56539 |