---
title: How to use named values in Azure API Management policies
description: Learn how to use named values in Azure API Management policies.
services: api-management
documentationcenter: ''
author: vladvino
manager: erikre
editor: ''
ms.service: api-management
ms.workload: mobile
ms.tgt_pltfrm: na
ms.topic: article
ms.date: 01/08/2020
ms.author: apimpm
---
# How to use named values in Azure API Management policies
API Management policies are a powerful capability of the system that allow the publisher to change the behavior of the API through configuration. Policies are a collection of statements that are executed sequentially on the request or response of an API. Policy statements can be constructed using literal text values, policy expressions, and named values.
Each API Management service instance has a collection of key/value pairs, called named values, that is global to the service instance. There is no imposed limit on the number of items in the collection. Named values can be used to manage constant string values across all API configurations and policies. Each named value may have the following attributes:
| Attribute | Type | Description |
| -------------- | --------------- | -------------------------------------------------------------------------------------------------------------------------------------- |
| `Display name` | string | Used for referencing the named value in policies. A string of one to 256 characters. Only letters, numbers, dot, and dash are allowed. |
| `Value` | string | Actual value. Must not be empty or consist only of whitespace. Maximum of 4096 characters long. |
| `Secret` | boolean | Determines whether the value is a secret and should be encrypted or not. |
| `Tags` | array of string | Used to filter the named value list. Up to 32 tags. |
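The constraints in this table can be expressed as a small validation routine. The sketch below is illustrative only (it is not part of API Management); the rules are taken directly from the table: a name of 1 to 256 letters, numbers, dots, or dashes, a non-empty value of at most 4,096 characters, and at most 32 tags.

```python
import re

# Name: 1-256 characters; only letters, numbers, dot, and dash are allowed.
NAME_PATTERN = re.compile(r"^[A-Za-z0-9.-]{1,256}$")

def validate_named_value(name, value, tags=()):
    """Check a named value against the documented constraints; return a list of problems."""
    problems = []
    if not NAME_PATTERN.fullmatch(name or ""):
        problems.append("name must be 1-256 letters, numbers, dots, or dashes")
    if not value or not value.strip():
        problems.append("value must not be empty or whitespace-only")
    elif len(value) > 4096:
        problems.append("value must be at most 4096 characters")
    if len(tags) > 32:
        problems.append("at most 32 tags are allowed")
    return problems

if __name__ == "__main__":
    print(validate_named_value("ContosoHeader", "TrackingId"))  # []
    print(validate_named_value("bad name!", "   "))             # two problems reported
```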

Named values can contain literal strings and [policy expressions](/azure/api-management/api-management-policy-expressions). For example, the value of `Expression` is a policy expression that returns a string containing the current date and time. The named value `Credential` is marked as a secret, so its value is not displayed by default.
| Name | Value | Secret | Tags |
| ---------- | -------------------------- | ------ | ------------- |
| Value | 42 | False | vital-numbers |
| Credential | •••••••••••••••••••••• | True | security |
| Expression | @(DateTime.Now.ToString()) | False | |
> [!NOTE]
> Instead of named values stored within an API Management service, you can use values stored in the [Azure Key Vault](https://azure.microsoft.com/services/key-vault/) service as demonstrated by this [example](https://github.com/Azure/api-management-policy-snippets/blob/master/examples/Look%20up%20Key%20Vault%20secret%20using%20Managed%20Service%20Identity.policy.xml).
## To add and edit a named value

1. Select **APIs** from under **API MANAGEMENT**.
2. Select **Named values**.
3. Press **+Add**.
Name and Value are required. If the value is a secret, check the _This is a secret_ checkbox. Enter one or more optional tags to help with organizing your named values.
4. Click **Create**.
Once the named value is created, you can edit it by clicking on it. If you change the named value name, any policies that reference that named value are automatically updated to use the new name.
For information on editing a named value using the REST API, see [Edit a named value using the REST API](/rest/api/apimanagement/2019-12-01/property?patch).
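The PATCH call behind that reference follows the standard ARM request shape. The sketch below only builds the request object and never sends it; the subscription, resource group, service, and bearer token values are placeholders, and the `namedValues` resource segment plus the `2019-12-01` api-version are assumptions you should confirm against the linked reference before use.

```python
import json
import urllib.request

def build_named_value_patch(subscription, resource_group, service, named_value_id,
                            display_name=None, value=None, token="<bearer-token>"):
    """Build (but do not send) an ARM PATCH request for a named value.

    The path below follows the generic ARM pattern; check the linked REST
    reference for the exact resource segment and api-version before use.
    """
    url = (
        "https://management.azure.com"
        f"/subscriptions/{subscription}/resourceGroups/{resource_group}"
        "/providers/Microsoft.ApiManagement"
        f"/service/{service}/namedValues/{named_value_id}"
        "?api-version=2019-12-01"
    )
    properties = {}
    if display_name is not None:
        properties["displayName"] = display_name
    if value is not None:
        properties["value"] = value
    body = json.dumps({"properties": properties}).encode()
    req = urllib.request.Request(url, data=body, method="PATCH")
    req.add_header("Authorization", f"Bearer {token}")
    req.add_header("Content-Type", "application/json")
    return req

if __name__ == "__main__":
    req = build_named_value_patch("sub-id", "rg", "my-apim", "ContosoHeader",
                                  value="Tracking")
    print(req.get_method(), req.full_url)
```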
## To delete a named value
To delete a named value, click **Delete** beside the named value you want to delete.
> [!IMPORTANT]
> If the named value is referenced by any policies, you will be unable to successfully delete it until you remove the named value from all policies that use it.
For information on deleting a named value using the REST API, see [Delete a named value using the REST API](/rest/api/apimanagement/2019-12-01/property/delete).
## To search and filter named values
The **Named values** tab includes searching and filtering capabilities to help you manage your named values. To filter the named values list by name, enter a search term in the **Search property** textbox. To display all named values, clear the **Search property** textbox and press enter.
To filter the list by tag, enter one or more tags into the **Filter by tags** textbox. To display all named values, clear the **Filter by tags** textbox and press enter.
## To use a named value
To use a named value in a policy, place its name inside a double pair of braces like `{{ContosoHeader}}`, as shown in the following example:
```xml
<set-header name="{{ContosoHeader}}" exists-action="override">
<value>{{ContosoHeaderValue}}</value>
</set-header>
```
In this example, `ContosoHeader` is used as the name of a header in a `set-header` policy, and `ContosoHeaderValue` is used as the value of that header. When this policy is evaluated during a request or response to the API Management gateway, `{{ContosoHeader}}` and `{{ContosoHeaderValue}}` are replaced with their respective values.
Named values can be used as complete attribute or element values as shown in the previous example, but they can also be inserted into or combined with part of a literal text expression as shown in the following example: `<set-header name = "CustomHeader{{ContosoHeader}}" ...>`
Named values can also contain policy expressions. In the following example, the `ExpressionProperty` is used.
```xml
<set-header name="CustomHeader" exists-action="override">
<value>{{ExpressionProperty}}</value>
</set-header>
```
When this policy is evaluated, `{{ExpressionProperty}}` is replaced with its value: `@(DateTime.Now.ToString())`. Since the value is a policy expression, the expression is evaluated and the policy proceeds with its execution.
You can test this out in the developer portal by calling an operation that has a policy with named values in scope. In the following example, an operation is called with the two previous example `set-header` policies with named values. Note that the response contains two custom headers that were configured using policies with named values.
![Developer portal][api-management-send-results]
If you look at the [API Inspector trace](api-management-howto-api-inspector.md) for a call that includes the two previous sample policies with named values, you can see the two `set-header` policies with the named values inserted as well as the policy expression evaluation for the named value that contained the policy expression.
![API Inspector trace][api-management-api-inspector-trace]
While named values can contain policy expressions, they can't contain other named values. If text containing a named value reference is used for a value, such as `Text: {{MyProperty}}`, that reference won't be resolved and replaced.
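As an illustration only (this is a model of the documented behavior, not the gateway's implementation), the single-pass, non-recursive substitution can be sketched like this:

```python
import re

REF = re.compile(r"\{\{([\w.-]+)\}\}")

def substitute(text, named_values):
    """Single, non-recursive substitution pass: each {{name}} is replaced once;
    unknown names, and references inside substituted values, stay as-is."""
    return REF.sub(lambda m: named_values.get(m.group(1), m.group(0)), text)

if __name__ == "__main__":
    values = {
        "ContosoHeader": "TrackingId",
        "MyProperty": "42",
        "Nested": "Text: {{MyProperty}}",  # deliberately contains a reference
    }
    print(substitute('name="CustomHeader{{ContosoHeader}}"', values))  # name="CustomHeaderTrackingId"
    print(substitute("{{Nested}}", values))  # Text: {{MyProperty}}
```

The last call shows the rule from the paragraph above: a reference that appears inside a substituted value is left unresolved.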
## Next steps
- Learn more about working with policies
- [Policies in API Management](api-management-howto-policies.md)
- [Policy reference](/azure/api-management/api-management-policies)
- [Policy expressions](/azure/api-management/api-management-policy-expressions)
[api-management-send-results]: ./media/api-management-howto-properties/api-management-send-results.png
[api-management-properties-filter]: ./media/api-management-howto-properties/api-management-properties-filter.png
[api-management-api-inspector-trace]: ./media/api-management-howto-properties/api-management-api-inspector-trace.png
# comment-when-done
This is a GitHub App built with [Probot](https://github.com/probot/probot).
It makes comments on issues when they're _closed_ and pull requests when they're _merged_. Pull requests that are closed without being merged will not trigger the bot to make a comment.
## Usage
The comment contents should be placed between an opening sentinel- `{{whendone}}` and closing sentinel- `{{/whendone}}`.
* The opening `{{whendone}}` must be at the beginning of a line
* Single line commands like this will work
```
{{whendone}}Tell Sam this bug is fixed{{/whendone}}
```
* Multi-line commands like this will also work
```
{{whendone}}
Set up the new alerts.
Let Pat know they should adjust the dashboards.
{{/whendone}}
```
* You can "escape" slash commands (e.g. from [the Probot reminder app](https://github.com/probot/reminders)) and @mentions by wrapping them in backticks
```
{{whendone}}
`/remind` me to set up the new alerts in two weeks.
Let `@this-is-pat` know they should adjust the dashboards.
`/remind` @another-person to stretch 20 minutes.
{{/whendone}}
```
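The matching rule above (the opening `{{whendone}}` must begin a line, and bodies may span multiple lines) can be illustrated with a small regex sketch. The app itself is a JavaScript Probot app; this snippet is only a model of the rule, not the app's code.

```python
import re

# Opening sentinel must be at the start of a line; bodies may span lines.
SENTINEL = re.compile(
    r"^\{\{whendone\}\}(.*?)\{\{/whendone\}\}",
    re.MULTILINE | re.DOTALL,
)

def extract_comments(issue_body):
    """Return the comment bodies found between {{whendone}} ... {{/whendone}}."""
    return [match.strip() for match in SENTINEL.findall(issue_body)]

if __name__ == "__main__":
    body = """Fix the flaky test.
{{whendone}}
Set up the new alerts.
Let Pat know they should adjust the dashboards.
{{/whendone}}
"""
    print(extract_comments(body))
```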
## Running your own instance as a [Google Cloud Function](https://cloud.google.com/functions/)
You can run this anywhere or any way you want, but it is set up to make a GCP deployment (hopefully) easy using the [serverless framework](https://serverless.com/).
You will need to have `node` and `npm` installed.
```sh
git clone [email protected]:C-Saunders/comment-when-done.git
cd comment-when-done
npm install
npm run start
```
`npm run start` will facilitate the authorization flow that results from `probot run` and guide you to set up a GitHub app. It will create and populate a local .env file containing the secrets you'll need to be available as environment variables when the bot runs.
The secrets in the .env file will be read by `serverless` when you deploy.
Follow the instructions [provided by serverless](https://serverless.com/framework/docs/providers/google/guide/quick-start#2-set-up-the-credentials) to set up your GCP credentials. Save your credentials JSON file in `~/.gcloud/comment-when-done.json` (or update this path in serverless.yml to meet your needs).
```
npm install -g serverless
serverless deploy
```
## Contributing
If you have suggestions for how this application could be improved, want to report a bug, have a question, etc., please open an issue! We love all and any contributions.
For more, check out the [Contributing Guide](CONTRIBUTING.md).
## License
[ISC](LICENSE) © 2018 Charlie Saunders <[email protected]>
# The /run/script endpoint
This endpoint executes Byzer-lang statements.
## Parameters
| Parameter | Description | Example |
|----|--------------------------------------------------------------------------------------------------------------------------|-----|
| sql | The Byzer-lang script to execute | |
| owner | The tenant issuing the request | |
| jobType | Job type: script/stream/sql; defaults to script | |
| executeMode | Use query to execute Byzer-lang, or analyze to parse it. Many plugins provide their own executeMode values, so users can reach plugin features through the HTTP API | |
| jobName | Job name; usually a UUID or script ID. Ideally include some identifying information so the job is easier to find | |
| timeout | Job execution timeout | in milliseconds |
| silence | Whether the last SQL statement is executed | defaults to false |
| sessionPerUser | Create a session per user | defaults to true |
| sessionPerRequest | Create a session per request | defaults to false; for scheduled requests, be sure to set this to true |
| async | Whether the request executes asynchronously | defaults to false |
| callback | Callback URL; required for asynchronous execution | |
| skipInclude | Disable the include syntax | defaults to false |
| skipAuth | Disable permission checks | defaults to true |
| skipGrammarValidate | Skip syntax validation | defaults to true |
| includeSchema | Whether the result includes separate schema information | defaults to false |
| fetchType | take/collect; take is much faster when previewing table data | defaults to collect |
| defaultPathPrefix | Base directory for all users' home directories | |
| `context.__default__include_fetch_url__` | URL from which the Byzer-lang engine fetches included scripts | |
| `context.__default__fileserver_url__` | File download server URL; usually the Notebook address by default | |
| `context.__default__fileserver_upload_url__` | File upload server URL; usually the Notebook address by default | |
| `context.__auth_client__` | Authentication client class | defaults to streaming.dsl.auth.meta.client.MLSQLConsoleClient |
| `context.__auth_server_url__` | Address of the data access authorization server | |
| `context.__auth_secret__` | Secret used when the Byzer-lang engine calls back to the requesting server. For example, when Notebook calls the engine and passes this parameter, the engine must send it back when calling Notebook | |
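As a sketch of how a client might call this endpoint using the parameters above: the parameter names come from the table, while the engine address and the form-encoded POST convention are assumptions to check against your deployment. The request is only constructed here, not sent.

```python
import urllib.parse
import urllib.request

ENGINE_URL = "http://localhost:9003/run/script"  # assumed engine host/port

def build_run_script_request(sql, owner, job_name, timeout_ms=30000,
                             async_mode=False, callback=None):
    """Build (but do not send) a /run/script request from the documented parameters."""
    params = {
        "sql": sql,
        "owner": owner,
        "jobName": job_name,
        "timeout": str(timeout_ms),
        "async": "true" if async_mode else "false",
        "sessionPerRequest": "true",  # recommended above for scheduled requests
    }
    if async_mode and callback:
        params["callback"] = callback
    data = urllib.parse.urlencode(params).encode()
    return urllib.request.Request(ENGINE_URL, data=data, method="POST")

if __name__ == "__main__":
    req = build_run_script_request("load json.`/tmp/a.json` as t1;", "admin", "job-001")
    print(req.full_url)
    print(req.data.decode())
```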
### TODO:
- Support starting in debug mode? Show different logs?
### HOWTO (implementation details)
#### Deployment
##### Syncing hosts
> Uses the third-party `python-hosts` library.
##### Passwordless SSH from the management node to the other nodes
>1. Send the public key to the agent over HTTP; the agent installs the key.
>
>2. `ssh-keygen -R 172.16.32.128`
removes the matching record.
>
>3. `ssh-keyscan -t ecdsa 172.16.32.128 >> known_hosts`
fetches the target host's public key and appends it to `known_hosts`.
系统安装时默认创建一个超级用户:
```
user: admin
password: storage_stack
```
首次登录超级用户建议立即修改密码。
其他用户分管理员用户和普通用户(非敏感只读用户)和无任何权限用户。
通过web创建新用户,不可以是超级用户,可以是管理员。
超级用户只允许通过命令行创建。
只有超级用户才可以指配谁为管理员,
管理员可以修改普通用户,可以新建,删除,修改用户分组,
但是管理员不可以把普通用户提升为管理员
只有超级用户才可以指定谁为管理员。
普通用户通过加入不同组可以获得不同权限,或添加单项权限。
普通用户可以无任何权限,即只可以看出自己的个人中心。
以编程方式创建权限:
动态地创建用户权限, 通过post_migrate信号
https://docs.djangoproject.com/zh-hans/3.0/ref/signals/#django.db.models.signals.post_migrate
- 是否可以删除超级用户?
>可以,(待修改)
### DONE
- webServer, 中间件, 出现未捕获的错误,写日志,返回http
<div align="center">
# Fanyi
A 🇨🇳 and 🇺🇸 translate tool in your command line.
[](https://npmjs.org/package/fanyi) [](https://travis-ci.org/afc163/fanyi) [](https://npmjs.org/package/fanyi)
[](https://david-dm.org/afc163/fanyi) [](https://david-dm.org/afc163/fanyi?type=dev) [](https://david-dm.org/afc163/fanyi?type=optional)

</div>
## Install
```bash
$ npm install fanyi -g
```
## Usage
```bash
$ fanyi word
```
For short:
```bash
$ fy word
```
Translation data is fetched from [iciba.com](http://iciba.com) and [fanyi.youdao.com](http://fanyi.youdao.com), and only translation between Chinese and English is supported.
In a Mac/Linux shell, words will be pronounced by the `say` command.
Translate one word.
```bash
$ fanyi love
```
```js
love [ lʌv ] ~ fanyi.youdao.com
- n. 恋爱;亲爱的;酷爱;喜爱的事物;爱情,爱意;疼爱;热爱;爱人,所爱之物
- v. 爱,热爱;爱戴;赞美,称赞;喜爱;喜好;喜欢;爱慕
- n. (英)洛夫(人名)
1. Love
爱,爱情,恋爱
2. Endless Love
无尽的爱,不了情,蓝色生死恋
3. puppy love
早恋,青春期恋爱,初恋
love [ lʌv ] [ lʌv ] ~ iciba.com
- vt.&vi. 爱,热爱;爱戴;喜欢;赞美,称赞;
- vt. 喜爱;喜好;喜欢;爱慕;
- n. 爱情,爱意;疼爱;热爱;爱人,所爱之物;
1. They happily reflect the desire for a fusional love that inspired the legendary LOVE bracelet Cartier.
快乐地反映出为富有传奇色彩的卡地亚LOVE手镯所赋予的水乳交融之爱恋情愫。
2. Love is the radical of lovely, loveliness, and loving.
Love是lovely,loveliness及loving的词根。
3. She rhymes"love"with"dove".
她将"love"与"dove"两字押韵。
4. In sports, love means nil.
体育中,love的意思是零。
5. Ludde Omholt with his son, Love, in S?derma a bohemian and culturally rich district in Stockholm.
LuddeOmholt和他的儿子Love在南城——斯德哥尔摩市的一个充满波西米亚风情的文化富饶区散步。
```
More words.
```bash
$ fanyi make love
```
Chinese is supported too, even full sentences.
```bash
$ fanyi 和谐
```
```bash
$ fanyi 子非鱼焉知鱼之乐
```
## Configuration
A configuration file can be placed at `~/.fanyirc` in the user's home directory.
Use subcommand `fanyi config [options]`
Example:
```bash
# Turn off the pronunciation
$ fanyi config --no-say
# or
$ fanyi config -S
# Disable the dictionaryapi
$ fanyi config --no-dictionaryapi
# or
$ fanyi config -D
```
A sample ~/.fanyirc file:
```json
{
"iciba": true,
"youdao": true,
"dictionaryapi": false,
"say": false,
"color": true
}
```
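fanyi itself is a Node.js tool, so the snippet below is only an illustration of how a per-user JSON config file can override built-in defaults; the default values here are assumptions modeled on the sample above, not the tool's actual defaults.

```python
import json
from pathlib import Path

# Assumed defaults for illustration; the real tool's defaults may differ.
DEFAULTS = {"iciba": True, "youdao": True, "dictionaryapi": True, "say": True, "color": True}

def load_config(path="~/.fanyirc"):
    """Merge the user's config file over the defaults; a missing file means defaults."""
    rc = Path(path).expanduser()
    user = {}
    if rc.exists():
        user = json.loads(rc.read_text())
    merged = dict(DEFAULTS)
    merged.update(user)
    return merged

if __name__ == "__main__":
    print(load_config("/nonexistent/.fanyirc"))  # falls back to the defaults
```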
## Error: spawn festival ENOENT
On Linux, try this workaround from [say.js](https://github.com/Marak/say.js#linux-notes):
```
sudo apt-get install festival festvox-kallpc16k
```
---
title: GitHub Pages Maintenance
keywords: GitHub Pages, jekyll, github, markdown
last_updated: Dec 13, 2018
tags: [getting_started, jekyll, github, markdown, github_pages]
summary: "Learning how to setup and start a blog hosted on GitHub Pages"
sidebar: git_sidebar
permalink: GithubPagesMaint.html
folder: git
---
So after all that was done, there is this simple little issue of using it. I started using Markdown when I started the project, and found it quite useful and easy to pickup. I was at a job where I had to use GIT a lot (not GitHub, but the actual software that runs the version control of the site), and so I started getting used to that too. So once I got all the details worked out on the setup it was easy, right? Well....
Honestly, the issues I have had are very minor and easy to resolve so far. They are known restrictions with GitHub Pages as detailed in the documentation and now I know it was because I just wasn't following them. But that doesn't mean it didn't drive me nutty working at it before I figured out I was the stupid one here (well I knew it before hand, I just didn't know how I was being stupid). Perhaps in your wisdom, dear reader, you will learn a little nugget here so you are not going as crazy as I did.
My first issue was linking. More specifically, my home page (the README.md if you didn't already know by now) was being used as an index of indexes. Using Markdown linking, I was able to make it work in VSCode just fine, and so I knew it was correctly formatted, yet when I pushed the updates up to the website...poof!... no pages.
Grrr... Time to go to Github and see what didn't make it. Wait, it is there. Oh, yeah. \<Refresh web page> There it...isn't? What? I mean it is there, but it isn't being prettified. It is pure text and that is it. Like the CSS isn't being applied. I don't get it. It worked in VSCode, it worked with the Markdown Syntax, it worked with everything, why isn't it working in GitHub?
My first thought is "Oh yeah, isn't one of those restrictions I spoke of earlier, one where you can't have subfolders?" So I quickly moved all files to the same level. It is messy, and I need to rename the README.md files so they don't overwrite each other, but that was easy enough. Now it should work!
And in comes my second issue - or is it an extension of the first. Either way, it still doesn't work! The home page works fine, but any secondary page, nope! A quick google search showed me that it isn't going to be THAT easy. I actually have to LEARN Jekyll...But that will be for another post.
[Things that make you go Hmm....](https://binged.it/2Ae4ht6) | 109.75 | 507 | 0.759681 | eng_Latn | 0.999855 |
934a00c2e5e4b34e1ce78358582ddcb3cb69ed07 | 5,101 | md | Markdown | docs/md/zh/bm-local-search.md | Wilfredo-Ho/vue-baidu-map | d38acf51ee4d4ca2e38fb58a09b059185f4d298a | [
"MIT"
] | null | null | null | docs/md/zh/bm-local-search.md | Wilfredo-Ho/vue-baidu-map | d38acf51ee4d4ca2e38fb58a09b059185f4d298a | [
"MIT"
] | 1 | 2022-03-02T10:10:18.000Z | 2022-03-02T10:10:18.000Z | docs/md/zh/bm-local-search.md | gyjsuccess/vue-baidu-map | 106c69a7cae920822ca4156cd747891383888bb2 | [
"MIT"
] | 1 | 2020-12-13T14:23:35.000Z | 2020-12-13T14:23:35.000Z | <template lang="md">
# 地区检索
`BmLocalSearch`
## 属性
|属性名|类型 |默认值|描述|
|------|:---:|:----:|----|
|location|String, Point, None||location表示检索区域,其类型可为空、坐标点或城市名称的字符串。当参数为空时,检索位置由当前地图中心点确定,且搜索结果的标注将自动加载到地图上,并支持调整地图视野层级;当参数为坐标时,检索位置由该点所在位置确定;当参数为城市名称时,检索会在该城市内进行。|
|bounds|Bounds||限定检索的矩形区域。如果区域超出当前 location,将不会产生检索结果。当与 nearby 属性同时,以 nearby 的查询结果为准。|
|nearby|{center: Point, radius: Number}||限定检索的圆形区域,参数为由圆心和半径组成的对象。如果区域超出当前 location,将不会产生检索结果。当与 bounds 属性同时,以 nearby 的查询结果为准。|
|keyword|String, Array||搜索关键字。当keyword为数组时将同时执行多关键字的查询,最多支持10个关键字。|
|forceLocal|Boolean||表示是否将搜索范围约束在当前城市|
|customData|CustomData||表示检索lbs云服务的数据|
|panel|Boolean|true|是否选展现检索结果面板。|
|pageCapacity|Number||设置每页容量,取值范围:1 - 100,对于多关键字检索,每页容量表示每个关键字返回结果的数量(例如当用2个关键字检索时,实际结果数量范围为:2 - 200)。此值只对下一次检索有效|
|autoViewport|Boolean||检索结束后是否自动调整地图视野。|
|selectFirstResult|Boolean||是否选择第一个检索结果。|
## 事件
|事件名|参数|描述|
|------|:--:|----|
|markersset|{pois: Array}|标注添加完成后的回调函数。|
|infohtmlset|{poi: LocalResultPoi}|标注气泡内容创建后的回调函数。|
|resultshtmlset|{container: HTMLElement}|结果列表添加完成后的回调函数。|
|searchcomplete|{results: [LocalResult]}|检索完成后的回调函数。如果是多关键字检索,回调函数参数返回一个LocalResult的数组,数组中的结果顺序和检索中多关键字数组中顺序一致|
## 示例
### 对一个地图实例进行地区检索
#### 代码
```html
<template>
<label>关键词:<input v-model="keyword"></label>
<label>地区:<input v-model="location"></label>
<baidu-map>
<bm-view class="map"></bm-view>
<bm-local-search :keyword="keyword" :auto-viewport="true" :location="location"></bm-local-search>
</baidu-map>
</template>
<script>
export default {
data () {
return {
location: '北京',
keyword: '百度'
}
}
}
</script>
</style>
```
#### 预览
<doc-preview>
<baidu-map>
<bm-view class="map">
</bm-view>
<div class="toolbar">
<table>
<thead>
<tr>
<th>地区</th>
<th>关键词</th>
<tr>
</thead>
<tbody>
<tr>
<td><text-field v-model="location"></text-field></td>
<td><text-field v-model="keyword"></text-field></td>
</tr>
</tbody>
</table>
</div>
<bm-local-search :keyword="keyword" :auto-viewport="true" :location="location"></bm-local-search>
</doc-preview>
</baidu-map>
### 在一个矩形区域内进行当前地区检索
#### 代码
```html
<template>
<baidu-map>
<bm-view class="map"></bm-view>
<bm-local-search keyword="银行" :bounds="bounds" :auto-viewport="true" :panel="false"></bm-local-search>
<bm-polygon :path="polygonPath"></bm-polygon>
</baidu-map>
</template>
<script>
export default {
data () {
return {
pStart: {
lng: 116.294625,
lat: 39.961627
},
pEnd: {
lng: 116.357474,
lat: 39.988609
}
}
},
computed: {
bounds () {
const {pStart, pEnd} = this
return {
sw: {lng: pStart.lng, lat: pStart.lat},
ne:{lng: pEnd.lng, lat: pEnd.lat}
}
},
polygonPath () {
const {pStart, pEnd} = this
return [
{lng: pStart.lng, lat: pStart.lat},
{lng: pEnd.lng, lat: pStart.lat},
{lng: pEnd.lng, lat: pEnd.lat},
{lng: pStart.lng, lat: pEnd.lat}
]
}
}
}
</script>
```
#### 预览
<doc-preview>
<baidu-map :center="{lng: 116.274625, lat: 39.961627}" :zoom="11">
<bm-view class="map"></bm-view>
<bm-local-search keyword="银行" :bounds="bounds" :auto-viewport="true" :panel="false"></bm-local-search>
<bm-polygon :path="polygonPath"></bm-polygon>
</baidu-map>
</doc-preview>
### 在一个圆形区域内进行当前地区检索
#### 代码
```html
<template>
<baidu-map>
<bm-view class="map"></bm-view>
<bm-local-search keyword="餐馆" :nearby="nearby" :auto-viewport="true" :panel="false"></bm-local-search>
<bm-circle :center="nearby.center" :radius="nearby.radius"></bm-circle>
</baidu-map>
</template>
<script>
export default {
data () {
return {
nearby: {
center: {
lng: 116.404,
lat: 39.915
},
radius: 1000
}
}
}
}
</script>
```
#### 预览
<doc-preview>
<baidu-map :center="{lng: 116.404, lat: 39.915}" :zoom="15">
<bm-view class="map"></bm-view>
<bm-local-search keyword="餐馆" :nearby="nearby" :auto-viewport="true" :panel="false"></bm-local-search>
<bm-circle :center="nearby.center" :radius="nearby.radius"></bm-circle>
</baidu-map>
</doc-preview>
</template>
<script>
export default {
data () {
return {
location: '北京',
keyword: '百度',
pStart: {
lng: 116.294625,
lat: 39.961627
},
pEnd: {
lng: 116.357474,
lat: 39.988609
},
nearby: {
center: {
lng: 116.404,
lat: 39.915
},
radius: 1000
}
}
},
computed: {
bounds () {
const {pStart, pEnd} = this
return {
sw: {lng: pStart.lng, lat: pStart.lat},
ne:{lng: pEnd.lng, lat: pEnd.lat}
}
},
polygonPath () {
const {pStart, pEnd} = this
return [
{lng: pStart.lng, lat: pStart.lat},
{lng: pEnd.lng, lat: pStart.lat},
{lng: pEnd.lng, lat: pEnd.lat},
{lng: pStart.lng, lat: pEnd.lat}
]
}
}
}
</script>
934a8485e77d38ec66b1c1201b8c512b4a6c139c | 1,411 | md | Markdown | 2020/12/28/2020-12-28 11:20.md | zhzhzhy/WeiBoHot_history | 32ce4800e63f26384abb17d43e308452c537c902 | [
"MIT"
] | 3 | 2020-07-14T14:54:15.000Z | 2020-08-21T06:48:24.000Z | 2020/12/28/2020-12-28 11:20.md | zhzhzhy/WeiBoHot_history | 32ce4800e63f26384abb17d43e308452c537c902 | [
"MIT"
] | null | null | null | 2020/12/28/2020-12-28 11:20.md | zhzhzhy/WeiBoHot_history | 32ce4800e63f26384abb17d43e308452c537c902 | [
"MIT"
] | null | null | null | 2020年12月28日11时数据
Status: 200
1.警方通报王一博被报假警结果
微博热度:2985278
2.百香果女孩案凶手被改判死刑
微博热度:1705306
3.2021开年流金雨
微博热度:1664134
4.梁静茹唱分手快乐哭了
微博热度:1651835
5.蔡文静被薅头发
微博热度:1530702
6.1块钱包1年
微博热度:1069577
7.追光吧哥哥节目组道歉
微博热度:1029479
8.王子文离别戏哭了11小时
微博热度:834749
9.顺丰称确诊病例未接触任何快件
微博热度:831820
10.治疗新冠药品纳入国家医保目录
微博热度:713688
11.数百英国游客不愿隔离不告而别
微博热度:636532
12.姚晨回怼恶评
微博热度:619308
13.大连新增确诊中有一18个月男婴
微博热度:556505
14.王一博和舌尖上的中国适配度
微博热度:545035
15.韩国人造太阳打破世界纪录
微博热度:479839
16.陈小春GAI我们的歌冠军
微博热度:461023
17.辽宁新增6例确诊
微博热度:460810
18.药水哥击败刘洲成
微博热度:459991
19.安妮玫瑰 刘著
微博热度:458040
20.白酒股
微博热度:456976
21.一张毕业照激励小胖墩两个月减30斤
微博热度:456972
22.姚晨为杨笠发声
微博热度:455088
23.从戴口罩方式看出人格
微博热度:452206
24.饮料巨头打响冬季热饮战
微博热度:422005
25.杜兰特错失关键跳投
微博热度:414224
26.侍神令预告
微博热度:408533
27.西安地铁
微博热度:405570
28.杨笠
微博热度:374123
29.乱港分子林卓廷今早被拘捕
微博热度:372472
30.股市
微博热度:371361
31.秃鹫太冷了到牧民家求救
微博热度:371230
32.花小猪打车回应平台司机感染新冠
微博热度:366078
33.狗狗宠物医院上演大逃亡
微博热度:342911
34.库里投中2500个三分
微博热度:295530
35.张信哲太一用情
微博热度:288348
36.黄海设计送你一朵小红花海报
微博热度:280867
37.重庆的一楼有多高
微博热度:273700
38.阳光之下预告
微博热度:255989
39.顶楼
微博热度:234035
40.敦煌陷阱公厕涉事7人被提起公诉
微博热度:228955
41.我国网民近2.23亿是学生
微博热度:211748
42.孔雪儿熊梓淇沈月走秀
微博热度:198534
43.NBA
微博热度:196793
44.寒潮预警升级为橙色
微博热度:193859
45.GAI叫王源王甜甜
微博热度:193366
46.社区团购存在低价倾销挤压就业的问题
微博热度:165926
47.解剖辛选带货利润
微博热度:161934
48.阳光之下
微博热度:158179
49.在家隔离奶奶隔窗为志愿者打气
微博热度:142948
50.北京今冬最强寒潮正式开启
微博热度:132251
| 6.916667 | 20 | 0.782424 | yue_Hant | 0.287175 |
934ad9e327fcd39c169bd02105c91b2e6920d64b | 4,552 | md | Markdown | articles/cosmos-db/account-overview.md | KevinRohn/azure-docs.de-de | d1de2e14da9561d76f950faca887a3bab6f5fd2a | [
"CC-BY-4.0",
"MIT"
] | 1 | 2020-04-03T08:58:02.000Z | 2020-04-03T08:58:02.000Z | articles/cosmos-db/account-overview.md | KevinRohn/azure-docs.de-de | d1de2e14da9561d76f950faca887a3bab6f5fd2a | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/cosmos-db/account-overview.md | KevinRohn/azure-docs.de-de | d1de2e14da9561d76f950faca887a3bab6f5fd2a | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Verwenden von Azure Cosmos DB-Konten
description: In diesem Artikel erfahren Sie, wie Sie Azure Cosmos-Konten erstellen und verwenden. Außerdem wird die Hierarchie der Elemente in einem Azure Cosmos-Konto gezeigt.
author: markjbrown
ms.author: mjbrown
ms.service: cosmos-db
ms.subservice: cosmosdb-sql
ms.topic: conceptual
ms.date: 07/23/2019
ms.reviewer: sngun
ms.openlocfilehash: d29ed68b2945b2473b33aa88176e6f5d832a0fba
ms.sourcegitcommit: 2ec4b3d0bad7dc0071400c2a2264399e4fe34897
ms.translationtype: HT
ms.contentlocale: de-DE
ms.lasthandoff: 03/28/2020
ms.locfileid: "79225754"
---
---
layout: post
title: '"Enterprise Integration with Spring" certification'
date: '2015-01-13T15:44:00.000+01:00'
author: Monik
tags:
- Programming
- Java
- Spring
modified_time: '2016-05-27T11:20:17.799+02:00'
blogger_id: tag:blogger.com,1999:blog-5940427300271272994.post-1069023866261926173
blogger_orig_url: http://learningmonik.blogspot.com/2015/01/enterprise-integration-with-spring.html
commentIssueId: 26
type: certification
---
<div class="bg-info panel-body" markdown="1">
These are my notes I took before taking the
<b>Enterprise Integration with Spring Certification </b>exam.<br/><br/>I passed the exam, at the second attempt, after actually taking those notes. The first attempt came after 1-2 weeks of preparation and was unnecessarily rushed (I knew it was too early; if you think it's too early, it's probably too early). The most difficult thing in preparing for this exam was keeping all those different new things separate in my head and not mixing them up. As soon as I knew where I was in the imaginary table of contents, everything was clear. What helped in organising the knowledge was reading
<a href="http://learningmonik.blogspot.de/2015/01/spring-integration-in-action.html"
target="_blank">this book</a>.<br/><br/>There is some overlap of this exam with the first Spring Core exam. There is also some stuff that logically would belong to Spring Integration but is actually not part of the exam - the amount of material is simply too big to squeeze everything. That is why it is essential to read the official study guide and study according to it. Or to my notes below (which follow the guide) :P
</div>
<h3>Table of contents</h3>
- TOC
{:toc max_level=2}
### Remoting
<ol>
<li style="font-weight: bold;">
The concepts involved with Spring Remoting on both server- and client-side
</li>
<ul>
<li>
Exporters on server side, bind a service to registry, or expose an endpoint; no code change to the service
</li>
<li>
<span>ProxyFactoryBeans on client side, which handle the communication and convert </span><span
style="font-weight: bold; ">exceptions</span><span>; you inject the instance of service which in in fact a dynamic proxy (polymorphism)</span>
</li>
</ul>
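<div markdown="1">
A minimal client-side wiring sketch (the `example.*` types and bean names are hypothetical): the proxy factory bean produces an object that implements the service interface, so the client class is injected with the plain interface and never sees the remoting infrastructure:

```xml
<bean id="transferClient" class="example.TransferClient">
    <!-- injected as example.TransferService, actually a dynamic proxy -->
    <property name="transferService" ref="transferService"/>
</bean>

<bean id="transferService" class="org.springframework.remoting.rmi.RmiProxyFactoryBean">
    <property name="serviceInterface" value="example.TransferService"/>
    <property name="serviceUrl" value="rmi://localhost:1099/transferService"/>
</bean>
```

Swapping the protocol (e.g. to HttpInvokerProxyFactoryBean) only changes this configuration, not the client code.
</div>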
<li style="font-weight: bold;">
The benefits of Spring Remoting over traditional remoting technologies
</li>
<ul>
<li>
Hide ‘plumbing’ code
</li>
<li>
<span>Support multiple protocols in a </span><span
style="font-weight: bold; ">consistent</span><span> way (normally it’s violation of concerns, business logic mixed with remoting infrastructure)</span>
</li>
<li>
Configure and expose services declaratively (configuration-based approach)
</li>
</ul>
<li style="font-weight: bold;">
The remoting protocols supported by Spring
</li>
<ul>
<li>
RMI - uses RMI registry, Java serialisation
</li>
<li>
<div>
HttpInvoker - HTTP based (POST), Java serialisation
</div>
</li>
<li>
<div>
Hessian / Burlap - HTTP based, binary/txt XML serialisation
</div>
</li>
<li>
Stateless EJB (not mentioned in the training)
</li>
</ul>
<table style="margin-left:2em; margin-top:1em;"> <!-- The remoting protocols supported by Spring -->
<tr>
<td><br/></td>
<td>
Server side
</td>
<td>
Client side
</td>
</tr>
<tr style="height: 0px;">
<td>
RMI
</td>
<td>
<div><span><bean class="</span><span
style=" font-weight: bold; ">RmiServiceExporter</span><span>"></span>
</div>
<div>
<span> <p:service-name value="myServiceNameInRegistry"/></span>
</div>
<div>
<span> <p:service-interface value="...TheService"/></span>
</div>
<div><span> <p:service ref="myService"/></span>
</div>
<div><span> <p:registry-port="1099"/></span>
</div>
<div><span></bean></span>
</div>
</td>
<td>
<div><span><bean id, class="</span><span
style=" font-weight: bold; ">RmiProxyFactoryBean</span><span>"></span>
</div>
<div><span> <p:service-interface value="myService"/></span>
</div>
<div>
<span> <p:service-url value="rmi://foo:1099/myServiceName</span><br/><span>InRegistry"/></span>
</div>
<div><span></bean></span>
</div>
</td>
</tr>
<tr style="height: 0px;">
<td>
<div><span>Http</span><br/><span>Invoker</span>
</div>
</td>
<td>
<div><span><bean </span><span
style=" font-weight: bold; ">name="/transfer" </span><span>class="</span><span
style=" font-weight: bold; ">HttpInvokerServiceExporter</span><span>"></span>
</div>
<div>
<span> <p:service-interface value="...TheService"/></span>
</div>
<div><span> <p:service ref="myService"/></span>
</div>
<div><span></bean></span>
</div>
<br/>
<div>
<span>+ DispatcherServlet or HttpRequestHandlerServlet with name ”</span><span
style=" font-weight: bold; ">transfer</span><span>”</span>
</div>
</td>
<td>
<div><span><bean id, class="</span><span
style=" font-weight: bold; ">HttpInvokerProxyFactoryBean</span><span>"></span>
</div>
<div><span> <p:service-interface value="myService"/></span>
</div>
<div>
<span> <p:service-url value="rmi://foo:8080/services/transfer"/></span>
</div>
<div><span></bean></span>
</div>
</td>
</tr>
<tr>
<td>
<div><span>Hessian / Burlap</span>
</div>
</td>
<td colspan="2">
<div>
<span>same as above, replace “HttpInvoker” with “Hessian” / “Burlap”</span>
</div>
</td>
</tr>
</table>
<li
style="font-weight: bold;">
<div ><span>How Spring Remoting-based RMI is less invasive than plain RMI </span>
</div>
    </li>
    <ul>
        <li>
            <div><span>the service stays a plain POJO: no java.rmi.Remote interface, no RemoteException declared on its methods, no stub generation; only the configuration (the exporter) changes</span></div>
        </li>
        <li>
            <div><span>on the client side the checked RemoteException is translated into Spring's unchecked RemoteAccessException hierarchy</span></div>
        </li>
    </ul>
<li
style="font-weight: bold;">
<div ><span>Spring HTTP Invoker: how client and server interact with each other </span>
</div>
    </li>
    <ul>
        <li>
            <div><span>the client-side proxy (HttpInvokerProxyFactoryBean) serializes the method call into a RemoteInvocation object and sends it as an HTTP POST to the service URL</span></div>
        </li>
        <li>
            <div><span>the server-side HttpInvokerServiceExporter deserializes the invocation, calls the target service and streams the serialized result back in the HTTP response</span></div>
        </li>
    </ul>
</ol>
### Web Services
<ol>
<li style="font-weight: bold;">
How do Web Services compare to Remoting and Messaging
</li>
<ul>
<li>
information exchange protocol and format is HTTP and XML (or JSON) => no firewalls in between
</li>
<ul>
<li>
two ways of using the HTTP to transfer XML
</li>
<ul>
<li style="list-style-type: circle; ">
SOAP/POX and WSDL/XSD - contract-first, they use only POST and/or GET method for all the operations
</li>
<li style="list-style-type: circle; ">
REST
</li>
</ul>
</ul>
<li>
            loose coupling – we define a document-oriented contract between service consumers and providers
</li>
<li>
            interoperability – XML payload (is understood by all major platforms like Java, .NET, C++, Ruby, PHP, Perl, ...)
</li>
</ul>
<li style="font-weight: bold;">
The approach to building web services that Spring-WS supports
</li>
<ul>
<li>
is contract-first approach (start with XSD / WSDL)
</li>
<li>
POX / SOAP + WSDL
</li>
<li>
WebServiceClient - support for creating e.g. SOAP message
</li>
</ul>
<table style="margin-left:2em; margin-top:1em;"> <!-- The approach to building web services that Spring-WS supports -->
<tbody>
<tr style="height: 27px;">
<td><br/></td>
<td colspan="2">
<div><span>SOAP / POX (Spring WS)</span>
</div>
</td>
</tr>
<tr style="height: 27px;">
<td>
<div><span>web.xml</span>
</div>
</td>
<td colspan="2">
<div><span><servlet></span>
</div>
<div><span> <servlet-name></span><span
style="font-weight: bold; ">si-ws-gateway</span><span></servlet-name></span>
</div>
<div><span> <servlet-class></span>
</div>
<div><span> MessageDispatcherServlet</span>
</div>
<div><span> </servlet-class></span>
</div>
<div><span> <init-param></span>
</div>
<div>
<span> <param-name>contextConfigLocation</param-name> </span>
</div>
<div><span> <param-value>si-ws-config.xml</param-value></span>
</div>
<div><span> </init-param></span>
</div>
<div><span> <load-on-startup>1</load-on-startup></span>
</div>
<div><span></servlet></span>
</div>
<br/>
<div><span><servlet-mapping></span>
</div>
<div><span> <servlet-name></span><span
style="font-weight: bold; ">si-ws-gateway</span><span></servlet-name></span>
</div>
<div><span> <url-pattern>/quoteservice</url-pattern></span>
</div>
<div><span></servlet-mapping></span>
</div>
<br/>
<div>
<span>* also add contextConfigLocation as context-param and ContextLoaderListener for app context</span>
</div>
</td>
</tr>
<tr style="height: 28px;">
<td>
<div><span>web infrastr. config</span>
</div>
</td>
<td colspan="2">
<div><span><bean class="....UriEndpointMapping"></span>
</div>
<div><span> <p:name="defaultEndpoint" ref="</span><span
style="font-weight: bold; ">ws-inbound-gateway</span><span>"/></span>
</div>
<div><span></bean></span>
</div>
</td>
</tr>
<tr style="height: 28px;">
<td>
<div><span>app config</span>
</div>
</td>
<td colspan="2">
<div><span><context:component-scan base-package=”transfers.ws”/></span>
</div>
<div><span><ws:annotation-driven/></span>
</div>
</td>
</tr>
<tr style="height: 28px;">
<td>
<div><span>Endpoint implementation</span>
</div>
</td>
<td colspan="2">
<div><span>@Endpoint</span>
</div>
<div><span>public class TransferServiceEndpoint {</span>
</div>
<div><span> ...</span>
</div>
<div><span> @PayloadRoot(localPart=”transferRequest”, namespace=”</span><a
href="http://mybank.com/schemas/tr"
style="color: #009eb8; display: inline; text-decoration: none; transition: color 0.3s;"><span
style="color: #1155cc;text-decoration: underline; ">http://mybank.com/schemas/tr</span></a><span>”</span>
</div>
<div>
<span> public @ResponsePayload TransferResponse newTransfer(@RequestPayload TransferRequest request){</span>
</div>
<div><span> ...</span>
</div>
<div><span> }</span>
</div>
<div><span>}</span>
</div>
</td>
</tr>
</tbody>
</table>
<ul style=" padding: 0px 0px 0px 2em;">
<ul>
<li >
<div style=" ">
<span><span>@PayloadRoot </span><span>defines the resource path, </span><span>@ResponsePayload </span><span>and </span><span>@RequestPayload </span><span>are for XML mapping</span></span>
</div>
</li>
</ul>
</ul>
<div markdown="1">
```xml
<int-ws:inbound-gateway id="ws-inbound-gateway"
request-channel="ws-requests"
extract-payload="false"
[marshaller/unmarshaller=”jaxb2”]/>
```
</div>
<li style="font-weight: bold;">
<div ><span>The Object-to-XML frameworks supported by Spring-OXM (or Spring 3.0) </span>
</div>
</li>
<ul>
<li >
<div ><span>JAXB1/2, Castor, XMLBeans, JiBX</span>
</div>
</li>
<li >
<div ><span>XStream (not mentioned in training)</span>
</div>
</li>
<li >
<div ><span>also XPath argument binding</span>
</div>
</li>
</ul>
<div markdown="1">
```xml
<oxm:jaxb2-marshaller
id=”marshaller”
contextPath=”reward.ws.types:someotherpackage”/>
```
, or just simply
```xml
<ws:annotation-driven/> <!-- registers all infrastructure beans needed for annotation-based endpoints, like JAXB2 (un)marshalling-->
```
</div>
<li style="font-weight: bold;">
<div ><span><span>The strategies supported to map requests to endpoints (?) </span><span
style="font-weight: normal; ">(</span><a
href="http://docs.spring.io/spring-ws/site/reference/html/server.html#server-endpoint-mapping"
style="color: #009eb8; display: inline; text-decoration: none;"><span
style="color: #1155cc; font-weight: normal; text-decoration: underline; ">link</span></a><span
style="font-weight: normal; ">)</span></span>
</div>
</li>
<ul>
<li >
<div >
<span><span>@PayloadRoot - on method level, requires </span><span>namespace</span><span>+</span><span>localPart</span><span> values, which build the URL qualifier; needs the </span><span>PayloadRootAnnotationMethodEndpointMapping</span><span> registered;</span></span>
</div>
</li>
<li >
<div ><span><span>SOAP Action Header - based on the </span><span>To </span><span>and </span><span>Action </span><span>SOAP headers, @SoapAction: on method level “</span><span
style="font-style: italic; ">Whenever a message comes in which has this SOAPAction header, the method will be invoked</span><span>.”</span></span>
</div>
</li>
<div markdown="1">
```xml
<SOAP-ENV:Header>
    ...
    <wsa:To S:mustUnderstand="true">http://example.com/fabrikam</wsa:To>
<wsa:Action>http://example.com/fabrikam/mail/Delete</wsa:Action>
</SOAP-ENV:Header>
```
</div>
<li >
<div >
<span><span>WS-Addressing, or AnnotationActionEndpointMapping, also AddressingEndpointInterceptor - annotate the handling methods with the</span><span> </span></span><span>@Action("http://samples/RequestOrder")</span><span> annotation</span>
</div>
</li>
<div markdown="1">
```java
@Action("http://samples/RequestOrder")
public Order getOrder(OrderRequest orderRequest) {
return orderService.getOrder(orderRequest.getId());
}
```
</div>
<li >
<div style=" ">
<span>XPath - </span><span>@Namespace(prefix = "s", uri="http://samples")</span><span> annotation on class/method level, and</span><span><span> </span><span>@XPathParam("/s:orderRequest/@id") int orderId </span></span><span>in the method - the attribute will be given the value which is the evaluation of the XPath expression</span>
</div>
</li>
<div markdown="1">
```java
@PayloadRoot(localPart = "orderRequest", namespace = "http://samples")
@Namespace(prefix = "s", uri="http://samples")
public Order getOrder(@XPathParam("/s:orderRequest/@id") int orderId) {
Order order = orderService.getOrder(orderId);
// create Source from order and return it
}
```
</div>
<li >
<div ><span>Message Payload - e.g. @RequestPayload Element inside the method</span>
</div>
</li>
</ul>
<li style="font-weight: bold;">
<div ><span>Of these strategies, how does @PayloadRoot work exactly? </span>
</div>
</li>
<ul>
<li >
<div ><span><span>it’s written above, but: </span><span
style="font-style: italic; ">“</span><span
style="font-style: italic; ">The PayloadRootAnnotationMethodEndpointMapping uses the @PayloadRoot annotation, with the localPart and namespace elements, to mark methods with a particular qualified name. Whenever a message comes in which has this qualified name for the payload root element, the method will be invoked.“</span></span>
</div>
</li>
</ul>
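<div markdown="1">
To see what the mapping keys on, here is a hypothetical SOAP request: its payload root element has the qualified name `{http://samples}orderRequest`, so the `PayloadRootAnnotationMethodEndpointMapping` would route it to a method annotated with `@PayloadRoot(localPart = "orderRequest", namespace = "http://samples")`:

```xml
<SOAP-ENV:Envelope xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/">
    <SOAP-ENV:Body>
        <!-- the payload root element: local name + namespace decide the endpoint method -->
        <ns:orderRequest xmlns:ns="http://samples" id="42"/>
    </SOAP-ENV:Body>
</SOAP-ENV:Envelope>
```
</div>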
<li style="font-weight: bold;">
<div >
<span>The functionality offered by the WebServiceTemplate </span>
</div>
</li>
<ul>
<li >
<div >
<span>does the SOAP stupid stuff, and also POX</span>
</div>
</li>
<li >
<div >
            <span>works with marshallers / unmarshallers (set the “marshaller” and “unmarshaller” property)</span>
</div>
</li>
<li >
<div >
<span>convenience methods</span>
</div>
</li>
<li >
<div >
<span>callbacks</span>
</div>
</li>
<li >
<div >
<span>error handling (annotate method with </span><span>@SoapFault(faultCode=FaultCode.CLIENT)</span><span>, </span><span>SoapFaultMessageResolver </span><span>is default)</span>
</div>
</li>
</ul>
<div markdown="1">
```xml
<bean class="...WebServiceTemplate">
    <property name="defaultUri" value="http://mybank.com/transfer"/>
    <property name="marshaller" ref="marshallerAndUnmarshaller"/>
    <property name="unmarshaller" ref="marshallerAndUnmarshaller"/>
    <property name="faultMessageResolver" ref="myCustomFaultMessageResolver"/>
</bean>
<bean id="marshallerAndUnmarshaller" class="...CastorMarshaller">
    <property name="mappingLocation" value="classpath:castor-mapping.xml"/>
</bean>
```

```java
template.marshallSendAndReceive(new TransferRequest("S123"));
template.sendSourceAndReceiveToResult(source, result);
// full definition, e.g.:
doSendAndReceive(MessageContext messageContext, WebServiceConnection connection, WebServiceMessageCallback requestCallback, WebServiceMessageExtractor<T> responseExtractor)
```
</div>
<ul>
<li >
<div style=" ">
<span>the template by default uses Java's </span><span>HttpUrlConnectionMessageSender</span><span>, if you wanna apache client, override "messageSender" with </span><span>HttpComponentsMessageSender</span>
</div>
</li>
<li >
<div ><span><span>it’s also possible to use </span><span>JmsMessageSender</span></span>
</div>
</li>
</ul>
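<div markdown="1">
A sketch of overriding the default message sender (the rest of the template configuration as in the example above):

```xml
<bean class="...WebServiceTemplate">
    <property name="messageSender">
        <bean class="org.springframework.ws.transport.http.HttpComponentsMessageSender"/>
    </property>
</bean>
```

or, for a JMS transport, plug in an `org.springframework.ws.transport.jms.JmsMessageSender` instead.
</div>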
<li style="font-weight: bold;">
<div
style=" "><span><span>The underlying WS-Security implementations supported by Spring-WS (</span><a
href="http://docs.spring.io/spring-ws/docs/2.2.0.RELEASE/reference/htmlsingle/#security"
style="color: #009eb8; display: inline; text-decoration: none; transition: color 0.3s;"><span
style="color: #1155cc; text-decoration: underline; ">link</span></a><span>)</span></span>
</div>
</li>
<ul>
<li >
<div ><span>are implemented as interceptors</span>
</div>
</li>
<li >
<div >
<span><span>XwsSecurityInterceptor </span><span>(requires Sun JVM and SAAJ) - requires security policy XML configuration file</span></span>
</div>
</li>
<ul>
<li >
<div >
<span>Wss4jSecurityInterceptor </span><span>(also supports non-Sun JVMs and Axiom)</span>
</div>
</li>
<li >
<div ><span>support for Spring Security and JAAS keystores (JAAS goes with wss)</span>
</div>
</li>
</ul>
<li >
<div ><span>also client-side interceptors, which are injected in the WebServiceTemplate</span>
</div>
</li>
<li >
<div ><span>supports: authentication, digital signatures, encryption and decryption</span>
</div>
</li>
<li >
<div >
<span>exception handling: </span><span>WsSecuritySecurementException </span><span>(just logs), </span><span>WsSecurityValidationException </span><span>(translated to SOAP Fault)</span>
</div>
</li>
</ul>
<div markdown="1">
```xml
<ws:interceptors>
<bean class=”blabla.SoapEnvelopeLoggingInterceptor”
</ws:interceptors>
```
or
```xml
<ws:interceptors>
<ws:payloadRoot localPart=”MyRequest” namespaceUri=”htpp://blabla.com/namespace”>
<bean class=”blabla.SoapEnvelopeLoggingInterceptor”
</ws:payloadRoot>
</ws:interceptors>
```
client-side:
```xml
<bean class=”org...WebServiceTemplate”>
<property name=”interceptors”>
<bean class=”blablabalInterceptor”
</property>
</bean>
```
</div>
<li style="font-weight: bold;">
<div style=" ">
<span>How key stores are supported by Spring-WS for use with WS-Security (?)</span>
</div>
</li>
<ul>
<li >
<div >
<span><span>in Java, the keystores are of type</span><span> </span></span><span>java.security.KeyStore</span><span> and they store:</span>
</div>
</li>
<ul>
<li >
<div ><span>private keys</span>
</div>
</li>
<li >
<div ><span>symmetric keys</span>
</div>
</li>
<li >
<div ><span>trusted certificates</span>
</div>
</li>
</ul>
<li >
<div >
<span>Spring provides </span><span>KeyStoreFactoryBean</span><span><span> </span><span>and </span></span><span>KeyStoreCallbackHandler</span>
</div>
</li>
<div markdown="1">
```xml
<bean id="keyStore" class="org.springframework.ws.soap.security.support.KeyStoreFactoryBean">
<property name="password" value="password"/>
<property name="location" value="classpath:org/springframework/ws/soap/security/xwss/test-keystore.jks"/>
</bean>
<bean id="keyStoreHandler" class="org.springframework.ws.soap.security.xwss.callback.KeyStoreCallbackHandler">
<property name="keyStore" ref="keyStore"/>
<property name="privateKeyPassword" value="changeit"/>
</bean>
```
```xml
<bean id="wsSecurityInterceptor"
class="org.springframework.ws.soap.security.xwss.XwsSecurityInterceptor">
<property name="policyConfiguration" value="classpath:securityPolicy.xml"/>
<property name="callbackHandlers">
<list>
<ref bean="keyStoreHandler"/>
<ref bean="...
</list>
</property>
</bean>
```
</div>
</ul>
</ol>
### RESTful services with Spring-MVC
<ol>
<li style="font-weight: bold;">
The main REST principles
</li>
<ul>
<li>
<div><span>(makes HTTP not only transport protocol, but also application protocol)</span>
</div>
</li>
<li>
<div><span><span>H </span><span>Hypermedia (links)</span></span>
</div>
</li>
<li>
<div><span><span>U </span><span>Uniform Interface (nouns for resources, verbs for operations: GET, POST, PUT, DELETE, HEAD, OPTIONS)</span></span>
</div>
</li>
<li>
<div><span><span>S </span><span>Stateless Conversation => scalable</span></span>
</div>
</li>
<li>
<div><span><span>I </span><span>Identifiable Resources</span></span>
</div>
</li>
<li>
<div><span><span>R </span><span>Resource Representations (multiple representations for resource, which is abstract; Accept header in req, Content-Type in res)</span></span>
</div>
</li>
<table style="margin-left:2em; margin-top:1em;"> <!-- The main REST principles -->
<tbody>
<tr style="height: 0px;">
<td>
<div style="text-align: right;">
<span style="font-family: "arial"; font-weight: bold; ">Method</span>
</div>
</td>
<td>
<div style="text-align: center;">
<span style="font-family: "arial"; font-weight: bold; ">Safe (no side effects)</span>
</div>
</td>
<td>
<div style="text-align: center;">
<span style="font-family: "arial"; font-weight: bold; ">Indepotent</span>
</div>
</td>
<td>
<div style=" "><span style="font-family: "arial"; font-weight: bold; ">Comments</span>
</div>
</td>
</tr>
<tr style="height: 0px;">
<td>
<div style="text-align: right;">
<span style="font-family: "arial"; ">GET</span>
</div>
</td>
<td>
<div style="text-align: center;">y
</div>
</td>
<td>
<div style="text-align: center;">y
</div>
</td>
<td>
<div style=" "><span
style="font-family: "arial"; ">Is cacheable (ETag) or Last-Modified, 304</span>
</div>
</td>
</tr>
<tr style="height: 0px;">
<td>
<div style="text-align: right;">
<span style="font-family: "arial"; ">HEAD</span>
</div>
</td>
<td>
<div style="text-align: center;">y
</div>
</td>
<td>
<div style="text-align: center;">y
</div>
</td>
<td><br/></td>
</tr>
<tr style="height: 0px;">
<td>
<div style="text-align: right;">
<span style="font-family: "arial"; ">POST</span>
</div>
</td>
<td>
<div style="text-align: center;">n
</div>
</td>
<td>
<div style="text-align: center;">n
</div>
</td>
<td>
<div style=" "><span style="font-family: "arial"; ">Location header in response</span>
</div>
</td>
</tr>
<tr style="height: 0px;">
<td>
<div style="text-align: right;">
<span style="font-family: "arial"; ">PUT</span>
</div>
</td>
<td>
<div style="text-align: center;">n
</div>
</td>
<td>
<div style="text-align: center;">y
</div>
</td>
<td>
<div style=" "><span style="font-family: "arial"; ">Create OR update</span>
</div>
</td>
</tr>
<tr style="height: 0px;">
<td>
<div style="text-align: right;">
<span style="font-family: "arial"; ">DELETE</span>
</div>
</td>
<td>
<div style="text-align: center;">n
</div>
</td>
<td>
<div style="text-align: center;">y
</div>
</td>
<td><br/></td>
</tr>
</tbody>
</table>
</ul>
<li style="font-family: Arial; font-weight: bold;">
<div><span>REST support in Spring-MVC </span>
</div>
</li>
<ul>
<table style="margin-left:2em; margin-top:1em;"> <!-- REST support in Spring-MVC -->
<tbody>
<tr style="height: 27px;">
<td><br/></td>
<td colspan="2">
<div><span
style="color: #333333; font-family: "courier new"; ">SOAP / POX (Spring WS)</span>
</div>
</td>
</tr>
<tr style="height: 27px;">
<td>
<div><span style="color: #333333; font-family: "courier new"; ">web.xml</span>
</div>
</td>
<td colspan="2">
<div><span style="color: #333333; font-family: "courier new"; "><servlet></span>
</div>
<div><span
style="color: #333333; font-family: "courier new"; "> <servlet-name>http-ws-gateway</servlet-name></span>
</div>
<div><span
style="color: #333333; font-family: "courier new"; "> <servlet-class></span><span
style="color: #333333; font-family: "courier new"; font-weight: bold; ">HttpRequestHandlerServlet</span><span
style="color: #333333; font-family: "courier new"; "></servlet-class></span>
</div>
<div><span
style="color: #333333; font-family: "courier new"; "> <init-param></span>
</div>
<div><span
style="color: #333333; font-family: "courier new"; "> <param-name>contextConfigLocation</param- name></span>
</div>
<div><span
style="color: #333333; font-family: "courier new"; "> <param-value>http-ws-gateway.xml</param-value></span>
</div>
<div><span style="color: #333333; font-family: "courier new"; "></servlet></span>
</div>
<br/>
<div><span
style="color: #333333; font-family: "courier new"; "><servlet-mapping></span>
</div>
<div><span
style="color: #333333; font-family: "courier new"; "> <servlet-name>http-ws-gateway</servlet-name></span>
</div>
<div><span
style="color: #333333; font-family: "courier new"; "> <url-pattern>/httpquote</url-pattern></span>
</div>
<div><span
style="color: #333333; font-family: "courier new"; "></servlet-mapping></span>
</div>
</td>
</tr>
<tr style="height: 28px;">
<td>
<div style=" "><span
style="color: #333333; font-family: "courier new"; ">web infrastr. config</span>
</div>
</td>
<td colspan="2">
<div><span
style="color: #333333; font-family: "courier new"; "><int-http:inbound-gateway </span>
</div>
<div><span
style="color: #333333; font-family: "courier new"; "> id="http-inbound-gateway"</span>
</div>
<div><span
style="color: #333333; font-family: "courier new"; "> request-channel="http-request"</span>
</div>
<div><span
style="color: #333333; font-family: "courier new"; "> reply-channel="http-response"</span>
</div>
<div><span
style="color: #333333; font-family: "courier new"; "> extract-reply-payload="false"</span>
</div>
<div><span
style="color: #333333; font-family: "courier new"; "> view-name="about"</span>
</div>
<div><span
style="color: #333333; font-family: "courier new"; "> reply-key, reply-timeout,message-converters, </span>
</div>
<div><span
style="color: #333333; font-family: "courier new"; "> supported-methods, convert-exceptions, </span>
</div>
<div><span
style="color: #333333; font-family: "courier new"; "> request-payload-type, error-code, errors-key,</span>
</div>
<div><span
style="color: #333333; font-family: "courier new"; "> header-mapper, name/></span>
</div>
<br/>
<ul style=" padding: 0px 0px 0px 2em;">
<li style="color: #333333; font-family: Arial; ">
<div><span>name - e.g. "/subscribe", so that it allows it to be used with DispatcherServlet</span>
</div>
</li>
<li style="color: #333333; font-family: Arial; ">
<div><span>view-name - is the Spring MVC view name</span>
</div>
</li>
<li style="color: #333333; font-family: Arial; ">
<div><span>you can also use inbound-message-adapter if you don't need two way communication, it uses MessageTemplate</span>
</div>
</li>
</ul>
</td>
</tr>
<tr style="height: 28px;">
<td>
<div style=" "><span
style="color: #333333; font-family: "courier new"; ">app config</span>
</div>
</td>
<td colspan="2">
<div><span style="color: #333333; font-family: "courier new"; ">N/A</span>
</div>
</td>
</tr>
<tr style="height: 28px;">
<td>
<div style=" "><span
style="color: #333333; font-family: "courier new"; ">Endpoint implementation</span>
</div>
</td>
<td colspan="2">
<div><span style="color: #333333; font-family: "courier new"; ">@Controller</span>
</div>
<div><span
style="color: #333333; font-family: "courier new"; ">@RequestMapping(“/rewards”)</span>
</div>
<div><span
style="color: #333333; font-family: "courier new"; ">public class Blabla{</span>
</div>
<br/>
<div><span
style="color: #333333; font-family: "courier new"; "> @RequestMapping(value=”/{number}”, method=GET)</span>
</div>
<div><span
style="color: #333333; font-family: "courier new"; "> public String Reward blabla(@RequestBody someObject / Model model, @PathVariable(“number”) String number){</span>
</div>
<div><span
style="color: #333333; font-family: "courier new"; "> // ...</span>
</div>
<div><span
style="color: #333333; font-family: "courier new"; "> // return view name</span>
</div>
<div><span style="color: #333333; font-family: "courier new"; "> }</span>
</div>
<br/>
<div><span
style="color: #333333; font-family: "courier new"; "> @RequestMapping(method=POST)</span>
</div>
<div><span
style="color: #333333; font-family: "courier new"; "> @ResponseStatus(HttpStatus.CREATED)</span>
</div>
<div><span
style="color: #333333; font-family: "courier new"; "> public @ResponseBody Reward blabla(@RequestBody someObject, HttpServletResponse res){</span>
</div>
<div><span
style="color: #333333; font-family: "courier new"; "> // ...</span>
</div>
<div><span
style="color: #333333; font-family: "courier new"; "> res.addHeader(“Location”, getMeUrl(order.getId()));</span>
</div>
<div><span
style="color: #333333; font-family: "courier new"; "> // return the object, will be mapped because of @ResponseBody, by converter based on Accept header</span>
</div>
<div><span style="color: #333333; font-family: "courier new"; "> }</span>
</div>
<div><span style="color: #333333; font-family: "courier new"; ">}</span>
</div>
</td>
</tr>
</tbody>
</table>
<li>
<div style=" "><span>you can also add e.g. </span><span
style="font-family: "courier new" , "courier" , monospace; ">@ResponseStatus(value=HttpStatus.CONFLICT)</span><span><span
style="font-family: "courier new" , "courier" , monospace;"> </span><span>on your Exception class, to have the exception mapped</span></span>
</div>
</li>
<li>
<div><span>alternatively to the above, in your controller you can add an empty void method annotated with </span><span><span
style="font-family: "courier new" , "courier" , monospace;">@ExceptionHandler(value=YourException.class)</span><span> </span></span><span>and </span><span
style="font-family: "courier new" , "courier" , monospace; ">@ResponseStatus(value=HttpStatus.NOT_FOUND)</span>
</div>
</li>
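<div markdown="1">
A minimal sketch of both options; the exception and controller class names are illustrative placeholders, not from the notes:

```java
import org.springframework.http.HttpStatus;
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.ExceptionHandler;
import org.springframework.web.bind.annotation.ResponseStatus;

// Option 1: annotate the exception class itself
@ResponseStatus(value = HttpStatus.CONFLICT)
class RewardConflictException extends RuntimeException {
}

// Option 2: map the exception inside the controller
@Controller
class RewardController {

    @ExceptionHandler(RewardNotFoundException.class)
    @ResponseStatus(value = HttpStatus.NOT_FOUND)
    public void handleNotFound() {
        // intentionally empty: only the mapped response status matters
    }
}

class RewardNotFoundException extends RuntimeException {
}
```
</div>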
<div markdown="1">
```xml
<int-http:outbound-gateway
url="http://blblah"
request-channel="requests"
http-method="GET"
expected-response-type="java.lang.String">
<int-http:uri-variable
name="location"
expression="payload"/>
</int-http:outbound-gateway>
```
- you can use outbound-channel-adapter if you don't need two-way communication; it uses RestTemplate
- in the case above it's better to override the error handler, as the default one treats only 4xx and 5xx responses as errors
</div>
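<div markdown="1">
One way to override the default error handler is a `ResponseErrorHandler` that reports no errors, leaving status-code handling to the calling code (a sketch; whether this is appropriate depends on the integration flow):

```java
import java.io.IOException;
import org.springframework.http.client.ClientHttpResponse;
import org.springframework.web.client.ResponseErrorHandler;
import org.springframework.web.client.RestTemplate;

class LenientTemplateFactory {

    RestTemplate newLenientTemplate() {
        RestTemplate template = new RestTemplate();
        template.setErrorHandler(new ResponseErrorHandler() {

            @Override
            public boolean hasError(ClientHttpResponse response) throws IOException {
                return false; // let the caller inspect 4xx/5xx statuses itself
            }

            @Override
            public void handleError(ClientHttpResponse response) throws IOException {
                // never invoked, because hasError() always returns false
            }
        });
        return template;
    }
}
```
</div>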
<li
style="font-weight: bold;">
<div ><span
>Spring-MVC is an alternative to JAX-RS, not an implementation </span>
</div>
</li>
<ul >
<li
>
<div style=" ">
<span >got it ;)</span>
</div>
</li>
</ul>
<li
style="font-weight: bold;">
<div ><span
>The @RequestMapping annotation, including URI template support </span>
</div>
</li>
<li
style="font-weight: bold;">
<div ><span
>The @RequestBody and @ResponseBody annotations </span>
</div>
</li>
<li
style="font-weight: bold;">
<div ><span
>The functionality offered by the RestTemplate</span>
</div>
</li>
<ul >
<li
>
<div ><span
>for client side</span>
</div>
</li>
<li
>
<div ><span
>has default </span><span
style="font-family: "courier new" , "courier" , monospace; ">HttpMessageConverters </span><span
>(same like on server), supports URI templates</span>
</div>
</li>
<ul >
<li
>
<div ><span
>e.g. </span><span
style="font-family: "courier new" , "courier" , monospace; ">Jaxb2RootElementHttpMessageConverter</span><span
><span
>, register it with</span><span
style="font-family: "courier new" , "courier" , monospace;"> </span></span><span
style="font-family: "courier new" , "courier" , monospace; "><mvc:annotation-driven/></span><span
> !</span>
</div>
</li>
<li
>
<div ><span
>it can also use external configuration, e.g. Apache Commons HTTP Client (set the “</span><span
>requestFactory</span><span
>” property to </span><span
style="font-family: "courier new" , "courier" , monospace; ">CommonsCliemtHttpRequestFactory</span>
</div>
</li>
</ul>
<li
>
<div ><span
><span
>HttpEntity </span><span
>represents request or response (payload + headers)</span></span>
</div>
</li>
</ul>
<div markdown="1">
```java
<T> T getForObject(URI url, Class<T> responseType) throws RestClientException; // returns the object from GET
<T> T getForObject(String url, Class<T> responseType, Object... uriVariables) throws RestClientException;
<T> T getForObject(String url, Class<T> responseType, Map<String, ?> uriVariables) throws RestClientException;
<T> ResponseEntity<T> getForEntity(URI url, Class<T> responseType)
        throws RestClientException; // returns the whole response from GET (with headers)
void put(String url, Object request, Object... uriVariables) throws RestClientException;
void delete(String url, Map<String, ?> uriVariables) throws RestClientException;
<T> T postForObject(String url, Object request, Class<T> responseType, Object... uriVariables) throws RestClientException;
<T> ResponseEntity<T> postForEntity(String url, Object request, Class<T> resType, Object... uriVariables)
        throws RestClientException;
// URI url = response.getHeaders().getLocation(); <- to get the location of the new resource!
<T> T execute(...) // most generic variant
<T> ResponseEntity<T> exchange(String url, HttpMethod method, HttpEntity<?> reqEntity, Class<T> resType, Object... uriVariables)
        throws RestClientException;
```
</div>
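<div markdown="1">
A usage sketch of the methods above (the URL, payload, and response types are placeholders):

```java
import java.net.URI;
import org.springframework.http.ResponseEntity;
import org.springframework.web.client.RestTemplate;

class RestTemplateSketch {

    void sketch() {
        RestTemplate restTemplate = new RestTemplate();

        // GET with a URI template variable; the body is converted to the requested type
        String reward = restTemplate.getForObject(
                "http://localhost:8080/rewards/{number}", String.class, "42");

        // POST, then read the Location header of the newly created resource
        ResponseEntity<String> response = restTemplate.postForEntity(
                "http://localhost:8080/rewards", "some payload", String.class);
        URI location = response.getHeaders().getLocation();
    }
}
```
</div>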
</ul>
</ol>
### JMS with Spring
<ol>
<li
style="font-weight: bold;">
<div ><span
>Where can Spring-JMS applications obtain their JMS resources from </span>
</div>
</li>
<ul >
<li
>
<div ><span
>either create manually (standalone), or obtain from JNDI:</span>
</div>
</li>
</ul>
<table style="margin-left:2em; margin-top:1em;" ><!-- Where can Spring-JMS applications obtain their JMS resources from -->
<tbody>
<tr style="height: 28px;">
<td rowspan="2" >
<div style=" "><span
style="font-family: "arial"; ">Destination</span>
</div>
</td>
<td >
<div style=" "><span
style="font-family: "courier new"; "><bean id=”orderQueue” class=”org.apache.activemq...ActiveMQQueue”></span><span
style="font-family: "courier new"; "><br
class="kix-line-break"/></span><span
style="font-family: "courier new"; "> <constructor-arg value=”queue.orders”/></span><span
style="font-family: "courier new"; "><br
class="kix-line-break"/></span><span
style="font-family: "courier new"; "></bean></span>
</div>
</td>
</tr>
<tr style="height: 28px;">
<td >
<div style=" "><span
style="font-family: "courier new"; "><jee:jndi-lookup id=”orderQueue” jndi-name=”jms/OrderQueue”/></span>
</div>
</td>
</tr>
<tr style="height: 28px;">
<td rowspan="2" >
<div style=" "><span
style="font-family: "arial"; ">ConnectionFactory</span>
</div>
</td>
<td >
<div style=" "><span
style="font-family: "courier new"; "><bean id=”cf” class=”org.apache.activemq.ActiveMQConnectionFactory”></span><span
style="font-family: "courier new"; "><br
class="kix-line-break"/></span><span
style="font-family: "courier new"; "> <property name=”brokerURL” value=”tc[://localhost:61616”/></span><span
style="font-family: "courier new"; "><br
class="kix-line-break"/></span><span
style="font-family: "courier new"; "></bean></span>
</div>
</td>
</tr>
<tr style="height: 28px;">
<td >
<div style=" "><span
style="font-family: "courier new"; "><jee:jndi lookup id=”cf” jndi-name=”jms/ConnectionFactory”/></span>
</div>
</td>
</tr>
<tr style="height: 0px;">
<td >
<div style=" "><span
style="font-family: "arial"; ">Connection</span>
</div>
</td>
<td >
<div style=" "><span
style="font-family: "courier new"; ">connectionFactory.createConnection();</span>
</div>
</td>
</tr>
<tr style="height: 0px;">
<td >
<div style=" "><span
style="font-family: "arial"; ">Session</span>
</div>
</td>
<td >
<div style=" "><span
style="font-family: "arial"; ">created from the </span><span
style="font-family: "courier new"; ">Connection</span>
</div>
</td>
</tr>
<tr style="height: 28px;">
<td >
<div style=" "><span
style="font-family: "arial"; ">JMS Message</span>
</div>
</td>
<td rowspan="3" >
<div style=" "><span
style="font-family: "arial"; ">created from </span><span
style="font-family: "courier new"; ">Session</span>
</div>
<br/>
<div style=" "><span
style="font-family: "courier new"; ">session.createProducer(destination);</span>
</div>
</td>
</tr>
<tr style="height: 28px;">
<td >
<div style=" "><span
style="font-family: "arial"; ">MessageProducer</span>
</div>
</td>
</tr>
<tr style="height: 28px;">
<td >
<div style=" "><span
style="font-family: "arial"; ">MessageConsumer</span>
</div>
</td>
</tr>
</tbody>
</table>
<li
style="font-weight: bold;">
<div style=" "><span
>The functionality offered by Spring's JMS message listener container, including the use of a MessageListenerAdapter through the 'method' attribute in the <jms:listener/> element</span>
</div>
</li>
<ul >
<li
>
">
<div>
<span >MessageListener an interface for asynchronous reception of messages; one method: </span><span
style="font-family: "courier new" , "courier" , monospace; ">void onMessage(Message)</span>
</div>
</li>
<li
>
<div ><span
>implement MessageListener, or </span><span
style="font-family: "courier new" , "courier" , monospace; ">SessionAwareMessageListener </span><span
>(extends MessageListener)</span>
</div>
</li>
<li
>
<div ><span
>requires a listener container - in the past EJB container, now</span>
</div>
</li>
<ul >
<li
>
<div ><span
style="font-family: "courier new" , "courier" , monospace; ">SimpleMessageListenerContainer </span><span
>- fixed number of sessions</span>
</div>
</li>
<li
>
<div ><span
style="font-family: "courier new" , "courier" , monospace; ">DefaultMessageListenerContainer </span><span
>- adds transactional capability</span>
</div>
</li>
</ul>
</ul>
<div markdown="1">
```xml
<jms:listener-container connection-factory="cf">
    <!-- method and response-destination are optional -->
    <jms:listener destination="queue.orders"
                  ref="myListener"
                  method="order"
                  response-destination="queue.confirmation"/>
</jms:listener-container>
```
</div>
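<div markdown="1">
With the `method` attribute the listener can be a plain POJO that the container wraps in a `MessageListenerAdapter`; a sketch with illustrative names:

```java
// Referenced by <jms:listener ref="myListener" method="order"/>;
// no JMS API needs to appear in the class at all.
class OrderListener {

    private String lastOrder;

    // the payload arrives already converted (e.g. TextMessage -> String)
    public void order(String payload) {
        this.lastOrder = payload;
    }

    // a non-void return value is sent to response-destination (or JMSReplyTo)
    public String confirm(String payload) {
        return "confirmed:" + payload;
    }
}
```
</div>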
<li
style="font-weight: bold;">
<div ><span
>The functionality offered by the JmsTemplate</span>
</div>
</li>
<ul >
<li
>
<div ><span
>converts exceptions to unchecked</span>
</div>
</li>
<li
>
<div ><span
>convenience methods</span>
</div>
</li>
<li
>
<div ><span
>needs reference to </span><span
style="font-family: "courier new" , "courier" , monospace; ">ConnectionFactory</span><span
>, optionally set defaultDestination property</span>
</div>
</li>
<ul >
<li
>
<div ><span
>use </span><span
style="font-family: "courier new" , "courier" , monospace; ">CachingConnectionFactory </span><span
><span
>wrapper around the </span><span
style="font-family: "courier new" , "courier" , monospace;">ActiveMQConnectionFactory</span><span
>, as JmsTemplate aggressively opens, closes, and reopens JMS resources</span></span>
</div>
</li>
</ul>
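<div markdown="1">
A configuration sketch of that wrapping (bean ids are illustrative; `cf` and `orderQueue` refer to the definitions shown earlier):

```xml
<bean id="cachingCf"
      class="org.springframework.jms.connection.CachingConnectionFactory">
    <property name="targetConnectionFactory" ref="cf"/>
    <property name="sessionCacheSize" value="10"/>
</bean>

<bean id="jmsTemplate" class="org.springframework.jms.core.JmsTemplate">
    <property name="connectionFactory" ref="cachingCf"/>
    <property name="defaultDestination" ref="orderQueue"/>
</bean>
```
</div>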
<li
>
<div ><span
>uses</span>
</div>
</li>
<ul >
<li
>
<div ><span
><span
style="font-family: "courier new" , "courier" , monospace;">MessageConverter </span><span
>(default </span></span><span
>SimpleMessageConverter</span><span
> - handles String, Serializable, Map, byte[]) - from/to Message</span>
</div>
</li>
<li
>
<div ><span
style="font-family: "courier new" , "courier" , monospace; ">DestinationResolver </span><span
>(</span><span
>default </span><span
>DynamicDestinationResolver, JndiDestinationResolver)</span><span
>- from String to Destination</span>
</div>
</li>
</ul>
</ul>
<div markdown="1">
```java
void convertAndSend([String/Destination d,] Object m)
void convertAndSend(Object m, MessagePostProcessor mpp) // to do stuff to the message after it has been converted
void send(MessageCreator mc) // is used inside convertAndSend()
Object execute(ProducerCallback<T> action)
Object execute(SessionCallback<T> action)
Message receive([String/Destination d,])
Object receiveAndConvert(destination)
```
</div>
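<div markdown="1">
A usage sketch of the template (queue names reuse those from these notes; payload and property values are placeholders):

```java
import javax.jms.JMSException;
import javax.jms.Message;
import org.springframework.jms.core.JmsTemplate;
import org.springframework.jms.core.MessagePostProcessor;

class JmsTemplateSketch {

    void sketch(JmsTemplate jmsTemplate) {
        // converted by the configured MessageConverter (String is handled by default)
        jmsTemplate.convertAndSend("queue.orders", "order-42");

        // adjust the message after conversion, before it is sent
        jmsTemplate.convertAndSend("queue.orders", "order-42", new MessagePostProcessor() {
            public Message postProcessMessage(Message message) throws JMSException {
                message.setStringProperty("priority", "high");
                return message;
            }
        });

        // blocking receive plus conversion (null when the timeout expires)
        String confirmation = (String) jmsTemplate.receiveAndConvert("queue.confirmation");
    }
}
```
</div>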
</ol>
### Transactions
<ol >
<li
style="font-weight: bold;">
<div ><span
>Local JMS Transactions with Spring</span>
</div>
</li>
<ul >
<li
>
<div ><span
>acknowledge mode - not transactions </span><span
style="font-family: "courier new" , "courier" , monospace; ">connection.createSession(transacted=false, acknowledgeMode)</span>
</div>
</li>
<ul >
<li
>
<div ><span
>AUTO_ACKNOWLEDGE (default) => calling .receive() or onMessage() = removing message from the queue</span>
</div>
</li>
<li
>
<div ><span
>CLIENT_ACKNOWLEDGE => client must call message.acknowledge(), and this one call will remove all messages since last time! (which are bound to the same session). And client can also call session.recover() to have redelivery of those messages (can be duplicates)</span>
</div>
</li>
<li
>
<div ><span
>DUPS_OK_ACKNOWLEDGE => like auto but lazily, once every few times</span>
</div>
</li>
</ul>
<li
>
<div ><span
><span
>transacted session -</span><span
style="font-family: "courier new" , "courier" , monospace;"> </span></span><span
style="font-family: "courier new" , "courier" , monospace; ">connection.createSession(transacted=true, null)</span>
</div>
</li>
<ul >
<li
>
<div ><span
>starts local JMS when no managed JMS or JTA transaction is in progress; will be synchronised with existing local transaction</span>
</div>
</li>
<li
>
<div ><span
>transaction starts, when message is received</span>
</div>
</li>
<li
>
<div ><span
>if a message fails, it will be put back on the queue</span>
</div>
</li>
</ul>
</ul>
<li
style="font-weight: bold;">
<div ><span
>How to enable local JMS transactions with Spring's message listener container</span>
</div>
</li>
<div markdown="1">
```xml
<jms:listener-container acknowledge="auto|client|dups_ok|transacted"/>
```
or
```xml
<jms:listener-container transaction-manager="tm"/>
```
```java
connection.createSession(transacted=true, acknowledgeMode);
session.commit();
session.rollback();
```
</div>
<li
style="font-weight: bold;">
<div ><span
>If and if so, how is a local JMS transaction made available to the JmsTemplate</span>
</div>
</li>
<ul>
<div markdown="1">
```xml
<bean class="org.springframework.jms.core.JmsTemplate">
    <!-- sessionAcknowledgeMode is optional -->
    <property name="sessionAcknowledgeMode" value="..."/>
    <property name="sessionTransacted" value="true"/>
</bean>
```
</div>
<li
>
<div style=" ">
<span >jmsTemplate will automatically use same session (</span><span
style="font-family: "courier new" , "courier" , monospace; ">ConnectionFactoryUtils.doGetTransactionalSession()</span><span
>), so the above settings are ignored in case the session was created within an active transaction already</span>
</div>
</li>
</ul>
<li
style="font-weight: bold;">
<div ><span
>How does Spring attempt to synchronize a local JMS transaction and a local database transaction (?)</span>
</div>
</li>
<ul >
<li
>
<div ><span
>Commit database before JMS (can end up with duplicates), and at the end</span>
</div>
</li>
<li
>
<div ><span
>Put commits close together</span>
</div>
</li>
<li
>
<div style=" "><span
>only as last resort use XA</span>
</div>
</li>
</ul>
<li
style="font-weight: bold;">
<div ><span
>The functionality offered by the JmsTransactionManager (?)</span>
</div>
</li>
<ul >
<li
>
<div >
<span style="font-family: "courier new" , "courier" , monospace; ">JmsTransactionManager</span><span
><span
style="font-family: "courier new" , "courier" , monospace;"> </span><span
>or </span></span><span
style="font-family: "courier new" , "courier" , monospace; ">DataSourceTransactionManager </span><span
>are both implementations of </span><span
style="font-family: "courier new" , "courier" , monospace; ">PlatformTransactionManager</span>
</div>
</li>
<li
>
<div >
<span ><span
>“Performs local resource transactions, binding a JMS Connection/Session pair from the specified </span><span
>ConnectionFactory </span><span
>to the thread</span></span>
</div>
</li>
<li
>
<div >
<span >The </span><span
style="font-family: "courier new" , "courier" , monospace; ">JmsTemplate </span><span
>auto-detects an attached thread and participates automatically with Session</span>
</div>
</li>
<li
>
<div >
<span><span
>The </span><span
style="font-family: "courier new" , "courier" , monospace;">JmsTransationManager </span><span
>allows a </span></span><span
style="font-family: "courier new" , "courier" , monospace; ">CachingConnectionFactory </span><span
>that uses a single connection for all JMS access (performance gains). All Sessions belong to the same connection”</span>
</div>
</li>
</ul>
<li
style="font-weight: bold;">
<div ><span
>What guarantees does JTA provide that local transactions do not provide</span>
</div>
</li>
<ul >
<li
>
<div >
<span >once-and-once-only delivery</span>
</div>
</li>
<li
>
<div >
<span >ACID with multiple resources</span>
</div>
</li>
</ul>
<li
style="font-weight: bold;">
<div ><span
>How to switch from local to global JTA transactions</span>
</div>
</li>
<div markdown="1">
```xml
<jms:listener-container transaction-manager="transactionManager" ... />
```
</div><ul>
<li
>
<div >
<span >you also may have to reconfigure some resources like Hibernate</span>
</div>
</li>
</ul>
<li
style="font-weight: bold;">
<div ><span
>Where can you obtain a JTA transaction manager from</span>
</div>
</li>
<ul >
<li
>
<div style=" ">
<span >within J2EE server:</span>
</div>
</li>
<div markdown="1">
```xml
<tx:jta-transaction-manager/>
<jee:jndi-lookup id="dataSource" jndi-name="java:comp/env/jdbc/myDS"/>
<jee:jndi-lookup id="connectionFactory" jndi-name="java:comp/env/jms/myConnFac"/>
```
</div>
<li
>
<div style=" "><span
>standalone definition:</span>
</div>
</li>
<div markdown="1">
```xml
<bean id="transactionManager" class="org.springframework.transaction.jta.JtaTransactionManager">
    <property name="transactionManager" ref="jtaTxMgr"/>
    <property name="userTransaction" ref="userTx"/>
</bean>
```
</div>
<li
style="font-family: Arial; ">
<div ><span
>this class is provided by Spring, but it only integrates with external JTA TX manager (e.g. Atomikos was used in course)</span>
</div>
</li>
<li
style="font-family: Arial; ">
<div ><span
>the externally provided transactionManager and userTransaction resources have to be of type “XA aware”</span>
</div>
</li>
</ul>
</ol>
### Batch processing with Spring Batch
<ol >
<li
style="font-family: Arial; font-weight: bold;">
<div ><span
>Main concepts (Job, Step, Job Instance, Job Execution, Step Execution, etc.) </span>
</div>
</li>
<ul >
<li
style="font-family: Arial; ">
<div ><span
>easy, see also </span><a
href="https://docs.google.com/document/d/1Bk3cR5GoH-yMURSaZH47FSBawiAaY68BFsYiOlPzTms/edit#"
style="color: #009eb8; display: inline; text-decoration: none; transition: color 0.3s;"><span
style="color: #1155cc; font-family: "arial"; text-decoration: underline; ">here</span></a>
</div>
</li>
<li
style="font-family: Arial; ">
<div ><span
>job execution has status, but job instance (job+parameters) also has a status, the “overall status”</span>
</div>
</li>
<div markdown="1">
```xml
<job id="resendUnprocessedDinings">
<step id="processConfirmationsStep" next="sendUnprocessedDiningsStep">
<tasklet>
<chunk reader="confirmationReader"
writer="confirmationUpdater"
commit-interval="${chunk.size}"
reader-transactional-queue="true"/>
</tasklet>
</step>
</job>
```
</div></ul>
<li style="font-weight: bold;">
<div><span>The interfaces typically used to implement a chunk-oriented Step</span>
</div>
</li>
<ul>
<li>
<div><span>Step for one chunk goes like this:</span>
</div>
</li>
<ul>
<li>
<div><span><span>ItemReader, </span><span
style="font-family: "courier new" , "courier" , monospace;"> </span></span><span><span
style="font-family: "courier new" , "courier" , monospace;">ItemReader<Dining></span><span>,</span><span
style="font-family: "courier new" , "courier" , monospace;"> public Dining read(){}</span></span>
</div>
</li>
<li>
<div><span><span
style="font-family: "courier new" , "courier" , monospace;">ItemProcessor </span><span>(optional), </span></span><span
style="font-family: "courier new" , "courier" , monospace; ">ItemProcessor<XMLDining, Dining></span><span>, </span><span
style="font-family: "courier new" , "courier" , monospace; ">public Dining process(XMLDining bla){}</span>
</div>
</li>
<li>
<div><span><span
style="font-family: "courier new" , "courier" , monospace;">ItemWriter</span><span>, </span></span><span
style="font-family: "courier new" , "courier" , monospace; ">ItemWriter<Dining></span><span>,</span><span
style="font-family: "courier new" , "courier" , monospace; "> public write(List<? extends Dining> dinings){}</span>
</div>
</li>
<div markdown="1">
```xml
<bean id="itemReader" class="....FlatFileItemReader" scope="step">
<property name="resource" value="file://#{jobParameters['filena']}"/>
<property name="lineMapper">
<bean class="....DefaultLineMapper">
<property name="lineTokenizer">
<bean class="....DelimitedLineTokenizer">
<property name="names" value="source,dest,amount,date"/>
</bean>
</property>
<property name="fieldSetMapper" ref=”myMapper”/>
</bean>
</property>
</bean>
<bean id="itemWriter" class="....FlatFileItemWriter" scope="step">
<property name="fieldSetCreator" ref="customCreator"/>
...
</bean>
```
</div></ul></ul>
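<div markdown="1">
A sketch of an `ItemProcessor` implementation matching the signatures above; `XMLDining` and `Dining` are placeholder domain types:

```java
import org.springframework.batch.item.ItemProcessor;

class XmlDiningProcessor implements ItemProcessor<XMLDining, Dining> {

    @Override
    public Dining process(XMLDining item) throws Exception {
        // returning null filters the item out of the chunk
        if (item.getAmount() == null) {
            return null;
        }
        return new Dining(item.getAmount(), item.getDate());
    }
}
```
</div>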
<li style="font-weight: bold;">
<div><span>How and where state can be stored</span>
</div>
</li>
<ul>
<li>
<div><span>in </span><span
style="font-family: "courier new" , "courier" , monospace; ">JobRepository</span>
</div>
</li>
<ul>
<li>
<div><span>it can use database or in memory map</span>
</div>
</li>
<li>
<div><span>persists jobs’ metadata and intermediate state of execution (=job instance and job execution)</span>
</div>
</li>
</ul>
<div markdown="1">
```xml
<batch:job-repository id="jobRepository"/>
```
, or
```xml
<batch:job-repository
data-source="dataSource"
id="jobRepository"
transaction-manager="transactionManager"
table-prefix="BATCH_"/>
```
</div>
<li style="font-family: Arial; ">
<div>
<span>JobLauncher creates JobExecution entity in the JobRepository, next it executes the job, and returns the result</span>
</div>
</li>
<li style="font-family: Arial; ">
<div><span>JobLauncher is already wrapped in </span><span
style="font-family: "courier new"; ">CommandLineJobRunner</span><span>, if you wanna use it</span>
</div>
</li>
<li style="font-family: Arial; ">
<div><span style="font-family: "courier new"; ">jobLauncher.run(job, parameters)</span>
</div>
</li>
<div markdown="1">
```xml
<bean id="jobLauncher" class="....SimpleJobLauncher">
<property name="jobRepository" ref="jobRepository"/>
</bean>
```
</div></ul>
<li
style="font-weight: bold;">
<div ><span
>What are job parameters and how are they used</span>
</div>
</li>
<ul >
<li
>
<div ><span
>they identify the job instance; the same job instance cannot be run twice, which is why a counter or timestamp parameter is commonly added</span>
</div>
</li>
<li
>
<div ><span
>in case same job instance is attempted to be re-launched you will get </span><span
style="font-family: "courier new" , "courier" , monospace; ">JobInstanceAlreadyCompletedException</span>
</div>
</li>
<li
>
<div ><span
>you create it programmatically using builder: </span>
</div>
</li>
<div markdown="1">
```java
JobParametersBuilder jpb = new JobParametersBuilder();
jpb.addString("filena", "payment.xml");
JobExecution execution = jobLauncher.run(job, jpb.toJobParameters());
```
</div></ul>
<li
style="font-weight: bold;">
<div ><span
>What is a FieldSetMapper and what is it used for</span>
</div>
</li>
<ul >
<li
>
<div ><span
>FieldSetMapper<T> → T mapFieldSet(FieldSet fs)</span>
</div>
</li>
<li
>
<div ><span
>used by e.g. LineMapper (e.g. </span><span
style="font-family: "courier new" , "courier" , monospace; ">DefaultLineMapper</span><span
>), which is used by </span><span
style="font-family: "courier new" , "courier" , monospace; ">FlatFileItemReader</span>
</div>
</li>
<div markdown="1">
```java
Date date = fs.readDate(0,"dd/MM/yyyy");
Long number = fs.readLong(1);
String value = fs.readString("city");
String[] values = fs.getValues();
```
```java
public class MyMapper implements FieldSetMapper<Payment>{
@Override
public Payment mapFieldSet(FieldSet fieldSet)
throws BindException {
... = fieldSet.readString("source");
... = fieldSet.readBigDecimal("amount");
... = fieldSet.readDate("date");
}
}
```
</div> </ul>
</ol>
### Spring Integration
<ol >
<li
style="font-weight: bold;">
<div ><span
>Main concepts (Messages, Channels, Endpoint types)</span>
</div>
</li>
<ul >
<li
>
<div ><span
><span
>please, refer to </span><a
href="https://docs.google.com/document/d/1kAMz6QZD-iFhiJKCZcijb5c7mdp2VdmuX0ylA-yGZtc/edit#heading=h.41jqjf4k4rq6"
style="color: #009eb8; display: inline; text-decoration: none; transition: color 0.3s;"><span
style="color: #1155cc; text-decoration: underline; ">this</span></a><span
>, and </span><a
href="https://docs.google.com/document/d/1g6xTuuZZaZ-r2kNxbwCYLIC6kjVeW_x3UvLnpfLQnac/edit#"
style="color: #009eb8; display: inline; text-decoration: none; transition: color 0.3s;"><span
style="color: #1155cc; text-decoration: underline; ">this</span></a></span>
</div>
</li>
</ul>
<li
style="font-weight: bold;">
<div ><span
>Pay special attention to the various Endpoint types and how they're used!</span>
</div>
</li>
<ul >
<li
>
<div ><span
>refer even more carefully to the above</span>
</div>
</li>
</ul>
<li
style="font-weight: bold;">
<div ><span
>How to programmatically create new Messages </span>
</div>
</li>
<ul >
<li
>
<div ><span
><span
>MessageBuilder </span><span
>of course</span></span>
</div>
</li>
<li
>
<div ><span
>each message is created with unique ID</span>
</div>
</li>
<li
>
<div ><span
>Also </span><span
style="font-family: "courier new" , "courier" , monospace; ">MessagingTemplate </span><span
>is worth mentioning, which is a programmatic endpoint for sending the messages created by MessageBuilder (e.g. for testing)</span>
</div>
</li>
</ul>
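<div markdown="1">
A sketch; the package names are from Spring Integration 2.x (newer versions moved these types under `org.springframework.messaging`), and the header name is a placeholder:

```java
import org.springframework.integration.Message;
import org.springframework.integration.MessageChannel;
import org.springframework.integration.core.MessagingTemplate;
import org.springframework.integration.support.MessageBuilder;

class MessagingSketch {

    void sketch(MessageChannel channel) {
        Message<String> message = MessageBuilder
                .withPayload("hello")
                .setHeader("orderId", 42)  // custom header; id/timestamp are generated
                .build();

        MessagingTemplate template = new MessagingTemplate();
        template.send(channel, message);

        // convenience: convert the payload and send in one call
        template.convertAndSend(channel, "hello");
    }
}
```
</div>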
<li
style="font-weight: bold;">
<div ><span
>Using chains and bridges</span>
</div>
</li>
<ul >
<li
>
<div ><span
>see a.</span>
</div>
</li>
</ul>
<li
style="font-weight: bold;">
<div ><span
>Synchronous vs. asynchronous message passing: the different Channel types and how each of them should be used</span>
</div>
</li>
<ul >
<li
>
<div ><span
>see a.</span>
</div>
</li>
</ul>
<li
style="font-weight: bold;">
<div ><span
>The corresponding effects on things like transactions and security</span>
</div>
</li>
<ul >
<li
>
<div ><span
>see a.</span>
</div>
</li>
</ul>
<li
style="font-weight: bold;">
<div ><span
>The need for active polling and how to configure that </span>
</div>
</li>
<ul >
<li
>
<div ><span
><span
>see </span><a
href="https://docs.google.com/document/d/1kAMz6QZD-iFhiJKCZcijb5c7mdp2VdmuX0ylA-yGZtc/edit#heading=h.xl5a2aw6wmob"
style="color: #009eb8; display: inline; text-decoration: none;"><span
style="color: #1155cc; text-decoration: underline; ">this</span></a><span
>, and </span><span
style="color: #1155cc; text-decoration: underline; "><a
href="https://docs.google.com/document/d/1kAMz6QZD-iFhiJKCZcijb5c7mdp2VdmuX0ylA-yGZtc/edit#heading=h.yik4lvhdqsdb"
style="color: #009eb8; display: inline; text-decoration: none;">this</a></span></span>
</div>
</li>
</ul>
</ol>
| 38.916546 | 589 | 0.477948 | eng_Latn | 0.352165 |
934c7ab119b5349aae2e87de58fce7d73a2b90fa | 1,459 | md | Markdown | skills/B01M4IDXBZ/README.md | AndHor66/alexa-skills-list | 3833958be3f3a4091414645d81a531915303a2ef | [
"MIT"
] | 232 | 2016-03-05T06:24:41.000Z | 2022-03-21T19:32:55.000Z | skills/B01M4IDXBZ/README.md | AndHor66/alexa-skills-list | 3833958be3f3a4091414645d81a531915303a2ef | [
"MIT"
] | 5 | 2016-03-21T02:25:06.000Z | 2020-01-03T15:01:39.000Z | skills/B01M4IDXBZ/README.md | AndHor66/alexa-skills-list | 3833958be3f3a4091414645d81a531915303a2ef | [
"MIT"
] | 52 | 2016-04-02T06:08:55.000Z | 2021-12-12T23:52:13.000Z | # [Parks And Recreation Trivia (Unofficial)](http://alexa.amazon.com/#skills/amzn1.ask.skill.82071acc-391f-4ede-9bde-5bbc391dbb1d)
 3
To use the Parks And Recreation Trivia (Unofficial) skill, try saying...
* *Alexa, open parks rec trivia*
* *Alexa, begin parks rec trivia*
* *Alexa, start parks rec trivia*
In this game, you'll be asked five trivia questions about Parks and Recreation.
For each question you will have three answers to choose from.
To answer, just say the number of the answer.
To skip, say "I don't know" or "skip".
Say 'Repeat' to hear a question again.
Say "Start over" or "New game" if you would like to clear scores and start a new game.
NOTE
To fit with the invocation guidelines, you can only use the invocation phrase "parks rec trivia".
See the example phrases for more information.
DISCLAIMER
This skill is not sponsored or endorsed by NBCUniversal Television Distribution
***
### Skill Details
* **Invocation Name:** parks rec trivia
* **Category:** null
* **ID:** amzn1.ask.skill.82071acc-391f-4ede-9bde-5bbc391dbb1d
* **ASIN:** B01M4IDXBZ
* **Author:** Kathryn L Webb
* **Release Date:** October 26, 2016 @ 01:45:48
* **In-App Purchasing:** No
| 37.410256 | 274 | 0.734064 | eng_Latn | 0.768597 |
934cc685e297f79cbf3a1389cfaa5fe0c1f2b453 | 568 | md | Markdown | README.md | alu0100970876/nutrientes | 7378c62875975aa8403bcd710a4ebe62e1fd414a | [
"MIT"
] | null | null | null | README.md | alu0100970876/nutrientes | 7378c62875975aa8403bcd710a4ebe62e1fd414a | [
"MIT"
] | null | null | null | README.md | alu0100970876/nutrientes | 7378c62875975aa8403bcd710a4ebe62e1fd414a | [
"MIT"
] | null | null | null | # Nutrientes
Assignement for LPP (Lenguajes y paradigmas de programación)
In this directory , you'll find the files you need to be able to use the class Alimentos , as well as all its test.
## Usage
$rake <-- make the test
## Made by
*Miguel Jiménez Gomis*
## License
The gem is available as open source under the terms of the [MIT License](http://opensource.org/licenses/MIT).+
[](https://coveralls.io/github/alu0100970876/nutrientes?branch=practica7)
| 27.047619 | 178 | 0.758803 | eng_Latn | 0.760017 |
934d2f61ea5148cc92d4dc14376c8bf99a37d21b | 4,536 | md | Markdown | windows-driver-docs-pr/wdf/debugging-umdf-2-0-drivers.md | Ryooooooga/windows-driver-docs.ja-jp | c7526f4e7d66ff01ae965b5670d19fd4be158f04 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | windows-driver-docs-pr/wdf/debugging-umdf-2-0-drivers.md | Ryooooooga/windows-driver-docs.ja-jp | c7526f4e7d66ff01ae965b5670d19fd4be158f04 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | windows-driver-docs-pr/wdf/debugging-umdf-2-0-drivers.md | Ryooooooga/windows-driver-docs.ja-jp | c7526f4e7d66ff01ae965b5670d19fd4be158f04 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: UMDF 2.0 ドライバーのクラッシュのトラブルシューティング
description: ユーザーモードドライバーフレームワーク (UMDF) バージョン2以降では、Wdfkd に実装されているデバッガー拡張機能コマンドのサブセットを使用して、UMDF ドライバーをデバッグできます。
ms.assetid: df1bfc10-379b-457f-a9c8-40fa10048f81
ms.date: 04/20/2017
ms.localizationpriority: medium
ms.openlocfilehash: 4336f7b007fe1d03f5f2fb902658779d24d5f84d
ms.sourcegitcommit: 9355a80229bb2384dd45493d36bdc783abdd8d7a
ms.translationtype: MT
ms.contentlocale: ja-JP
ms.lasthandoff: 01/07/2020
ms.locfileid: "75694242"
---
# <a name="troubleshooting-umdf-20-driver-crashes"></a>UMDF 2.0 ドライバーのクラッシュのトラブルシューティング
ユーザーモードドライバーフレームワーク (UMDF) バージョン2以降では、Wdfkd に実装されているデバッガー拡張機能コマンドのサブセットを使用して、UMDF ドライバーをデバッグできます。 このトピックでは、UMDF ドライバーの問題をトラブルシューティングするために、どのコマンドを起動するかについて説明します。
## <a name="determining-why-a-umdf-20-driver-crashed"></a>UMDF 2.0 ドライバーがクラッシュした原因の特定
ドライバーのホストプロセスが終了した場合、コールバックで[ホストのタイムアウト](how-umdf-enforces-time-outs.md)しきい値を超えた問題が発生している可能性があります。 この場合、リフレクタはドライバーホストプロセスを終了します。
調査するには、まず、「 [UMDF ドライバーのデバッグを有効にする方法](enabling-a-debugger.md)」の説明に従って、カーネルモードのデバッグセッションを設定します。 テストシステムにアタッチされたカーネルデバッガーで、UMDF ドライバーのすべての開発とテストを行い、WUDFHost で[アプリケーション検証ツール ()](../debugger/debugger-download-tools.md)を有効にすることを強くお勧めします。 次のコマンドを使用して、カーネルデバッガーをアタッチし、再起動します。
```cpp
AppVerif –enable Heaps Exceptions Handles Locks Memory TLS Leak –for WudfHost.exe
```
- **HostFailKdDebugBreak**が設定されている場合 (これは既定で Windows 8 を起動しています)、タイムアウトしきい値を超えたときに、リフレクターはカーネルモードのデバッガーを中断します。 デバッガーの出力には、クリックできるリンクなど、開始方法に関するいくつかの推奨事項が表示されます。 たとえば次のようになります。
```cpp
**** Problem detected in UMDF driver "WUDFOsrUsbFx2". !process 0xFFFFE0000495B080 0x1f, !devstack 0xFFFFE000032BFA10, Problem code 3 ****
**** Dump UMDF driver image name and stack: !wdfkd.wdfumdevstack 0x000000BEBB49AE20
**** Dump UM Irps for this stack: !wdfkd.wdfumirps 0x000000BEBB49AE20
**** Dump UMDF trace log: !wmitrace.logdump WUDFTrace
**** Helpful UMDF debugger extension commands: !wdfkd.wdfhelp
**** Note that driver host process may get terminated if you go past this break, making it difficult to debug the problem!
```
- Use [**!analyze**](https://docs.microsoft.com/windows-hardware/drivers/debugger/-analyze) to display information about the failure, as well as other UMDF extension commands that you can run.
- Use [**!process 0 0x1f wudfhost.exe**](https://docs.microsoft.com/windows-hardware/drivers/debugger/-process) to list all WUDFHost.exe driver host processes, including thread stack information.
  You can also use !wdfkd.wdfumtriage and [**!wdfkd.wdfldr**](https://docs.microsoft.com/windows-hardware/drivers/debugger/-wdfkd-wdfldr) to display all drivers that are currently bound to WDF. If you click the image name of a UMDF driver, the debugger displays the address of the host process. Then, if you click the process address, information specific to that process is displayed.
- If necessary, use [**.process /r /p *Process***](https://docs.microsoft.com/windows-hardware/drivers/debugger/-process--set-process-context-) to switch the process context to that of the WUDFHost process that is hosting the driver. Use [**.cache forcedecodeuser**](https://docs.microsoft.com/windows-hardware/drivers/debugger/-cache--set-cache-size-) and **lmu** to verify that the driver is hosted in the current process.
- Examine the thread call stacks ([**!thread *Address***](https://docs.microsoft.com/windows-hardware/drivers/debugger/-thread)) to determine whether a driver callback has timed out, and check the tick count of the threads. In Windows 8.1, the reflector times out after one minute.
- To display the device tree in verbose format, use [**!wdfkd.wdfdriverinfo**](https://docs.microsoft.com/windows-hardware/drivers/debugger/-wdfkd-wdfdriverinfo). Then click [**!wdfdevice**](https://docs.microsoft.com/windows-hardware/drivers/debugger/-wdfkd-wdfdevice). This command displays detailed state information for power, power policy, and Plug and Play (PnP).
- Use [**!wdfkd.wdfumirps**](https://docs.microsoft.com/windows-hardware/drivers/debugger/-wdfkd-wdfumirps) to look for pending IRPs.
- Use [**!wdfkd.wdfdevicequeues**](https://docs.microsoft.com/windows-hardware/drivers/debugger/-wdfkd-wdfdevicequeues) to check the state of the driver's queues.
- In a kernel-mode debugging session, you can display the trace log by using [**!wmitrace.logdump WudfTrace**](https://docs.microsoft.com/windows-hardware/drivers/debugger/-wmitrace-logdump).
## <a name="displaying-the-umdf-20-ifr-log"></a>Displaying the UMDF 2.0 IFR log
In a kernel-mode debugging session, you can use the [**!wdfkd.wdflogdump**](https://docs.microsoft.com/windows-hardware/drivers/debugger/-wdfkd-wdflogdump) extension command to display the Windows Driver Frameworks (WDF) In-Flight Recorder (IFR) log records, if they are available.
## <a name="finding-memory-dump-files"></a>Finding memory dump files
For information about finding user-mode dump files, see [Determining Why the Reflector Terminated the Host Process](determining-why-the-reflector-terminated-the-host-process.md). For information about setting the **LogMinidumpType** registry value to specify the type of information stored in a minidump file, see [Using WPP Software Tracing in UMDF Drivers](using-wpp-software-tracing-in-umdf-drivers.md).
| 65.73913 | 352 | 0.799603 | yue_Hant | 0.786318 |
934d4643a5ad136b5a839003e20335381af167a8 | 2,507 | md | Markdown | includes/policy/reference/byrp/microsoft.search.md | tsunami416604/azure-docs.hu-hu | aeba852f59e773e1c58a4392d035334681ab7058 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | includes/policy/reference/byrp/microsoft.search.md | tsunami416604/azure-docs.hu-hu | aeba852f59e773e1c58a4392d035334681ab7058 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | includes/policy/reference/byrp/microsoft.search.md | tsunami416604/azure-docs.hu-hu | aeba852f59e773e1c58a4392d035334681ab7058 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
author: DCtheGeek
ms.service: azure-policy
ms.topic: include
ms.date: 11/20/2020
ms.author: dacoulte
ms.custom: generated
ms.openlocfilehash: 7b76b04e952d500490473edf6d1c890b9f0be614
ms.sourcegitcommit: 9889a3983b88222c30275fd0cfe60807976fd65b
ms.translationtype: MT
ms.contentlocale: hu-HU
ms.lasthandoff: 11/20/2020
ms.locfileid: "94990914"
---
|Name<br /><sub>(Azure Portal)</sub> |Description |Effect(s) |Version<br /><sub>GitHub</sub> |
|---|---|---|---|
|[Deploy Diagnostic Settings for Search Services to Event Hub](https://portal.azure.com/#blade/Microsoft_Azure_Policy/PolicyDetailBlade/definitionId/%2Fproviders%2FMicrosoft.Authorization%2FpolicyDefinitions%2F3d5da587-71bd-41f5-ac95-dd3330c2d58d) |Deploys the diagnostic settings for Search services to stream to a regional event hub when any Search service that is missing this diagnostic setting is created or updated. |DeployIfNotExists, Disabled |[2.0.0](https://github.com/Azure/azure-policy/blob/master/built-in-policies/policyDefinitions/Monitoring/Search_DeployDiagnosticLog_Deploy_EventHub.json) |
|[Deploy Diagnostic Settings for Search Services to Log Analytics workspace](https://portal.azure.com/#blade/Microsoft_Azure_Policy/PolicyDetailBlade/definitionId/%2Fproviders%2FMicrosoft.Authorization%2FpolicyDefinitions%2F08ba64b8-738f-4918-9686-730d2ed79c7d) |Deploys the diagnostic settings for Search services to stream to a regional Log Analytics workspace when any Search service that is missing this diagnostic setting is created or updated. |DeployIfNotExists, Disabled |[1.0.0](https://github.com/Azure/azure-policy/blob/master/built-in-policies/policyDefinitions/Monitoring/Search_DeployDiagnosticLog_Deploy_LogAnalytics.json) |
|[Diagnostic logs in Search services should be enabled](https://portal.azure.com/#blade/Microsoft_Azure_Policy/PolicyDetailBlade/definitionId/%2Fproviders%2FMicrosoft.Authorization%2FpolicyDefinitions%2Fb4330a05-a843-4bc8-bf9a-cacce50c67f4) |Audit enabling of diagnostic logs. This enables you to recreate activity trails for investigation purposes when a security incident occurs or your network is compromised. |AuditIfNotExists, Disabled |[3.0.0](https://github.com/Azure/azure-policy/blob/master/built-in-policies/policyDefinitions/Search/Search_AuditDiagnosticLog_Audit.json) |
| 125.35 | 713 | 0.841245 | hun_Latn | 0.996288 |
934d73c17b15b14551059a24d56307bcc2f3807c | 1,111 | md | Markdown | api/Excel.Shape.CopyPicture.md | CeptiveYT/VBA-Docs | 1d9c58a40ee6f2d85f96de0a825de201f950fc2a | [
"CC-BY-4.0",
"MIT"
] | 283 | 2018-07-06T07:44:11.000Z | 2022-03-31T14:09:36.000Z | api/Excel.Shape.CopyPicture.md | CeptiveYT/VBA-Docs | 1d9c58a40ee6f2d85f96de0a825de201f950fc2a | [
"CC-BY-4.0",
"MIT"
] | 1,457 | 2018-05-11T17:48:58.000Z | 2022-03-25T22:03:38.000Z | api/Excel.Shape.CopyPicture.md | CeptiveYT/VBA-Docs | 1d9c58a40ee6f2d85f96de0a825de201f950fc2a | [
"CC-BY-4.0",
"MIT"
] | 469 | 2018-06-14T12:50:12.000Z | 2022-03-27T08:17:02.000Z | ---
title: Shape.CopyPicture method (Excel)
keywords: vbaxl10.chm636127
f1_keywords:
- vbaxl10.chm636127
ms.prod: excel
api_name:
- Excel.Shape.CopyPicture
ms.assetid: 276cd993-18b1-8c5b-3618-95e5b5c9a773
ms.date: 05/14/2019
ms.localizationpriority: medium
---
# Shape.CopyPicture method (Excel)
Copies the selected object to the Clipboard as a picture.
## Syntax
_expression_.**CopyPicture** (_Appearance_, _Format_)
_expression_ A variable that represents a **[Shape](Excel.Shape.md)** object.
## Parameters
|Name|Required/Optional|Data type|Description|
|:-----|:-----|:-----|:-----|
| _Appearance_|Optional| **Variant**|An **[XlPictureAppearance](Excel.XlPictureAppearance.md)** constant that specifies how the picture should be copied. The default value is **xlScreen**.|
| _Format_|Optional| **Variant**|An **[XlCopyPictureFormat](Excel.XlCopyPictureFormat.md)** constant that specifies the format of the picture. The default value is **xlPicture**.|
## Remarks
If you copy a range, it must be made up of adjacent cells.
[!include[Support and feedback](~/includes/feedback-boilerplate.md)] | 27.097561 | 189 | 0.744374 | eng_Latn | 0.666545 |
934e5df85f4517ae3cc0b40c14b95ee61e4063a4 | 10,790 | md | Markdown | repos/geonetwork/remote/4.0.md | devcode1981/repo-info | 2d59ac4492b5054ca75008b755f43d07ba879801 | [
"Apache-2.0"
] | 1 | 2022-01-14T20:54:33.000Z | 2022-01-14T20:54:33.000Z | repos/geonetwork/remote/4.0.md | devcode1981/repo-info | 2d59ac4492b5054ca75008b755f43d07ba879801 | [
"Apache-2.0"
] | null | null | null | repos/geonetwork/remote/4.0.md | devcode1981/repo-info | 2d59ac4492b5054ca75008b755f43d07ba879801 | [
"Apache-2.0"
] | 1 | 2021-05-21T18:38:51.000Z | 2021-05-21T18:38:51.000Z | ## `geonetwork:4.0`
```console
$ docker pull geonetwork@sha256:e715862c80f064df336edcd737de70f8344843f1e8fcf49e66316313359d2f76
```
- Manifest MIME: `application/vnd.docker.distribution.manifest.list.v2+json`
- Platforms: 1
- linux; amd64
### `geonetwork:4.0` - linux; amd64
```console
$ docker pull geonetwork@sha256:a862288a041576fd60ae55c956672cbdd5138c32d456388116061767a2fc28df
```
- Docker Version: 20.10.7
- Manifest MIME: `application/vnd.docker.distribution.manifest.v2+json`
- Total Size: **430.3 MB (430339978 bytes)**
(compressed transfer size, not on-disk size)
- Image ID: `sha256:4b9e84ec8b7ce464e1784249b5c068cff762936e890c3b789b3fa7a7efe48395`
- Entrypoint: `["\/geonetwork-entrypoint.sh"]`
- Default Command: `["java","-jar","\/usr\/local\/jetty\/start.jar"]`
```dockerfile
# Tue, 28 Sep 2021 01:22:25 GMT
ADD file:d05a14b1e57f9cc8eeb316a843403bbb35176d6222d60d6ddbb34faba977e316 in /
# Tue, 28 Sep 2021 01:22:25 GMT
CMD ["bash"]
# Tue, 28 Sep 2021 01:49:57 GMT
RUN set -eux; apt-get update; apt-get install -y --no-install-recommends ca-certificates curl netbase wget ; rm -rf /var/lib/apt/lists/*
# Tue, 28 Sep 2021 01:50:04 GMT
RUN set -ex; if ! command -v gpg > /dev/null; then apt-get update; apt-get install -y --no-install-recommends gnupg dirmngr ; rm -rf /var/lib/apt/lists/*; fi
# Tue, 28 Sep 2021 09:22:46 GMT
RUN set -eux; apt-get update; apt-get install -y --no-install-recommends bzip2 unzip xz-utils fontconfig libfreetype6 ca-certificates p11-kit ; rm -rf /var/lib/apt/lists/*
# Tue, 28 Sep 2021 09:26:48 GMT
ENV JAVA_HOME=/usr/local/openjdk-8
# Tue, 28 Sep 2021 09:26:49 GMT
RUN { echo '#/bin/sh'; echo 'echo "$JAVA_HOME"'; } > /usr/local/bin/docker-java-home && chmod +x /usr/local/bin/docker-java-home && [ "$JAVA_HOME" = "$(docker-java-home)" ] # backwards compatibility
# Tue, 28 Sep 2021 09:26:49 GMT
ENV PATH=/usr/local/openjdk-8/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
# Tue, 28 Sep 2021 09:26:49 GMT
ENV LANG=C.UTF-8
# Tue, 28 Sep 2021 09:26:50 GMT
ENV JAVA_VERSION=8u302
# Tue, 28 Sep 2021 09:27:08 GMT
RUN set -eux; arch="$(dpkg --print-architecture)"; case "$arch" in 'amd64') downloadUrl='https://github.com/AdoptOpenJDK/openjdk8-upstream-binaries/releases/download/jdk8u302-b08/OpenJDK8U-jre_x64_linux_8u302b08.tar.gz'; ;; 'arm64') downloadUrl='https://github.com/AdoptOpenJDK/openjdk8-upstream-binaries/releases/download/jdk8u302-b08/OpenJDK8U-jre_aarch64_linux_8u302b08.tar.gz'; ;; *) echo >&2 "error: unsupported architecture: '$arch'"; exit 1 ;; esac; wget --progress=dot:giga -O openjdk.tgz "$downloadUrl"; wget --progress=dot:giga -O openjdk.tgz.asc "$downloadUrl.sign"; export GNUPGHOME="$(mktemp -d)"; gpg --batch --keyserver keyserver.ubuntu.com --recv-keys EAC843EBD3EFDB98CC772FADA5CD6035332FA671; gpg --batch --keyserver keyserver.ubuntu.com --keyserver-options no-self-sigs-only --recv-keys CA5F11C6CE22644D42C6AC4492EF8D39DC13168F; gpg --batch --list-sigs --keyid-format 0xLONG CA5F11C6CE22644D42C6AC4492EF8D39DC13168F | tee /dev/stderr | grep '0xA5CD6035332FA671' | grep 'Andrew Haley'; gpg --batch --verify openjdk.tgz.asc openjdk.tgz; gpgconf --kill all; rm -rf "$GNUPGHOME"; mkdir -p "$JAVA_HOME"; tar --extract --file openjdk.tgz --directory "$JAVA_HOME" --strip-components 1 --no-same-owner ; rm openjdk.tgz*; { echo '#!/usr/bin/env bash'; echo 'set -Eeuo pipefail'; echo 'trust extract --overwrite --format=java-cacerts --filter=ca-anchors --purpose=server-auth "$JAVA_HOME/lib/security/cacerts"'; } > /etc/ca-certificates/update.d/docker-openjdk; chmod +x /etc/ca-certificates/update.d/docker-openjdk; /etc/ca-certificates/update.d/docker-openjdk; find "$JAVA_HOME/lib" -name '*.so' -exec dirname '{}' ';' | sort -u > /etc/ld.so.conf.d/docker-openjdk.conf; ldconfig; java -version
# Wed, 06 Oct 2021 01:18:04 GMT
ENV JETTY_VERSION=9.4.44.v20210927
# Wed, 06 Oct 2021 01:18:04 GMT
ENV JETTY_HOME=/usr/local/jetty
# Wed, 06 Oct 2021 01:18:05 GMT
ENV JETTY_BASE=/var/lib/jetty
# Wed, 06 Oct 2021 01:18:05 GMT
ENV TMPDIR=/tmp/jetty
# Wed, 06 Oct 2021 01:18:05 GMT
ENV PATH=/usr/local/jetty/bin:/usr/local/openjdk-8/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
# Wed, 06 Oct 2021 01:18:05 GMT
ENV JETTY_TGZ_URL=https://repo1.maven.org/maven2/org/eclipse/jetty/jetty-home/9.4.44.v20210927/jetty-home-9.4.44.v20210927.tar.gz
# Wed, 06 Oct 2021 01:18:05 GMT
ENV JETTY_GPG_KEYS=AED5EE6C45D0FE8D5D1B164F27DED4BF6216DB8F 2A684B57436A81FA8706B53C61C3351A438A3B7D 5989BAF76217B843D66BE55B2D0E1FB8FE4B68B4 B59B67FD7904984367F931800818D9D68FB67BAC BFBB21C246D7776836287A48A04E0C74ABB35FEA 8B096546B1A8F02656B15D3B1677D141BCF3584D FBA2B18D238AB852DF95745C76157BDF03D0DCD6 5C9579B3DB2E506429319AAEF33B071B29559E1E F254B35617DC255D9344BCFA873A8E86B4372146
# Wed, 06 Oct 2021 01:18:14 GMT
RUN set -xe ; export GNUPGHOME=/jetty-keys ; mkdir -p "$GNUPGHOME" ; for key in $JETTY_GPG_KEYS; do for server in ha.pool.sks-keyservers.net pgp.mit.edu hkp://p80.pool.sks-keyservers.net:80 hkp://keyserver.ubuntu.com:80 keyserver.pgp.com ipv4.pool.sks-keyservers.net ; do if gpg --batch --keyserver "$server" --recv-keys "$key"; then break; fi; done; done ; mkdir -p "$JETTY_HOME" ; cd $JETTY_HOME ; curl -SL "$JETTY_TGZ_URL" -o jetty.tar.gz ; curl -SL "$JETTY_TGZ_URL.asc" -o jetty.tar.gz.asc ; gpg --batch --verify jetty.tar.gz.asc jetty.tar.gz ; tar -xvf jetty.tar.gz --strip-components=1 ; sed -i '/jetty-logging/d' etc/jetty.conf ; mkdir -p "$JETTY_BASE" ; cd $JETTY_BASE ; java -jar "$JETTY_HOME/start.jar" --create-startd --add-to-start="server,http,deploy,jsp,jstl,ext,resources,websocket" ; mkdir -p "$TMPDIR" ; groupadd -r jetty && useradd -r -g jetty jetty ; chown -R jetty:jetty "$JETTY_HOME" "$JETTY_BASE" "$TMPDIR" ; usermod -d $JETTY_BASE jetty ; rm -rf /tmp/hsperfdata_root ; rm -fr $JETTY_HOME/jetty.tar.gz* ; rm -fr /jetty-keys $GNUPGHOME ; rm -rf /tmp/hsperfdata_root ; java -jar "$JETTY_HOME/start.jar" --list-config ;
# Wed, 06 Oct 2021 01:18:14 GMT
WORKDIR /var/lib/jetty
# Wed, 06 Oct 2021 01:18:14 GMT
COPY multi:6f466bb575b852e6668f47004d8edb31588ff8e21b73080bf09d6ed8d3dcb588 in /
# Wed, 06 Oct 2021 01:18:14 GMT
USER jetty
# Wed, 06 Oct 2021 01:18:15 GMT
EXPOSE 8080
# Wed, 06 Oct 2021 01:18:15 GMT
ENTRYPOINT ["/docker-entrypoint.sh"]
# Wed, 06 Oct 2021 01:18:15 GMT
CMD ["java" "-jar" "/usr/local/jetty/start.jar"]
# Wed, 06 Oct 2021 02:20:59 GMT
ENV DATA_DIR=/catalogue-data
# Wed, 06 Oct 2021 02:20:59 GMT
ENV JAVA_OPTS=-Dorg.eclipse.jetty.annotations.AnnotationParser.LEVEL=OFF -Djava.security.egd=file:/dev/./urandom -Djava.awt.headless=true -Xms512M -Xss512M -Xmx2G -XX:+UseConcMarkSweepGC -Dgeonetwork.resources.dir=/catalogue-data/resources -Dgeonetwork.data.dir=/catalogue-data -Dgeonetwork.codeList.dir=/var/lib/jetty/webapps/geonetwork/WEB-INF/data/config/codelist -Dgeonetwork.schema.dir=/var/lib/jetty/webapps/geonetwork/WEB-INF/data/config/schema_plugins
# Wed, 06 Oct 2021 02:21:00 GMT
USER root
# Wed, 06 Oct 2021 02:21:03 GMT
RUN apt-get -y update && apt-get -y install curl && rm -rf /var/lib/apt/lists/* && mkdir -p /${DATA_DIR} && chown -R jetty:jetty ${DATA_DIR} && mkdir -p /var/lib/jetty/webapps/geonetwork && chown -R jetty:jetty /var/lib/jetty/webapps/geonetwork
# Wed, 06 Oct 2021 02:21:04 GMT
USER jetty
# Wed, 06 Oct 2021 02:21:04 GMT
ENV GN_FILE=GeoNetwork-4.0.5-0.war
# Wed, 06 Oct 2021 02:21:04 GMT
ENV GN_VERSION=4.0.5
# Wed, 06 Oct 2021 02:21:04 GMT
ENV GN_DOWNLOAD_MD5=7dfcfdffc66b9a97f0d24b0769e9c3b7
# Wed, 06 Oct 2021 02:21:31 GMT
RUN cd /var/lib/jetty/webapps/geonetwork/ && curl -fSL -o geonetwork.war https://sourceforge.net/projects/geonetwork/files/GeoNetwork_opensource/v${GN_VERSION}/${GN_FILE}/download && echo "${GN_DOWNLOAD_MD5} *geonetwork.war" | md5sum -c && unzip -q geonetwork.war && rm geonetwork.war
# Wed, 06 Oct 2021 02:21:32 GMT
COPY file:ca46ab251df3dfc253cb04cf962e7266e42428fab31ad2f583a7c86b06d5f778 in /geonetwork-entrypoint.sh
# Wed, 06 Oct 2021 02:21:33 GMT
ENTRYPOINT ["/geonetwork-entrypoint.sh"]
# Wed, 06 Oct 2021 02:21:33 GMT
CMD ["java" "-jar" "/usr/local/jetty/start.jar"]
# Wed, 06 Oct 2021 02:21:33 GMT
VOLUME [/catalogue-data]
```
- Layers:
- `sha256:df5590a8898bedd76f02205dc8caa5cc9863267dbcd8aac038bcd212688c1cc7`
Last Modified: Tue, 28 Sep 2021 01:28:33 GMT
Size: 54.9 MB (54927682 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:705bb4cb554eb7751fd21a994f6f32aee582fbe5ea43037db6c43d321763992b`
Last Modified: Tue, 28 Sep 2021 01:57:51 GMT
Size: 5.2 MB (5153152 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:519df5fceacdeaadeec563397b1d9f4d7c29c9f6eff879739cab6f0c144f49e1`
Last Modified: Tue, 28 Sep 2021 01:57:51 GMT
Size: 10.9 MB (10871798 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:bc850b11e97cfdecca53799a94b78db748c40ae0a76694dbc10af9cd746c1229`
Last Modified: Tue, 28 Sep 2021 09:45:06 GMT
Size: 5.7 MB (5653948 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:68accb5562eba4dd2f73905ccaf67ad60e40faaabc20f8fb573e9bb2d76197dc`
Last Modified: Tue, 28 Sep 2021 09:48:40 GMT
Size: 211.0 B
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:c83ddc9323ff0ff81badeca7b50ad1e0986157ffad2874d4cf6ba29a73ac82cf`
Last Modified: Tue, 28 Sep 2021 09:48:47 GMT
Size: 41.4 MB (41358587 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:addc867247af6af4bd8f2c59780e728b60baf5c3082858c7d730a76bb62aec66`
Last Modified: Wed, 06 Oct 2021 01:25:43 GMT
Size: 9.9 MB (9878287 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:1f412d933be3b5e49d4110552ba0dd6f496673503c9dfeff9864a03e439f9d86`
Last Modified: Wed, 06 Oct 2021 01:25:42 GMT
Size: 1.4 KB (1437 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:e509580a64b640ab961b3f88f99504501c14b4e08b8d87451dd7fcad15712a84`
Last Modified: Wed, 06 Oct 2021 02:22:05 GMT
Size: 515.0 B
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:5399a1ceb1be8f53485ada58f37856bf306916896d36f0eca52220daa61764d8`
Last Modified: Wed, 06 Oct 2021 02:22:23 GMT
Size: 302.5 MB (302493404 bytes)
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
- `sha256:86d7f233a80ca356edded078e42a432b7d32d7eb28ce8de90d56748052eac11f`
Last Modified: Wed, 06 Oct 2021 02:22:05 GMT
Size: 957.0 B
MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
| 72.416107 | 1,771 | 0.734384 | yue_Hant | 0.276606 |
934e89a25e6143c8d5ab936e0292fd7c219fa9a9 | 7,761 | md | Markdown | lib/msal-react/docs/hooks.md | ordonezgs/microsoft-authentication-library-for-js | 7dd8b323288fcbae97e8f7a34de8a9888cab4934 | [
"MIT"
] | null | null | null | lib/msal-react/docs/hooks.md | ordonezgs/microsoft-authentication-library-for-js | 7dd8b323288fcbae97e8f7a34de8a9888cab4934 | [
"MIT"
] | null | null | null | lib/msal-react/docs/hooks.md | ordonezgs/microsoft-authentication-library-for-js | 7dd8b323288fcbae97e8f7a34de8a9888cab4934 | [
"MIT"
] | null | null | null | # Hooks
1. [`useAccount`](#useaccount-hook)
1. [`useIsAuthenticated`](#useisauthenticated-hook)
1. [`useMsal`](#usemsal-hook)
1. [`useMsalAuthentication`](#usemsalauthentication-hook)
## `useAccount` hook
The `useAccount` hook accepts an `accountIdentifier` parameter and returns the `AccountInfo` object for that account if it is signed in or `null` if it is not.
You can read more about the `AccountInfo` object returned in the `@azure/msal-browser` docs [here](https://github.com/AzureAD/microsoft-authentication-library-for-js/blob/dev/lib/msal-browser/docs/login-user.md#account-apis).
Note: At least one account identifier must be provided, all others are optional. Additionally we do not recommend relying only on `username`.
```javascript
const accountIdentifier = {
localAccountId: "example-local-account-identifier",
    homeAccountId: "example-home-account-identifier",
username: "example-username"
}
const accountInfo = useAccount(accountIdentifier);
```
## `useIsAuthenticated` hook
The `useIsAuthenticated` hook returns a boolean indicating whether or not an account is signed in. It optionally accepts an `accountIdentifier` object you can provide if you need to know whether or not a specific account is signed in.
### Determine if any account is currently signed in
```javascript
import React from 'react';
import { useIsAuthenticated } from "@azure/msal-react";
export function App() {
const isAuthenticated = useIsAuthenticated();
return (
<React.Fragment>
<p>Anyone can see this paragraph.</p>
{isAuthenticated && (
<p>At least one account is signed in!</p>
)}
{!isAuthenticated && (
<p>No users are signed in!</p>
)}
</React.Fragment>
);
}
```
### Determine if specific user is signed in
Note: At least one account identifier must be provided, all others are optional. Additionally we do not recommend relying only on `username`.
```javascript
import React from 'react';
import { useIsAuthenticated } from "@azure/msal-react";
export function App() {
const accountIdentifiers = {
localAccountId: "example-local-account-identifier",
homeAccountId: "example-home-account-identifier",
username: "example-username"
}
const isAuthenticated = useIsAuthenticated(accountIdentifiers);
return (
<React.Fragment>
<p>Anyone can see this paragraph.</p>
{isAuthenticated && (
<p>User with specified localAccountId is signed in!</p>
)}
{!isAuthenticated && (
<p>User with specified localAccountId is not signed in!</p>
)}
</React.Fragment>
);
}
```
## `useMsal` hook
The `useMsal` hook returns the context. This can be used if you need access to the `PublicClientApplication` instance, the list of accounts currently signed in or if you need to know whether a login or other interaction is currently in progress.
Note: The `accounts` value returned by `useMsal` will only update when accounts are added or removed, and will not update when claims are updated. If you need access to updated claims for the current user, use the `useAccount` hook or call `acquireTokenSilent` instead.
```javascript
const { instance, accounts, inProgress } = useMsal();
const [accessToken, setAccessToken] = useState(null);
useEffect(() => {
    if (inProgress === "none" && accounts.length > 0) {
        // Retrieve an access token and store it in component state
        instance.acquireTokenSilent({
            account: accounts[0],
            scopes: ["User.Read"]
        }).then(response => {
            if (response.accessToken) {
                setAccessToken(response.accessToken);
            }
        });
    }
}, [inProgress, accounts, instance]);
if (inProgress === "login") {
    // Render loading component
} else if (accessToken) {
    // Call your api and render component
}
```
Docs for the APIs `PublicClientApplication` exposes can be found in the `@azure/msal-browser` docs:
- [Login APIs](https://github.com/AzureAD/microsoft-authentication-library-for-js/blob/dev/lib/msal-browser/docs/login-user.md)
- [Logout API](https://github.com/AzureAD/microsoft-authentication-library-for-js/blob/dev/lib/msal-browser/docs/logout.md)
- [AcquireToken APIs](https://github.com/AzureAD/microsoft-authentication-library-for-js/blob/dev/lib/msal-browser/docs/acquire-token.md)
- [Account APIs](https://github.com/AzureAD/microsoft-authentication-library-for-js/blob/dev/lib/msal-browser/docs/accounts.md)
- [Event APIs](https://github.com/AzureAD/microsoft-authentication-library-for-js/blob/dev/lib/msal-browser/docs/events.md)
## `useMsalAuthentication` hook
The `useMsalAuthentication` hook will initiate a login if a user is not already signed in. It accepts an `interactionType` ("Popup", "Redirect", or "Silent") and optionally accepts a [request object](https://github.com/AzureAD/microsoft-authentication-library-for-js/blob/dev/lib/msal-browser/docs/request-response-object.md#request-and-response-objects) and an `accountIdentifiers` object if you would like to ensure a specific user is signed in. The hook will return the `response` or `error` from the login call and the `login` callback which can be used to retry a failed login.
Note: Passing the "Silent" interaction type will call `ssoSilent` which attempts to open a hidden iframe and reuse an existing session with AAD. This will not work in browsers that block 3rd party cookies such as Safari. Additionally, when using the "Silent" type the request object is required and should contain either a `loginHint` or `sid` parameter.
### `ssoSilent` example
If you use the silent interaction type, you should catch any errors and attempt an interactive login as a fallback.
```javascript
import React, { useEffect } from 'react';
import { AuthenticatedTemplate, UnauthenticatedTemplate, useMsal, useMsalAuthentication } from "@azure/msal-react";
import { InteractionType } from '@azure/msal-browser';
function App() {
const request = {
loginHint: "[email protected]",
scopes: ["User.Read"]
}
const { login, result, error } = useMsalAuthentication(InteractionType.Silent, request);
useEffect(() => {
if (error) {
login(InteractionType.Popup, request);
}
}, [error]);
const { accounts } = useMsal();
return (
<React.Fragment>
<p>Anyone can see this paragraph.</p>
<AuthenticatedTemplate>
<p>Signed in as: {accounts[0]?.username}</p>
</AuthenticatedTemplate>
<UnauthenticatedTemplate>
<p>No users are signed in!</p>
</UnauthenticatedTemplate>
</React.Fragment>
);
}
export default App;
```
### Specific user example
If you would like to ensure a specific user is signed in, provide an `accountIdentifiers` object.
```javascript
import React from 'react';
import { useMsalAuthentication } from "@azure/msal-react";
export function App() {
const accountIdentifiers = {
username: "example-username"
}
const request = {
loginHint: "example-username",
scopes: ["User.Read"]
}
    const { login, result, error } = useMsalAuthentication("popup", request, accountIdentifiers);
return (
<React.Fragment>
<p>Anyone can see this paragraph.</p>
<AuthenticatedTemplate username="example-username">
<p>Example user is signed in!</p>
</AuthenticatedTemplate>
<UnauthenticatedTemplate username="example-username">
<p>Example user is not signed in!</p>
</UnauthenticatedTemplate>
</React.Fragment>
);
}
```
| 38.805 | 582 | 0.683675 | eng_Latn | 0.954655 |
934e95189a83c53c9214d8a286b19c2e796bf503 | 624 | md | Markdown | about.md | edhzsz-unadm/edhzsz-unadm.github.io | 7732603aef39b268da12627dda0a823ff3a3ea38 | [
"MIT"
] | null | null | null | about.md | edhzsz-unadm/edhzsz-unadm.github.io | 7732603aef39b268da12627dda0a823ff3a3ea38 | [
"MIT"
] | null | null | null | about.md | edhzsz-unadm/edhzsz-unadm.github.io | 7732603aef39b268da12627dda0a823ff3a3ea38 | [
"MIT"
] | null | null | null | ---
layout: media
permalink: /perfil/
title: "Perfil"
date: 2014-01-01
excerpt:
image:
feature: about-feature.jpg
teaser:
thumb:
ads: false
---
My name is Edgar, although most of the people who know me call me Limo. I am 33 years old and have been working as a software developer for 7 years.
I am a fugitive from three universities where I studied Mathematics; I then also fled the Computer Science program at the last of them.
I am enrolled in the Mathematics program offered by UNADM, and my goal is to earn my degree and look into the possibility of pursuing a master's in a few years.
934f4d791bc114c4854385584d8d096c51fb3778 | 91 | md | Markdown | figures/Readme.md | RaymundoNeto/utils | 2d534e1b5876ffa36fb5a1abfde4a9827f23cb0f | [
"MIT"
] | null | null | null | figures/Readme.md | RaymundoNeto/utils | 2d534e1b5876ffa36fb5a1abfde4a9827f23cb0f | [
"MIT"
] | null | null | null | figures/Readme.md | RaymundoNeto/utils | 2d534e1b5876ffa36fb5a1abfde4a9827f23cb0f | [
"MIT"
] | null | null | null | This folder contains miscelaneous functions to help of fully create figures in R or MATLAB
| 45.5 | 90 | 0.835165 | eng_Latn | 0.999897 |
934f59b1ca2c390ddf62cf90bf42149c36e0de24 | 146 | md | Markdown | _haikus/gossamer.md | kangarooSurfer/cloud_haiku | 5a1457f3ae5a889def682a2957f5bac645ae0127 | [
"MIT"
] | 140 | 2018-10-26T22:51:07.000Z | 2022-03-19T09:45:20.000Z | _haikus/gossamer.md | kangarooSurfer/cloud_haiku | 5a1457f3ae5a889def682a2957f5bac645ae0127 | [
"MIT"
] | 1,322 | 2018-10-14T22:47:59.000Z | 2022-03-24T23:31:52.000Z | _haikus/gossamer.md | kangarooSurfer/cloud_haiku | 5a1457f3ae5a889def682a2957f5bac645ae0127 | [
"MIT"
] | 2,571 | 2018-10-13T17:36:03.000Z | 2022-03-31T22:10:57.000Z | ---
layout: haiku
title: Gossamer
author: Christopher L. Goetzke
---
Bits delving the deep<br>
Dancing diaphanous to<br>
Weave digital dreams<br>
| 16.222222 | 30 | 0.753425 | eng_Latn | 0.704877 |
935026e9dda627e020d3ae7c3d349b2d90971508 | 975 | md | Markdown | _posts/2019-04-12-visit-blog.md | tzanio/llnl.github.io | c18772a5a3583ef50ab41171d68eb45ab68ced5b | [
"MIT"
] | null | null | null | _posts/2019-04-12-visit-blog.md | tzanio/llnl.github.io | c18772a5a3583ef50ab41171d68eb45ab68ced5b | [
"MIT"
] | 1 | 2020-03-05T01:05:17.000Z | 2020-03-11T00:33:16.000Z | _posts/2019-04-12-visit-blog.md | neely4/llnl.github.io | 8aa78df1cfdc9f9e99c9d97ee7444d7600839e1e | [
"MIT"
] | 2 | 2019-11-27T02:00:28.000Z | 2019-11-27T02:10:02.000Z | ---
title: "CTR: An Introduction Using Recent Tech Refresh Experiences on VisIt"
---
This LLNL-authored blog post describes the practice of continuous technology refreshment, or CTR -- the periodic upgrade or replacement of infrastructure to deliver continued reliability, improved speed, capacity, and/or new features. Using the VisIt code's recent migration to GitHub as an example, the post explains the development team's process through wrangling binary content, revision control, issue tracking, documentation, and other refreshments.
Learn more:
- [VisIt organization on GitHub](https://github.com/visit-dav)
- [Primary source code repo](https://github.com/visit-dav/visit)
- [User wiki](https://www.visitusers.org/index.php?title=Main_Page)
- [Website](https://wci.llnl.gov/simulation/computer-codes/visit/)
- Article: [The Great Migration: VisIt Moves from Subversion to GitHub](https://computation.llnl.gov/newsroom/great-migration-visit-moves-subversion-github)
| 75 | 455 | 0.791795 | eng_Latn | 0.912364 |
9350414ada0935a9d6e839a7bb115997577a7ce4 | 2,452 | md | Markdown | curriculum/challenges/chinese/08-coding-interview-prep/rosetta-code/greatest-subsequential-sum.chinese.md | innocentwkc/freeCodeCamp | 091ee7905cd193f29de6066d4de8c83af3c8d6ce | [
"BSD-3-Clause"
] | 3 | 2020-11-09T13:00:11.000Z | 2021-09-23T09:42:46.000Z | curriculum/challenges/chinese/08-coding-interview-prep/rosetta-code/greatest-subsequential-sum.chinese.md | JakariaSami/freeCodeCamp | c0e2b65864b15f51b2f18f03aedf1c3beec13e2b | [
"BSD-3-Clause"
] | 46 | 2020-09-09T10:14:35.000Z | 2022-02-14T04:34:30.000Z | curriculum/challenges/chinese/08-coding-interview-prep/rosetta-code/greatest-subsequential-sum.chinese.md | JakariaSami/freeCodeCamp | c0e2b65864b15f51b2f18f03aedf1c3beec13e2b | [
"BSD-3-Clause"
] | 5 | 2019-03-08T02:24:03.000Z | 2022-03-24T17:21:07.000Z | ---
title: Greatest subsequential sum
id: 5a23c84252665b21eecc7e84
challengeType: 5
videoUrl: ''
localeTitle: Greatest subsequential sum
---
## Description
<section id="description">Given a sequence of integers, find a continuous subsequence that maximizes the sum of its elements; that is, the elements of no other single subsequence add up to a value larger than this one. The empty subsequence is considered to have a sum of \(0\); therefore, if all elements are negative, the result must be the empty sequence. </section>
## Instructions
<section id="instructions">
</section>
## Tests
<section id='tests'>
```yml
tests:
  - text: <code>maximumSubsequence</code> should be a function.
    testString: assert(typeof maximumSubsequence=='function');
  - text: '<code>maximumSubsequence("+JSON.stringify(tests[0])+")</code> should return an array.'
    testString: assert(Array.isArray(maximumSubsequence([ 1, 2,-1, 3, 10, -10 ])));
  - text: '<code>maximumSubsequence("+JSON.stringify(tests[0])+")</code> should return <code>"+JSON.stringify(results[0])+"</code>.'
    testString: assert.deepEqual(maximumSubsequence([1,2,-1,3,10,-10]), [ 1, 2, -1, 3, 10 ]);
  - text: '<code>maximumSubsequence("+JSON.stringify(tests[1])+")</code> should return <code>"+JSON.stringify(results[1])+"</code>.'
    testString: assert.deepEqual(maximumSubsequence([0, 8, 10, -2, -4, -1, -5, -3]), [ 0, 8, 10 ]);
  - text: '<code>maximumSubsequence("+JSON.stringify(tests[2])+")</code> should return <code>"+JSON.stringify(results[2])+"</code>.'
    testString: assert.deepEqual(maximumSubsequence([ 9, 9, -10, 1 ]), [ 9, 9 ]);
  - text: '<code>maximumSubsequence("+JSON.stringify(tests[3])+")</code> should return <code>"+JSON.stringify(results[3])+"</code>.'
    testString: assert.deepEqual(maximumSubsequence([ 7, 1, -5, -3, -8, 1 ]), [ 7, 1 ]);
  - text: '<code>maximumSubsequence("+JSON.stringify(tests[4])+")</code> should return <code>"+JSON.stringify(results[4])+"</code>.'
    testString: assert.deepEqual(maximumSubsequence([ -3, 6, -1, 4, -4, -6 ]), [ 6, -1, 4 ]);
  - text: '<code>maximumSubsequence("+JSON.stringify(tests[5])+")</code> should return <code>"+JSON.stringify(results[5])+"</code>.'
    testString: assert.deepEqual(maximumSubsequence([ -1, -2, 3, 5, 6, -2, -1, 4, -4, 2, -1 ]), [ 3, 5, 6, -2, -1, 4 ]);
```
</section>
## Challenge Seed
<section id='challengeSeed'>
<div id='js-seed'>
```js
function maximumSubsequence (population) {
// Good luck!
}
```
</div>
### After Test
<div id='js-teardown'>
```js
console.info('after the test');
```
</div>
</section>
## Solution
<section id='solution'>
```js
// solution required
```
</section>
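The solution section above is left as a placeholder in the source. As an illustration only (not the official freeCodeCamp solution), a minimal sketch based on Kadane's algorithm, extended to track the boundaries of the best run so that the subsequence itself rather than just its sum is returned, could look like this:

```javascript
function maximumSubsequence(population) {
  let best = 0;       // best sum seen so far; the empty subsequence sums to 0
  let bestStart = 0;  // start index of the best run
  let bestEnd = 0;    // exclusive end index of the best run
  let sum = 0;        // running sum of the current candidate run
  let start = 0;      // start index of the current candidate run
  for (let i = 0; i < population.length; i++) {
    sum += population[i];
    if (sum < 0) {
      // A run with a negative sum can never prefix a maximal run: drop it.
      sum = 0;
      start = i + 1;
    } else if (sum > best) {
      best = sum;
      bestStart = start;
      bestEnd = i + 1;
    }
  }
  // If every element is negative, bestStart === bestEnd and this returns [].
  return population.slice(bestStart, bestEnd);
}
```

Running it against the sequences used in the tests above reproduces the expected results; for example, `maximumSubsequence([1, 2, -1, 3, 10, -10])` yields `[1, 2, -1, 3, 10]`.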
| 32.693333 | 141 | 0.657423 | eng_Latn | 0.211568 |
93504cf6885de54a8dd57c96d127a81c6403d557 | 342 | md | Markdown | docs/StopMessageLiveLocationBody.md | zhulik/teleswagger | ec76786fa9ba048aff22add502d2b8ed28c004a6 | ["Apache-2.0"] | 2 | 2019-07-31T19:01:38.000Z | 2020-09-01T11:59:27.000Z | docs/StopMessageLiveLocationBody.md | zhulik/teleswagger | ec76786fa9ba048aff22add502d2b8ed28c004a6 | ["Apache-2.0"] | null | null | null | docs/StopMessageLiveLocationBody.md | zhulik/teleswagger | ec76786fa9ba048aff22add502d2b8ed28c004a6 | ["Apache-2.0"] | 4 | 2018-02-07T08:41:32.000Z | 2020-09-01T12:08:50.000Z
# Teleswagger::StopMessageLiveLocationBody
## Properties
Name | Type | Description | Notes
------------ | ------------- | ------------- | -------------
**chat_id** | **Object** | | [optional]
**message_id** | **Integer** | | [optional]
**inline_message_id** | **String** | | [optional]
**reply_markup** | **Object** | | [optional]
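For orientation, a request body built from these properties might look like the sketch below. The values are invented for the example, and in the underlying Telegram Bot API the `inline_message_id` field is typically sent *instead of* the `chat_id`/`message_id` pair, so only one addressing style is shown:

```json
{
  "chat_id": 123456789,
  "message_id": 42,
  "reply_markup": { "inline_keyboard": [] }
}
```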
| 28.5 | 60 | 0.511696 | eng_Latn | 0.088562 |
93509b64901e7070b4cc7a4207a4558beb9adbc7 | 2,701 | md | Markdown | _posts/2018-11-17-psychonomicsposter.md | dmarkant/dmarkant.github.io | 830a0c5ba661646db98c11f4d6edaebec50caa88 | [
"MIT"
] | null | null | null | _posts/2018-11-17-psychonomicsposter.md | dmarkant/dmarkant.github.io | 830a0c5ba661646db98c11f4d6edaebec50caa88 | [
"MIT"
] | null | null | null | _posts/2018-11-17-psychonomicsposter.md | dmarkant/dmarkant.github.io | 830a0c5ba661646db98c11f4d6edaebec50caa88 | [
"MIT"
] | null | null | null | ---
layout: post
title: Psychonomics 2018 - Building relational knowledge through active control
comments: True
---
In some prior work, my colleagues and I have found that active control---being able to dictate the content or pacing of information---leads to enhanced episodic memory for materials experienced during study.[^1] [^2] Let's say that I'm your instructor and I have a set of definitions on flashcards that I want you to learn. If I give you (the student) more control over the selection and pacing of flashcards, it's likely that you'll have better memory later on compared to conditions where you don't have control.
But as an instructor, I don't *just* want my students to memorize a set of independent definitions. I also want them to integrate those concepts together to form some coherent knowledge about the domain. For example, I don't just want my research methods students to be able to define different types of validity; I also want them to be able to relate them to each other and the broader goals of experimental methods. In contrast to the first goal of forming memories of independent sets of items, it is less clear to what extent having control over learning leads to enhanced integration of study experiences into conceptual knowledge.
The project described in this poster aims to understand the effects of active control on relational knowledge formation. It uses a common relational reasoning task (transitive inference) to disentangle enhancements to memory for individual items from enhanced *integrative encoding*. The results so far suggest that having control over the selection of items leads to improved integrative encoding over a passive condition lacking such control. Critically, however, the benefit of active control only appears among people with higher working memory capacity. This provides another piece of evidence that the benefits of "active learning" may not be so universal, but instead depend on students having the cognitive resources to maintain and integrate disparate study experiences.
Click on the image below to get the PDF of the poster:
[](/assets/2018_Psychonomics_poster.pdf)
[^1]: Markant, D., Ruggeri, A., Gureckis, T. M., and Xu, F. (2016). [“Enhanced memory as a common effect of active learning.”](/assets/MarkantEtAl_MBE2016.pdf) <i>Mind, Brain, and Education.</i>
[^2]: Markant, D., Dubrow, S., Davachi, L., and Gureckis, T.M. (2014). <a href="http://link.springer.com/article/10.3758/s13421-014-0435-9">"Deconstructing the effect of self-directed study on episodic memory."</a> <i>Memory & Cognition 42</i>(8), 1211—1224. doi: 10.3758/s13421-014-0435-9
| 142.157895 | 779 | 0.789337 | eng_Latn | 0.997904 |
9350d218ef5d9ecb592af09b97f3b12b7309d36f | 685 | md | Markdown | README.md | izzyx6/Google-Play-Store-Review | 12588e099665ae1e9cb6ab07d285e0625a63b349 | [
"MIT"
] | null | null | null | README.md | izzyx6/Google-Play-Store-Review | 12588e099665ae1e9cb6ab07d285e0625a63b349 | [
"MIT"
] | null | null | null | README.md | izzyx6/Google-Play-Store-Review | 12588e099665ae1e9cb6ab07d285e0625a63b349 | [
"MIT"
] | null | null | null | # Google-Play-Store-Review
The project is a simple sentiment analysis using NLP, written in Python.
It shows how to do text preprocessing (removal of bad words and stop words, lemmatization, and tokenization).
It further shows how to save a trained model and use that model in a real-life situation.
The machine learning model used here to build the classifier is logistic regression.
Several performance evaluation techniques are used, including a confusion matrix and the scikit-learn
classification report, which gives the accuracy, precision, recall, and F1-score of the model.
The target classes being predicted are positive, negative, and neutral reviews.
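The README describes the pipeline at a high level; as a minimal illustration of the preprocessing step, something along these lines could be used. The stop-word list here is an invented subset for the sketch (not the one used in the project), and lemmatization is omitted:

```python
import re

# Illustrative subset of English stop words; the real project would use a fuller list.
STOP_WORDS = {"a", "an", "and", "is", "it", "of", "the", "this", "to", "was"}

def preprocess(text):
    """Lowercase the review, keep only alphabetic tokens, and drop stop words."""
    tokens = re.findall(r"[a-z]+", text.lower())
    return [t for t in tokens if t not in STOP_WORDS]

print(preprocess("This app was GREAT, loved it!"))  # → ['app', 'great', 'loved']
```

The cleaned tokens would then be vectorized (for example with a bag-of-words or TF-IDF representation) before being fed to the logistic regression classifier.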
| 68.5 | 104 | 0.811679 | eng_Latn | 0.999398 |
935105a6180307e4baf0c414622e3ef23d1ed71c | 5,517 | md | Markdown | data/readme_files/n0fate.chainbreaker.md | DLR-SC/repository-synergy | 115e48c37e659b144b2c3b89695483fd1d6dc788 | ["MIT"] | 5 | 2021-05-09T12:51:32.000Z | 2021-11-04T11:02:54.000Z | data/readme_files/n0fate.chainbreaker.md | DLR-SC/repository-synergy | 115e48c37e659b144b2c3b89695483fd1d6dc788 | ["MIT"] | null | null | null | data/readme_files/n0fate.chainbreaker.md | DLR-SC/repository-synergy | 115e48c37e659b144b2c3b89695483fd1d6dc788 | ["MIT"] | 3 | 2021-05-12T12:14:05.000Z | 2021-10-06T05:19:54.000Z
chainbreaker
============
chainbreaker can extract user credentials from a Keychain file, using either the master key or the user password, in a forensically sound manner.
Master key candidates can be extracted with the [volafox](https://github.com/n0fate/volafox) or [volatility](https://github.com/volatilityfoundation/volatility) keychaindump module.
## Supported OS
Snow Leopard, Lion, Mountain Lion, Mavericks, Yosemite, El Capitan, (High) Sierra
## Target Keychain file
* User Keychain (~/Users/[username]/Library/Keychains/login.keychain): holds user IDs/passwords for installed applications, ssh/vpn, mail, contacts, calendar, and so on. It also holds the key for call history decryption.
* System Keychain (/Library/Keychains/System.keychain): holds WiFi passwords registered by the local machine, as well as several certificates and public/private keys. (Detailed info: http://forensic.n0fate.com/2014/09/system-keychain-analysis/)
## How to use:
If you only have the keychain file and the password, run the command as follows:
$ python chainbreaker.py
usage: chainbreaker.py [-h] -f FILE (-k KEY | -p PASSWORD)
chainbreaker.py: error: argument -f/--file is required
If you have a memory image, you can extract master key candidates using the volafox project. volafox, a memory forensics toolkit for Mac OS X, is written in Python as a cross-platform open source project. Of course, you can also dump them using volatility.
$ python volafox.py -i [memory image] -o keychaindump
....
....
$ python chainbreaker.py -f [keychain file] -k [master key]
## Example
$ python vol.py -i ~/Desktop/show/macosxml.mem -o keychaindump
[+] Find MALLOC_TINY heap range (guess)
[-] range 0x7fef03400000-0x7fef03500000
[-] range 0x7fef03500000-0x7fef03600000
[-] range 0x7fef03600000-0x7fef03700000
[-] range 0x7fef04800000-0x7fef04900000
[-] range 0x7fef04900000-0x7fef04a00000
[*] Search for keys in range 0x7fef03400000-0x7fef03500000 complete. master key candidates : 0
[*] Search for keys in range 0x7fef03500000-0x7fef03600000 complete. master key candidates : 0
[*] Search for keys in range 0x7fef03600000-0x7fef03700000 complete. master key candidates : 0
[*] Search for keys in range 0x7fef04800000-0x7fef04900000 complete. master key candidates : 0
[*] Search for keys in range 0x7fef04900000-0x7fef04a00000 complete. master key candidates : 6
[*] master key candidate: 78006A6CC504140E077D62D39F30DBBAFC5BDF5995039974
[*] master key candidate: 26C80BE3346E720DAA10620F2C9C8AD726CFCE2B818942F9
[*] master key candidate: 2DD97A4ED361F492C01FFF84962307D7B82343B94595726E
[*] master key candidate: 21BB87A2EB24FD663A0AC95E16BEEBF7728036994C0EEC19
[*] master key candidate: 05556393141766259F62053793F62098D21176BAAA540927
[*] master key candidate: 903C49F0FE0700C0133749F0FE0700404158544D00000000
$ python chainbreaker.py -h
usage: chainbreaker.py [-h] -f FILE (-k KEY | -p PASSWORD)
Tool for OS X Keychain Analysis by @n0fate
optional arguments:
-h, --help show this help message and exit
-f FILE, --file FILE Keychain file(*.keychain)
-k KEY, --key KEY Masterkey candidate
-p PASSWORD, --password PASSWORD
User Password
$ python chainbreaker.py -f ~/Desktop/show/login.keychain -k 26C80BE3346E720DAA10620F2C9C8AD726CFCE2B818942F9
[-] DB Key
00000000: 05 55 63 93 14 17 66 25 9F 62 05 37 93 F6 20 98 .Uc...f%.b.7.. .
00000010: D2 11 76 BA AA 54 09 27 ..v..T.'
[+] Symmetric Key Table: 0x00006488
[+] Generic Password: 0x0000dea4
[+] Generic Password Record
[-] RecordSize : 0x000000fc
[-] Record Number : 0x00000000
[-] SECURE_STORAGE_GROUP(SSGP) Area : 0x0000004c
[-] Create DateTime: 20130318062355Z
[-] Last Modified DateTime: 20130318062355Z
[-] Description :
[-] Creator :
[-] Type :
[-] PrintName : ***********@gmail.com
[-] Alias :
[-] Account : 1688945386
[-] Service : iCloud
[-] Password
00000000: ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ** ****************
00000010: 7A ** 69 ** 50 ** 51 36 ** ** ** 48 32 61 31 66 ****************
00000020: ** 49 ** 73 ** 62 ** 79 79 41 6F 3D **********=
<snip>
[+] Internet Record
[-] RecordSize : 0x0000014c
[-] Record Number : 0x00000005
[-] SECURE_STORAGE_GROUP(SSGP) Area : 0x0000002c
[-] Create DateTime: 20130318065146Z
[-] Last Modified DateTime: 20130318065146Z
[-] Description : Web form password
[-] Comment : default
[-] Creator :
[-] Type :
[-] PrintName : www.facebook.com (***********@gmail.com)
[-] Alias :
[-] Protected :
[-] Account : ***********@gmail.com
[-] SecurityDomain :
[-] Server : www.facebook.com
[-] Protocol Type : kSecProtocolTypeHTTPS
[-] Auth Type : kSecAuthenticationTypeHTMLForm
[-] Port : 0
[-] Path :
[-] Password
00000000: ** ** ** ** ** ** ** ** ** ** ** ** ************
If you only have a memory image, you can dump a keychain file from it and decrypt the keychain contents as described at this [link](https://gist.github.com/n0fate/790428d408d54b910956).
## Contacts
chainbreaker was written by [n0fate](http://twitter.com/n0fate).
The e-mail address can be found in the source code.
## License
[GNU GPL v2](http://www.gnu.org/licenses/old-licenses/gpl-2.0.html)
| 45.221311 | 248 | 0.65706 | eng_Latn | 0.459883 |
93518578b19d4e963d4337c03ade877617a8dec8 | 1,135 | md | Markdown | api/Excel.TickLabels.Offset.md | qiezhenxi/VBA-Docs | c49aebcccbd73eadf5d1bddc0a4dfb622e66db5d | ["CC-BY-4.0", "MIT"] | 1 | 2018-10-15T16:15:38.000Z | 2018-10-15T16:15:38.000Z | api/Excel.TickLabels.Offset.md | qiezhenxi/VBA-Docs | c49aebcccbd73eadf5d1bddc0a4dfb622e66db5d | ["CC-BY-4.0", "MIT"] | null | null | null | api/Excel.TickLabels.Offset.md | qiezhenxi/VBA-Docs | c49aebcccbd73eadf5d1bddc0a4dfb622e66db5d | ["CC-BY-4.0", "MIT"] | null | null | null
---
title: TickLabels.Offset Property (Excel)
keywords: vbaxl10.chm616085
f1_keywords:
- vbaxl10.chm616085
ms.prod: excel
api_name:
- Excel.TickLabels.Offset
ms.assetid: a353b803-34a3-0ff9-83d2-3318c308ec35
ms.date: 06/08/2017
---
# TickLabels.Offset Property (Excel)
Returns or sets a **Long** value that represents the distance between the levels of labels, and the distance between the first level and the axis line.
## Syntax
_expression_. `Offset`
_expression_ A variable that represents a [TickLabels](./Excel.TickLabels(Graph property).md) object.
## Remarks
The default distance is 100 percent, which represents the default spacing between the axis labels and the axis line. The value can be an integer percentage from 0 through 1000, relative to the axis label's font size.
## Example
This example sets the label spacing of the category axis in Chart1 to twice the current setting, if the offset is less than 500.
```vb
With Charts("Chart1").Axes(xlCategory).TickLabels
If .Offset < 500 then
.Offset = .Offset * 2
End If
End With
```
## See also
[TickLabels Object](Excel.TickLabels(object).md)
| 22.7 | 217 | 0.753304 | eng_Latn | 0.971233 |
93519c90eae134bbb87dba02a180df50af6da68f | 3,853 | md | Markdown | docs/ado/reference/ado-api/description-helpcontext-helpfile-nativeerror-number-source-example-vb.md | dirceuresende/sql-docs.pt-br | 023b1c4ae887bc1ed6a45cb3134f33a800e5e01e | ["CC-BY-4.0", "MIT"] | 2 | 2021-10-12T00:50:30.000Z | 2021-10-12T00:53:51.000Z | docs/ado/reference/ado-api/description-helpcontext-helpfile-nativeerror-number-source-example-vb.md | dirceuresende/sql-docs.pt-br | 023b1c4ae887bc1ed6a45cb3134f33a800e5e01e | ["CC-BY-4.0", "MIT"] | null | null | null | docs/ado/reference/ado-api/description-helpcontext-helpfile-nativeerror-number-source-example-vb.md | dirceuresende/sql-docs.pt-br | 023b1c4ae887bc1ed6a45cb3134f33a800e5e01e | ["CC-BY-4.0", "MIT"] | 1 | 2020-06-25T13:33:56.000Z | 2020-06-25T13:33:56.000Z
---
title: Error Object Properties Example (VB) | Microsoft Docs
ms.prod: sql
ms.prod_service: connectivity
ms.technology: connectivity
ms.custom: ''
ms.date: 01/19/2017
ms.reviewer: ''
ms.suite: sql
ms.tgt_pltfrm: ''
ms.topic: conceptual
dev_langs:
- VB
helpviewer_keywords:
- Number property [ADO], Visual Basic example
- Source property [ADO], Visual Basic example
- NativeError property [ADO], Visual Basic example
- Description property [ADO], Visual Basic example
- HelpFile property [ADO], Visual Basic example
- SQLState property [ADO], Visual Basic example
- HelpContext property [ADO], Visual Basic example
ms.assetid: 5c728458-d85c-497c-afcf-2cfa36c3342a
caps.latest.revision: 10
author: MightyPen
ms.author: genemi
manager: craigg
ms.openlocfilehash: 201877575785e620f44584d16430c652b316738c
ms.sourcegitcommit: 62826c291db93c9017ae219f75c3cfeb8140bf06
ms.translationtype: MT
ms.contentlocale: pt-BR
ms.lasthandoff: 06/11/2018
ms.locfileid: "35277875"
---
# <a name="description-helpcontext-helpfile-nativeerror-number-source-and-sqlstate-properties-example-vb"></a>Description, HelpContext, HelpFile, NativeError, Number, Source, and SQLState Properties Example (VB)
This example triggers an error, traps it, and displays the [Description](../../../../ado/reference/ado-api/description-property.md), [HelpContext](../../../../ado/reference/ado-api/helpcontext-helpfile-properties.md), [HelpFile](../../../../ado/reference/ado-api/helpcontext-helpfile-properties.md), [NativeError](../../../../ado/reference/ado-api/nativeerror-property-ado.md), [Number](../../../../ado/reference/ado-api/number-property-ado.md), [Source](../../../../ado/reference/ado-api/source-property-ado-error.md), and [SQLState](../../../../ado/reference/ado-api/sqlstate-property.md) properties of the resulting [Error](../../../../ado/reference/ado-api/error-object.md) object.
```
'BeginDescriptionVB
Public Sub Main()
Dim Cnxn As ADODB.Connection
Dim Err As ADODB.Error
Dim strError As String
On Error GoTo ErrorHandler
' Intentionally trigger an error
Set Cnxn = New ADODB.Connection
Cnxn.Open "nothing"
Set Cnxn = Nothing
Exit Sub
ErrorHandler:
' Enumerate Errors collection and display
' properties of each Error object
For Each Err In Cnxn.Errors
strError = "Error #" & Err.Number & vbCr & _
" " & Err.Description & vbCr & _
" (Source: " & Err.Source & ")" & vbCr & _
" (SQL State: " & Err.SQLState & ")" & vbCr & _
" (NativeError: " & Err.NativeError & ")" & vbCr
If Err.HelpFile = "" Then
strError = strError & " No Help file available"
Else
strError = strError & _
" (HelpFile: " & Err.HelpFile & ")" & vbCr & _
" (HelpContext: " & Err.HelpContext & ")" & _
vbCr & vbCr
End If
Debug.Print strError
Next
Resume Next
End Sub
'EndDescriptionVB
```
## <a name="see-also"></a>See also
 [Description Property](../../../../docs/ado/reference/ado-api/description-property.md)
 [Error Object](../../../../docs/ado/reference/ado-api/error-object.md)
 [HelpContext and HelpFile Properties](../../../../docs/ado/reference/ado-api/helpcontext-helpfile-properties.md)
 [HelpContext and HelpFile Properties](../../../../docs/ado/reference/ado-api/helpcontext-helpfile-properties.md)
 [NativeError Property (ADO)](../../../../docs/ado/reference/ado-api/nativeerror-property-ado.md)
 [Number Property (ADO)](../../../../docs/ado/reference/ado-api/number-property-ado.md)
 [Source Property (ADO Error)](../../../../docs/ado/reference/ado-api/source-property-ado-error.md)
 [SQLState Property](../../../../docs/ado/reference/ado-api/sqlstate-property.md)
| 42.811111 | 649 | 0.66312 | kor_Hang | 0.207228 |
9352edc48b40e13afdc5180a8b746ffaf3518225 | 44 | md | Markdown | README.md | Appycola/quora-have-to-visit | 135b3fd2e99c33ef979215f306b662cd13433d0e | ["MIT"] | null | null | null | README.md | Appycola/quora-have-to-visit | 135b3fd2e99c33ef979215f306b662cd13433d0e | ["MIT"] | null | null | null | README.md | Appycola/quora-have-to-visit | 135b3fd2e99c33ef979215f306b662cd13433d0e | ["MIT"] | null | null | null
# quora-have-to-visit
Quora quora get it up
| 14.666667 | 21 | 0.75 | azj_Latn | 0.3151 |
935305b57d19f0b478ed7d1f6a34f7cd2484458b | 1,392 | md | Markdown | 2020/12/07/2020-12-07 16:15.md | zhzhzhy/WeiBoHot_history | 32ce4800e63f26384abb17d43e308452c537c902 | ["MIT"] | 3 | 2020-07-14T14:54:15.000Z | 2020-08-21T06:48:24.000Z | 2020/12/07/2020-12-07 16:15.md | zhzhzhy/WeiBoHot_history | 32ce4800e63f26384abb17d43e308452c537c902 | ["MIT"] | null | null | null | 2020/12/07/2020-12-07 16:15.md | zhzhzhy/WeiBoHot_history | 32ce4800e63f26384abb17d43e308452c537c902 | ["MIT"] | null | null | null
2020年12月07日16时数据
Status: 200
1.韩国总统文在寅向国民致歉
微博热度:2967028
2.杨坤直播时满脸通红
微博热度:2185247
3.潘成然秒删
微博热度:2171962
4.杨幂爸爸觉得她没有小时候好看
微博热度:1923665
5.上海6例本土确诊溯源结果
微博热度:1145614
6.女婴坠楼受伤父亲拒绝治疗
微博热度:1123219
7.网友初次约会点2万多火锅
微博热度:1038777
8.素媛案罪犯一小时可做1000个俯卧撑
微博热度:953900
9.成都急寻两类人员核酸检测
微博热度:844139
10.焉栩嘉道歉模版
微博热度:839906
11.爱奇艺把李诞奖杯收回去了
微博热度:809440
12.刘亦菲优越的颅顶
微博热度:697224
13.四川成都新增1例确诊病例
微博热度:591899
14.因为染发影响比赛结果合理吗
微博热度:544577
15.丁飞俊站姐
微博热度:526961
16.赵梦
微博热度:525531
17.辛巴团队回应燕窝赔付
微博热度:507352
18.主办方回应大学女足因染发被判负
微博热度:434205
19.RM
微博热度:404491
20.郑爽瑞丽开年封
微博热度:402256
21.杨祐宁得女
微博热度:394943
22.无鞋不谈
微博热度:382267
23.高校男生排队体验女生生理痛
微博热度:328421
24.张国伟水龙弹
微博热度:327570
25.赵丽颖暖阳野餐大片
微博热度:322438
26.成都郫都区
微博热度:312823
27.云南新版艾滋病防治条例
微博热度:307823
28.简历倒卖单份最低仅1元
微博热度:305429
29.默读影视化
微博热度:304632
30.有翡新剧照
微博热度:280177
31.没想到袋鼠这么有男人味
微博热度:274040
32.奇葩说第七季导师阵容
微博热度:272647
33.王毅称经贸合作始终是中美关系的压舱石
微博热度:272141
34.没事千万不要踢柿子树
微博热度:271670
35.珍珠青龙同框
微博热度:259730
36.共享充电宝会翻车吗
微博热度:241367
37.东北妈妈的通用妆容
微博热度:230659
38.焉栩嘉为97庆生音频
微博热度:216192
39.家长送老师干啥啥不行锦旗系自导自演
微博热度:214402
40.猪肉价格反弹
微博热度:212210
41.默读组讯
微博热度:208849
42.创意冰雪奇缘蛋糕
微博热度:208726
43.追星跟养宠物有点像
微博热度:208696
44.鲜虾奶酪烤吐司角
微博热度:202411
45.万茜喊话贾玲称和刘德华有感情戏
微博热度:201927
46.教育部曝光8起师德违规问题
微博热度:201872
47.原神
微博热度:187438
48.藏族仿妆可以有多绝
微博热度:183264
49.大雪
微博热度:182895
50.湖南一混凝土搅拌车与救护车相撞
微博热度:162068
| 6.823529 | 21 | 0.780891 | yue_Hant | 0.204145 |
9354ec400a7ada910756aeb5a8722ff63c7b9cdc | 52 | md | Markdown | test/specs/new/nested_square_link.md | BastianHofmann/marked | 6ae3651627c46bb03c54e2d71a5d73f4750ee47a | ["BSD-3-Clause"] | 13,341 | 2018-02-15T05:47:33.000Z | 2022-03-31T12:25:19.000Z | test/specs/new/nested_square_link.md | WebGameLinux/marked | 7b3036f8c0440cfd003ce47dd6e1a92af0f5e822 | ["BSD-3-Clause"] | 1,420 | 2018-02-15T04:48:12.000Z | 2022-03-31T02:38:09.000Z | test/specs/new/nested_square_link.md | WebGameLinux/marked | 7b3036f8c0440cfd003ce47dd6e1a92af0f5e822 | ["BSD-3-Clause"] | 1,789 | 2018-02-15T05:42:56.000Z | 2022-03-31T19:57:45.000Z
[the `]` character](/url)
[the \` character](/url)
| 13 | 25 | 0.576923 | eng_Latn | 0.797469 |
9355554f6f6cabda1117bf9150af9b2ff4d74506 | 699 | md | Markdown | README.md | jarenal/tiny-comments | fe9b04017b1750c52884890a8fb5403611ab5ddd | ["MIT"] | null | null | null | README.md | jarenal/tiny-comments | fe9b04017b1750c52884890a8fb5403611ab5ddd | ["MIT"] | null | null | null | README.md | jarenal/tiny-comments | fe9b04017b1750c52884890a8fb5403611ab5ddd | ["MIT"] | null | null | null
# Welcome to Tiny Comments!
This is a demonstration project that tries to create a tiny comments system without the help of any external library or framework.
The project requires PHP >= 5.5.9 and uses FirePHP for debugging purposes.
## INSTALL
To install the project correctly on your server, please follow these instructions:
1. Extract the project to any folder on your web server.
2. Create a database with any name you prefer.
3. Import the SQL script located in the sql folder into your new database.
4. Change the database settings in the config.php file located in the config folder.
5. Run the project by calling the index.php file from the web server.
6. Have fun testing the project!
| 41.117647 | 124 | 0.782546 | eng_Latn | 0.999248 |
935564a8bafdfa255a24b69f313922895c45ef3b | 1,584 | md | Markdown | README.md | X0nic/charlotte | 07d892a5a5407f32a1da53a7e683c33749dc76b4 | ["MIT"] | null | null | null | README.md | X0nic/charlotte | 07d892a5a5407f32a1da53a7e683c33749dc76b4 | ["MIT"] | 1 | 2015-11-18T05:59:24.000Z | 2015-11-18T05:59:24.000Z | README.md | X0nic/charlotte | 07d892a5a5407f32a1da53a7e683c33749dc76b4 | ["MIT"] | null | null | null
# Charlotte
A web crawler that will show you a map of all static assets
## Installation
Add this line to your application's Gemfile:
```ruby
gem 'charlotte'
```
And then execute:
$ bundle
Or install it yourself as:
$ gem install charlotte
## Usage
# Fetch and display all links and assets for a site
$ charlotte fetch duckduckgo.com
# Fetch and display all links and assets for a site, but only descend two levels
$ charlotte fetch duckduckgo.com -l 2
# Fetch site, and display links between all pages
$ charlotte visualize_pages duckduckgo.com
# Fetch site, and display links between pages and assets
$ charlotte visualize_assets duckduckgo.com
## Development
After checking out the repo, run `bin/setup` to install dependencies. Then, run `rake spec` to run the tests. You can also run `bin/console` for an interactive prompt that will allow you to experiment. Run `bundle exec charlotte` to use the gem in this directory, ignoring other installed copies of this gem.
To install this gem onto your local machine, run `bundle exec rake install`. To release a new version, update the version number in `version.rb`, and then run `bundle exec rake release`, which will create a git tag for the version, push git commits and tags, and push the `.gem` file to [rubygems.org](https://rubygems.org).
## Contributing
Bug reports and pull requests are welcome on GitHub at https://github.com/X0nic/charlotte.
## License
The gem is available as open source under the terms of the [MIT License](http://opensource.org/licenses/MIT).
| 31.68 | 324 | 0.743056 | eng_Latn | 0.991936 |
9355c5e24cf50cfe1565262b4e0e800489756d23 | 2,337 | md | Markdown | docs/framework/unmanaged-api/debugging/icordebugvariablesymbol-interface.md | soelax/docs.de-de | 023b1c4ae887bc1ed6a45cb3134f33a800e5e01e | ["CC-BY-4.0", "MIT"] | null | null | null | docs/framework/unmanaged-api/debugging/icordebugvariablesymbol-interface.md | soelax/docs.de-de | 023b1c4ae887bc1ed6a45cb3134f33a800e5e01e | ["CC-BY-4.0", "MIT"] | null | null | null | docs/framework/unmanaged-api/debugging/icordebugvariablesymbol-interface.md | soelax/docs.de-de | 023b1c4ae887bc1ed6a45cb3134f33a800e5e01e | ["CC-BY-4.0", "MIT"] | null | null | null
---
title: ICorDebugVariableSymbol Interface
ms.date: 03/30/2017
ms.assetid: 0e58b85e-69bd-41ff-bedb-8cdc8be6a7a2
author: rpetrusha
ms.author: ronpet
ms.openlocfilehash: 9822a3403c6ff738f7a6556a35cb383426575cb9
ms.sourcegitcommit: 6b308cf6d627d78ee36dbbae8972a310ac7fd6c8
ms.translationtype: MT
ms.contentlocale: de-DE
ms.lasthandoff: 01/23/2019
ms.locfileid: "54582590"
---
# <a name="icordebugvariablesymbol-interface"></a>ICorDebugVariableSymbol Interface
Retrieves the debug symbol information for a variable.
## <a name="methods"></a>Methods
|Method|Description|
|------------|-----------------|
|[GetName Method](../../../../docs/framework/unmanaged-api/debugging/icordebugvariablesymbol-getname-method.md)|Gets the name of a variable.|
|[GetSize Method](../../../../docs/framework/unmanaged-api/debugging/icordebugvariablesymbol-getsize-method.md)|Gets the size of a variable in bytes.|
|[GetSlotIndex Method](../../../../docs/framework/unmanaged-api/debugging/icordebugvariablesymbol-getslotindex-method.md)|Gets the managed slot index of a local variable.|
|[GetValue Method](../../../../docs/framework/unmanaged-api/debugging/icordebugvariablesymbol-getvalue-method.md)|Gets the value of a variable as a byte array.|
|[SetValue Method](../../../../docs/framework/unmanaged-api/debugging/icordebugvariablesymbol-setvalue-method.md)|Assigns the value of a byte array to a variable.|
## <a name="remarks"></a>Remarks
> [!NOTE]
> This interface is available with .NET Native only. If you implement this interface for ICorDebug scenarios outside of .NET Native, the common language runtime ignores this interface.
## <a name="requirements"></a>Requirements
 **Platforms:** For more information, see [System Requirements](../../../../docs/framework/get-started/system-requirements.md).
 **Header:** CorDebug.idl, CorDebug.h
 **Library:** CorGuids.lib
 **.NET Framework Versions:** [!INCLUDE[net_46_native](../../../../includes/net-46-native-md.md)]
## <a name="see-also"></a>See also
- [Debugging Interfaces](../../../../docs/framework/unmanaged-api/debugging/debugging-interfaces.md)
- [Debugging](../../../../docs/framework/unmanaged-api/debugging/index.md)
| 53.113636 | 227 | 0.741121 | deu_Latn | 0.559704 |
9355eaa5300887ee613a162589bda08c7d480453 | 1,310 | md | Markdown | _posts/people-love/23/2021-04-07-justzik.md | chito365/p | d43434482da24b09c9f21d2f6358600981023806 | ["MIT"] | null | null | null | _posts/people-love/23/2021-04-07-justzik.md | chito365/p | d43434482da24b09c9f21d2f6358600981023806 | ["MIT"] | null | null | null | _posts/people-love/23/2021-04-07-justzik.md | chito365/p | d43434482da24b09c9f21d2f6358600981023806 | ["MIT"] | null | null | null
---
id: 13924
title: JustZik
date: 2021-04-07T14:02:18+00:00
author: victor
layout: post
guid: https://ukdataservers.com/justzik/
permalink: /04/07/justzik
tags:
- show love
- unspecified
- single
- relationship
- engaged
- married
- complicated
- open relationship
- widowed
- separated
- divorced
- Husband
- Wife
- Boyfriend
- Girlfriend
category: Guides
---
* some text
{: toc}
## Who is JustZik
Notable for his reaction videos, he is a YouTube sensation who has earned more than 3.9 million subscribers by reacting to music videos, with an emphasis on hip hop videos.
## Prior to Popularity
He attended college at the University of Tulsa. His channel was created in September of 2013, but his earliest archived videos are from 2016.
## Random data
His JustZik Twitter account has more than 210,000 followers.
## Family & Everyday Life of JustZik
He grew up in Dallas, Texas.
## People Related With JustZik
He reacts alongside his friend B.Lou for his ZIAS! channel.
| 16.582278 | 173 | 0.564122 | eng_Latn | 0.998355 |
935652608da3f05c7929b713c7fac29030c384ca | 11,513 | md | Markdown | docs/t-sql/language-elements/transactions-sql-data-warehouse.md | conky77/sql-docs.it-it | 43be0950d07b85a6566f565206630912ec8c3197 | ["CC-BY-4.0", "MIT"] | null | null | null | docs/t-sql/language-elements/transactions-sql-data-warehouse.md | conky77/sql-docs.it-it | 43be0950d07b85a6566f565206630912ec8c3197 | ["CC-BY-4.0", "MIT"] | null | null | null | docs/t-sql/language-elements/transactions-sql-data-warehouse.md | conky77/sql-docs.it-it | 43be0950d07b85a6566f565206630912ec8c3197 | ["CC-BY-4.0", "MIT"] | null | null | null
---
title: Transactions (SQL Data Warehouse) | Microsoft Docs
ms.custom: ''
ms.date: 03/14/2017
ms.prod: sql
ms.reviewer: ''
ms.technology: t-sql
ms.topic: language-reference
dev_langs:
- TSQL
ms.assetid: 87e5e593-a121-4428-9d3c-3af876224e35
author: ronortloff
ms.author: rortloff
monikerRange: '>= aps-pdw-2016 || = azure-sqldw-latest || = sqlallproducts-allversions'
ms.openlocfilehash: 21e6d25305bd6abf4a3dc4555f2148a2fe385187
ms.sourcegitcommit: b2e81cb349eecacee91cd3766410ffb3677ad7e2
ms.translationtype: HT
ms.contentlocale: it-IT
ms.lasthandoff: 02/01/2020
ms.locfileid: "68121591"
---
# <a name="transactions-sql-data-warehouse"></a>Transactions (SQL Data Warehouse)
[!INCLUDE[tsql-appliesto-xxxxxx-xxxx-asdw-pdw-md](../../includes/tsql-appliesto-xxxxxx-xxxx-asdw-pdw-md.md)]
A transaction is a group of one or more database statements that is either wholly committed or wholly rolled back. Each transaction is atomic, consistent, isolated, and durable (the so-called ACID properties). If the transaction succeeds, all of the statements within it are committed. If the transaction fails, that is, if at least one of the statements in the group fails, the entire group is rolled back.
The beginning and end of transactions depend on the AUTOCOMMIT setting and the BEGIN TRANSACTION, COMMIT, and ROLLBACK statements. [!INCLUDE[ssSDW](../../includes/sssdw-md.md)] supports the following types of transactions:
- *Explicit transactions* start with the BEGIN TRANSACTION statement and end with the COMMIT or ROLLBACK statement.
- *Autocommit transactions* start automatically within a session and do not begin with a BEGIN TRANSACTION statement. If the AUTOCOMMIT setting is ON, each statement runs in a transaction, and no explicit COMMIT or ROLLBACK statement is necessary. If the AUTOCOMMIT setting is OFF, a COMMIT or ROLLBACK statement is necessary to determine the outcome of the transaction. In [!INCLUDE[ssSDW](../../includes/sssdw-md.md)], autocommit transactions begin immediately after a COMMIT or ROLLBACK statement, or after a SET AUTOCOMMIT OFF statement.
 [Transact-SQL Syntax Conventions (Transact-SQL)](../../t-sql/language-elements/transact-sql-syntax-conventions-transact-sql.md)
## <a name="syntax"></a>Syntax
```
BEGIN TRANSACTION [;]
COMMIT [ TRAN | TRANSACTION | WORK ] [;]
ROLLBACK [ TRAN | TRANSACTION | WORK ] [;]
SET AUTOCOMMIT { ON | OFF } [;]
SET IMPLICIT_TRANSACTIONS { ON | OFF } [;]
```
## <a name="arguments"></a>Argomenti
BEGIN TRANSACTION
Contrassegna il punto di inizio di una transazione esplicita.
COMMIT [ WORK ]
Contrassegna la fine di una transazione esplicita o con commit automatico. Con questa istruzione viene eseguito il commit permanente nel database delle modifiche apportate alla transazione. L'istruzione COMMIT è identica alle istruzioni COMMIT WORK, COMMIT TRAN e COMMIT TRANSACTION.
ROLLBACK [ WORK ]
Esegue il rollback di una transazione fino all'inizio della transazione stessa. Nel database non viene eseguito il commit di alcuna modifica alla transazione. L'istruzione ROLLBACK è identica alle istruzioni ROLLBACK WORK, ROLLBACK TRAN e ROLLBACK TRANSACTION.
SET AUTOCOMMIT { **ON** | OFF }
Determina la modalità di inizio e di fine delle transazioni.
ATTIVA
Ogni istruzione viene eseguita in una transazione specifica. Non è necessaria un'istruzione COMMIT o ROLLBACK esplicita. Se AUTOCOMMIT corrisponde a ON, le transazioni esplicite sono consentite.
OFF
[!INCLUDE[ssSDW](../../includes/sssdw-md.md)] avvia automaticamente una transazione, se non ne è in corso alcuna. Tutte le istruzioni successive vengono eseguite nell'ambito della transazione ed è necessaria un'istruzione COMMIT o ROLLBACK per determinare il risultato della transazione. Non appena viene eseguito il commit o il rollback di una transazione in questa modalità operativa, la modalità rimane impostata su OFF e [!INCLUDE[ssSDW](../../includes/sssdw-md.md)] avvia una nuova transazione. Se AUTOCOMMIT corrisponde a OFF, le transazioni esplicite non sono consentite.
Se si modifica l'impostazione AUTOCOMMIT all'interno di una transazione attiva, l'impostazione influisce sulla transazione corrente e ha effetto solo dopo il completamento della transazione.
Se AUTOCOMMIT corrisponde a ON, l'esecuzione di un'altra istruzione SET AUTOCOMMIT ON non ha alcun effetto. Analogamente, se AUTOCOMMIT corrisponde a OFF, l'esecuzione di un'altra istruzione SET AUTOCOMMIT OFF non ha alcun effetto.
SET IMPLICIT_TRANSACTIONS { ON | **OFF** }
Attiva e disattiva le stesse modalità di SET AUTOCOMMIT. Quando è impostata su ON, l'opzione SET IMPLICIT_TRANSACTIONS imposta per la connessione la modalità di transazione implicita. Quando è impostata su OFF, ripristina la modalità di transazione con commit automatico. Per altre informazioni, vedere [SET IMPLICIT_TRANSACTIONS (Transact-SQL)](../../t-sql/statements/set-implicit-transactions-transact-sql.md).
## <a name="permissions"></a>Autorizzazioni
Per eseguire le istruzioni correlate alle transazioni non sono necessarie autorizzazioni specifiche. Le autorizzazioni sono necessarie per eseguire le istruzioni all'interno della transazione.
## <a name="error-handling"></a>Gestione degli errori
Se si eseguono istruzioni COMMIT o ROLLBACK e non è presente alcuna transazione attiva, viene generato un errore.
Se si esegue un'istruzione BEGIN TRANSACTION mentre è già in corso una transazione, viene generato un errore. Ciò può verificarsi se un'istruzione BEGIN TRANSACTION si verifica dopo un'istruzione BEGIN TRANSACTION con esito positivo o se la sessione è in modalità SET AUTOCOMMIT OFF.
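As an illustrative sketch (this sequence is not taken from the original article), the following batch triggers that error by issuing a second BEGIN TRANSACTION while the first one is still active:

```sql
BEGIN TRANSACTION;   -- starts an explicit transaction
BEGIN TRANSACTION;   -- raises an error: a transaction is already in progress
ROLLBACK;            -- ends the transaction that is still active
```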
If an explicit transaction fails because of an error unrelated to the execution of a statement, [!INCLUDE[ssSDW](../../includes/sssdw-md.md)] automatically rolls the transaction back and frees all resources held by the transaction. For example, if the network connection between the client and an instance of [!INCLUDE[ssSDW](../../includes/sssdw-md.md)] is broken, or the client disconnects from the application, then, when the network notifies the instance of the break, any outstanding transactions of that connection that have not yet been committed are rolled back.
If an error occurs while a statement in a batch is executing, [!INCLUDE[ssSDW](../../includes/sssdw-md.md)] behaves as if the [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] setting **XACT_ABORT** were set to **ON**, and the entire transaction is rolled back. For more information on the **XACT_ABORT** setting, see [SET XACT_ABORT (Transact-SQL)](https://msdn.microsoft.com/library/ms188792.aspx).
## <a name="general-remarks"></a>General remarks
At any given time, a session can run only one transaction. Save points and nested transactions are not supported.
It is the responsibility of the [!INCLUDE[DWsql](../../includes/dwsql-md.md)] programmer to issue COMMIT only at a point when all the data referenced by the transaction is logically correct.
If a session is terminated before a transaction completes, the transaction is rolled back.
Transaction modes are managed at the session level. For example, if one session begins an explicit transaction, or sets AUTOCOMMIT to OFF, or sets IMPLICIT_TRANSACTIONS to ON, it has no effect on the transaction modes of any other session.
## <a name="limitations-and-restrictions"></a>Limitations and restrictions
A transaction cannot be rolled back after a COMMIT statement has been executed, because the data modifications have become a permanent part of the database.
The [CREATE DATABASE (Azure SQL Data Warehouse)](../../t-sql/statements/create-database-azure-sql-data-warehouse.md) and [DROP DATABASE (Transact-SQL)](../../t-sql/statements/drop-database-transact-sql.md) commands cannot be used inside an explicit transaction.
[!INCLUDE[ssSDW](../../includes/sssdw-md.md)] does not have a transaction-sharing mechanism. This implies that, at any point in time, only one session can perform work on any transaction in the system.
## <a name="locking-behavior"></a>Locking behavior
To guarantee transactional integrity and keep databases consistent when multiple users access data at the same time, [!INCLUDE[ssSDW](../../includes/sssdw-md.md)] uses locking. Locking is used by both implicit and explicit transactions. Each transaction requests locks of different types on the resources, such as tables or databases, on which it depends. All [!INCLUDE[ssSDW](../../includes/sssdw-md.md)] locks are at the table level or higher. The locks block other transactions from modifying the resources in a way that would cause problems for the transaction requesting the lock. Each transaction releases its locks when it no longer depends on the locked resources. Explicit transactions keep their locks until the transaction completes, either by committing or rolling back.
## <a name="examples-includesssdwfullincludessssdwfull-mdmd-and-includesspdwincludessspdw-mdmd"></a>Esempi: [!INCLUDE[ssSDWfull](../../includes/sssdwfull-md.md)] e [!INCLUDE[ssPDW](../../includes/sspdw-md.md)]
### <a name="a-using-an-explicit-transaction"></a>R. Uso di una transazione esplicita
```
BEGIN TRANSACTION;
DELETE FROM HumanResources.JobCandidate
WHERE JobCandidateID = 13;
COMMIT;
```
### <a name="b-rolling-back-a-transaction"></a>B. Rollback di una transazione
Nell'esempio seguente viene illustrato l'effetto del rollback di una transazione. In questo esempio l'istruzione ROLLBACK esegue il rollback dell'istruzione INSERT, ma la tabella creata sarà ancora presente.
```
CREATE TABLE ValueTable (id int);
BEGIN TRANSACTION;
INSERT INTO ValueTable VALUES(1);
INSERT INTO ValueTable VALUES(2);
ROLLBACK;
```
### <a name="c-setting-autocommit"></a>C. Impostazione di AUTOCOMMIT
L'esempio seguente imposta AUTOCOMMIT su `ON`.
```
SET AUTOCOMMIT ON;
```
L'esempio seguente imposta AUTOCOMMIT su `OFF`.
```
SET AUTOCOMMIT OFF;
```
### <a name="d-using-an-implicit-multi-statement-transaction"></a>D. Uso di una transazione implicita con istruzioni multiple
```
SET AUTOCOMMIT OFF;
CREATE TABLE ValueTable (id int);
INSERT INTO ValueTable VALUES(1);
INSERT INTO ValueTable VALUES(2);
COMMIT;
```
## <a name="see-also"></a>Vedere anche
[SET IMPLICIT_TRANSACTIONS (Transact-SQL)](../../t-sql/statements/set-implicit-transactions-transact-sql.md)
[SET TRANSACTION ISOLATION LEVEL (Transact-SQL)](../../t-sql/statements/set-transaction-isolation-level-transact-sql.md)
[@@TRANCOUNT (Transact-SQL)](../../t-sql/functions/trancount-transact-sql.md)
# Packet Spot Prices and Capacity
Emit Packet.com's spot market prices and capacity data, by plan and
facility. Check out [./grafana.json](./grafana.json) for an example
dashboard.

## Full stack NFT marketplace built with Polygon, Solidity, IPFS, & Next.js

This is the codebase to go along with the blog post [Building a Full Stack NFT Marketplace on Ethereum with Polygon](https://dev.to/dabit3/building-scalable-full-stack-apps-on-ethereum-with-polygon-2cfb)
### Running this project
#### Gitpod
To deploy this project to Gitpod, follow these steps:
1. Click this link to deploy
[](https://gitpod.io/#github.com/dabit3/polygon-ethereum-nextjs-marketplace)
2. Import the RPC address given to you by GitPod into your MetaMask wallet
This endpoint will look something like this:
```
https://8545-copper-swordtail-j1mvhxv3.ws-eu18.gitpod.io/
```
The chain ID should be 1337. If you have a localhost rpc set up, you may need to overwrite it.

#### Local setup
To run this project locally, follow these steps.
1. Clone the project locally, change into the directory, and install the dependencies:
```sh
git clone https://github.com/dabit3/polygon-ethereum-nextjs-marketplace.git
cd polygon-ethereum-nextjs-marketplace
# install using NPM or Yarn
npm install
# or
yarn
```
2. Start the local Hardhat node
```sh
npx hardhat node
```
3. With the network running, deploy the contracts to the local network in a separate terminal window
```sh
npx hardhat run scripts/deploy.js --network localhost
```
4. Start the app
```
npm run dev
```
### Configuration
To deploy to Polygon test or main networks, update the configurations located in __hardhat.config.js__ to use a private key and, optionally, deploy to a private RPC like Infura.
```javascript
require("@nomiclabs/hardhat-waffle");
const fs = require('fs');
const privateKey = fs.readFileSync(".secret").toString().trim() || "01234567890123456789";
// infuraId is optional if you are using Infura RPC
const infuraId = fs.readFileSync(".infuraid").toString().trim() || "";
module.exports = {
defaultNetwork: "hardhat",
networks: {
hardhat: {
chainId: 1337
},
mumbai: {
// Infura
// url: `https://polygon-mumbai.infura.io/v3/${infuraId}`
url: "https://rpc-mumbai.matic.today",
accounts: [privateKey]
},
matic: {
// Infura
// url: `https://polygon-mainnet.infura.io/v3/${infuraId}`,
url: "https://rpc-mainnet.maticvigil.com",
accounts: [privateKey]
}
},
solidity: {
version: "0.8.4",
settings: {
optimizer: {
enabled: true,
runs: 200
}
}
}
};
```
If using Infura, update __.infuraid__ with your [Infura](https://infura.io/) project ID.
---
layout: post
title: "The Whisky Diaries: Suntory Royal Blend"
date: 2016-12-16 19:02:00
categories: Whisky
tags: [whisky]
---
Everyone knows that the best whisky comes from Scotland, Ireland and the USA, but there **are** other places that produce some excellent stuff.
Surprisingly, Japan is home to whisky that is very reminiscent of Scotland, and just as good! The Japanese whisky industry was inspired by Scotland, and the results are almost indistinguishable.
I received a bottle of Suntory Royal Blend from a friend who had visited Japan, and it is one of the superior blends which I have tried. It is amazingly smooth with subtle flavours. It is very easy drinking.
It is a good whisky from the most unlikely of places.
# Examples of Kotlin Dataframe
### Idea examples
* [movies](idea-examples/movies)
* [titanic](idea-examples/titanic)
* [youtube](idea-examples/youtube)
### Notebook examples
* puzzles ([Jupyter](jupyter-notebooks/puzzles/40%20puzzles.ipynb)/[Datalore](https://datalore.jetbrains.com/view/notebook/CVp3br3CDXjUGaxxqfJjFF)) –
Inspired [by 100 pandas puzzles](https://github.com/ajcr/100-pandas-puzzles). You will go from the simplest tasks to
complex problems where you need to think. This notebook will show you how to solve these tasks with the Kotlin
Dataframe in a laconic, beautiful style.
___
* movies ([Jupyter](jupyter-notebooks/movies/movies.ipynb)/[Datalore](https://datalore.jetbrains.com/view/notebook/89IMYb1zbHZxHfwAta6eKP)) –
In this notebook you can see the basic operations of the Kotlin Dataframe on data from [movielens](https://movielens.org/).
You can take the data from the [link](https://grouplens.org/datasets/movielens/latest/).
___
* netflix ([Jupyter](jupyter-notebooks/netflix/netflix.ipynb)/[Datalore](https://datalore.jetbrains.com/view/notebook/wB6Vq1oKU3GniCi1i05l2X)) –
Explore TV shows and movies from Netflix with the powerful Kotlin Dataframe API and beautiful
visualizations from [lets-plot](https://github.com/JetBrains/lets-plot-kotlin).
___
* github ([Jupyter](jupyter-notebooks/github/github.ipynb)/[Datalore](https://datalore.jetbrains.com/view/notebook/wGlYql3ObFCloN0YpWR1Xw)) –
This notebook shows what hierarchical dataframes look like and how to work with them.
___
* titanic ([Jupyter](jupyter-notebooks/titanic/Titanic.ipynb)/[Datalore](https://datalore.jetbrains.com/view/notebook/B5YeMMONSAR78FgKQ9yJyW)) –
Let's see how the new library will show itself on the famous Titanic dataset.
___
* wine ([Jupyter](jupyter-notebooks/wine/WineNetWIthKotlinDL.ipynb)/[Datalore](https://datalore.jetbrains.com/view/notebook/aK9vYHH8pCA8H1KbKB5WsI)) –
Wine. Kotlin Dataframe. KotlinDL. What came out of this can be seen in this notebook.
___
* youtube ([Jupyter](jupyter-notebooks/youtube/Youtube.ipynb)/[Datalore](https://datalore.jetbrains.com/view/notebook/uXH0VfIM6qrrmwPJnLBi0j)) –
Explore YouTube videos with YouTube REST API and Kotlin Dataframe
| 65.735294 | 156 | 0.792394 | eng_Latn | 0.426276 |
935a2604319986e3f22d3a6c95e86247ea17c430 | 2,714 | md | Markdown | .github/CONTRIBUTING-zh-CN.md | LukBukkit/NES.css | 4a067444317cc2658691ee2a6a6c88d14d2893b4 | [
"MIT"
] | null | null | null | .github/CONTRIBUTING-zh-CN.md | LukBukkit/NES.css | 4a067444317cc2658691ee2a6a6c88d14d2893b4 | [
"MIT"
] | null | null | null | .github/CONTRIBUTING-zh-CN.md | LukBukkit/NES.css | 4a067444317cc2658691ee2a6a6c88d14d2893b4 | [
"MIT"
] | null | null | null | # 贡献
使用其他语言来阅读本文档:
[English](CONTRIBUTING.md) / [日本語](.github/CONTRIBUTING-jp.md) / [Español](.github/CONTRIBUTING-es.md) / [Português](.github/CONTRIBUTING-pt-BR.md)
你打算为本项目做贡献?太棒了!
## 需要知道的事情
本项目与参与者行为准则保持一致. 我们期望你在参与本项目的时候也赞同并支持该行为准则. 关于报告不可接受的行为,请参考我们的[行为准则][code-of-conduct].
**在忙于你的第一个PR吗?**
[如何在GitHub上面为开源项目做贡献][egghead]
## 如何
* 搭建项目?
[我们有详细的说明!](#project-setup)
* 找到了bug?
[请让我们知道!][new-issue]
* 为bug打补丁?
[提交PR!][new-pr]
* 添加一个新功能?
请确保[新开一个issue][new-issue] 来描述你的功能, 当你准备好接受反馈的时候再提交一个[新的PR][new-pr]!
## 搭建项目
你有想为我们的项目做贡献,我们真的很高兴! ❤️ 请参考如下的步骤开始吧:
1. Fork 并且 clone 我们的代码仓库
2. 安装必须的依赖:
```sh
$ npm install
```
3. 启动开发服务器:
```sh
$ npm run storybook
```
### 目录结构
```sh
.
├── index.html: Demo page
├── style.css: Demo page style
├── css: Distribution files
├── docs: Storybook stories
└── scss: Source
├── base
│ ├── reboot.scss: Don't change! (Bootstrap Reboot)
│ ├── generic.scss: Generic style and reboot.css
│ └── variables.scss: Common variables
├── elements
├── form
├── icons: 16x16 icons
├── pixel-arts: For icons other than 16x16.
└── utilities
```
> 小建议: 确保你的 `master` 分支指向原始的代码仓库并且从你fork的分支上创建PR. 请按如下命令进行操作:
>
> ```
> git remote add upstream https://github.com/nostalgic-css/NES.css.git
> git fetch upstream
> git branch --set-upstream-to=upstream/master master
> ```
>
> 这样就会把原始的代码仓库添加为一个名为"upstream"的远程连接,并且从这个远程的仓库连接获取git的信息, 然后当你运行`git pull`命令的时候会指定本地的`master`分支去使用`upstream/master`分支. 在这个时候, 你就能基于这个`master` 分支来创建所有你自己的分支. 当你想更新你的`master`的版本信息的时候, 执行一个常规的`git pull`命令即可.
## `nostalgic-css` 组织如何为项目做贡献
`nostalgic-css` 组织的成员必须遵守如下的步骤. 外部的贡献者只需要遵守以上的准则即可.
### 开发步骤
1. 使用下面的格式化规则从`develop`分支来创建自己的分支。
2. 做满足问题要求的必要的工作。如果发现你的工作跟该问题无关,请[创建一个新的问题][new-issue]并且在另外一个分支在进行你的工作。
3. 提交你的PR然后合并回`develop`分支.
* 任何影响当前开发的改变都必须在文档里面描述清楚.
* 跟某一问题相关的PRs必须把那个问题的号码标注在标题里. 比如: `[#33] Fix bug`
* 分配一个问题给你自己.
* 当这个问题准备合并的时候, 必须向`nostalgic-css/NES.css` 小组申请审核.
4. 一旦PR被批准了,接下来合并分支的更改就是被分配者的义务了。
### 提交格式化
我们使用[Commitizen][commitizen] 以及 [`commitlint`][commitlint] 来确保所有的项目提交都是易于阅读的, 并且使用 [`semantic-release`][semantic-release] 来确保我们的发布是自动化的, [不浪漫的以及不带情感色彩的][sentimental-versioning].
[code-of-conduct]: CODE_OF_CONDUCT.md
[commitizen]: https://github.com/commitizen/cz-cli
[commitlint]: [https://github.com/marionebl/commitlint]
[egghead]: https://egghead.io/series/how-to-contribute-to-an-open-source-project-on-github
[new-issue]: https://github.com/nostalgic-css/NES.css/issues/new/choose
[new-pr]: https://github.com/nostalgic-css/NES.css/compare/develop...develop
[semantic-release]: https://github.com/semantic-release/semantic-release
[sentimental-versioning]: http://sentimentalversioning.org/
| 26.349515 | 205 | 0.716286 | yue_Hant | 0.547263 |
935b801c66422397826c5d9d1c03612009e6e8f8 | 2,309 | md | Markdown | docs/resources/reproducible-builds.md | atlantisO2/gaia | a8741914a81033748b2e604dedc442ffd2ef8f7a | [
"Apache-2.0"
] | null | null | null | docs/resources/reproducible-builds.md | atlantisO2/gaia | a8741914a81033748b2e604dedc442ffd2ef8f7a | [
"Apache-2.0"
] | null | null | null | docs/resources/reproducible-builds.md | atlantisO2/gaia | a8741914a81033748b2e604dedc442ffd2ef8f7a | [
"Apache-2.0"
] | null | null | null | ---
order: 5
title: Building Gaia Deterministically
---
# Build Gaia Deterministically
The [Tendermint rbuilder Docker image](https://github.com/tendermint/images/tree/master/rbuilder) provides a deterministic build environment that is used to build Cosmos SDK applications. It provides a way to be reasonably sure that the executables are really built from the git source. It also makes sure that the same, tested dependencies are used and statically built into the executable.
## Prerequisites
Make sure you have [Docker installed on your system](https://docs.docker.com/get-docker/).
All the following instructions have been tested on *Ubuntu 18.04.2 LTS* with *docker 20.10.2*.
## Build
Clone `gaia`:
```
git clone https://github.com/cosmos/gaia.git
```
Checkout the commit, branch, or release tag you want to build:
```
cd gaia/
git checkout v4.2.1
```
The buildsystem supports and produces binaries for the following architectures:
* **darwin/amd64**
* **linux/amd64**
* **linux/arm64**
* **windows/amd64**
Run the following command to launch a build for all supported architectures:
```
make distclean build-reproducible
```
The build system generates both the binaries and deterministic build report in the `artifacts` directory.
The `artifacts/build_report` file contains the list of the build artifacts and their respective checksums, and can be used to verify
build sanity. An example of its contents follows:
```
App: gaiad
Version: v4.2.1
Commit: dbd8a6fb522c571debf958837f9113c56d418f6b
Files:
29d219b0b120b3188bd7cd7249fc96b9 gaiad-v4.2.1-darwin-amd64
80338d9f0e55ea8f6c93f2ec7d4e18d6 gaiad-v4.2.1-linux-amd64
9bc77a512acca673ca1769ae67b4d6c7 gaiad-v4.2.1-linux-arm64
c84387860f52178e2bffee08897564bb gaiad-v4.2.1-windows-amd64.exe
c25cca8ccceec06a6fabae90f671fab1 gaiad-v4.2.1.tar.gz
Checksums-Sha256:
05e5b9064bac4e71f0162c4c3c3bff55def22ca016d34205a5520fef89fd2776 gaiad-v4.2.1-darwin-amd64
ccda422cbda29c723aaf27653bcf0f6412e138eec33fba2b49de131f9ffbe2d2 gaiad-v4.2.1-linux-amd64
95f89e8213cb758d12e1b0b631285938de822d04d2e25f399e99c0b798173cfd gaiad-v4.2.1-linux-arm64
7ef98f0041f1573f0a8601abad4a14b1c163f47481c7ba1954fd81ed423a6408 gaiad-v4.2.1-windows-amd64.exe
422883ba43c96a6ea5ef9512d39321dd1356633c6a9505517b9c651788df4a7f gaiad-v4.2.1.tar.gz
```
| 36.078125 | 391 | 0.805977 | eng_Latn | 0.876505 |
935c458270e064e131e6b1a9043502fcbb84408b | 61 | md | Markdown | README.md | 5Mixer/GGJ20 | a12a14d596ab150e8d96dda5a21defcd176f251f | [
"MIT"
] | 4 | 2020-01-19T20:17:25.000Z | 2020-12-11T03:16:51.000Z | README.md | 5Mixer/GGJ20 | a12a14d596ab150e8d96dda5a21defcd176f251f | [
"MIT"
] | null | null | null | README.md | 5Mixer/GGJ20 | a12a14d596ab150e8d96dda5a21defcd176f251f | [
"MIT"
] | 1 | 2020-01-21T01:27:35.000Z | 2020-01-21T01:27:35.000Z | # bonsai
A Kha game engine made in preparation for GGJ 2020.
| 20.333333 | 51 | 0.770492 | eng_Latn | 0.998318 |
935d010e1a84b561d570fc64f9b3bf0807826069 | 609 | md | Markdown | README.md | pawan-59/git-test | 9db74eeb892c683b00acb0c8c898cb3c4f9b0ff2 | [
"Apache-2.0"
] | null | null | null | README.md | pawan-59/git-test | 9db74eeb892c683b00acb0c8c898cb3c4f9b0ff2 | [
"Apache-2.0"
] | null | null | null | README.md | pawan-59/git-test | 9db74eeb892c683b00acb0c8c898cb3c4f9b0ff2 | [
"Apache-2.0"
] | null | null | null | # Git Repo Sync
```yaml
name: <action-name>
on:
- push
- delete
jobs:
sync:
runs-on: ubuntu-latest
name: Git Repo Sync
steps:
- uses: actions/checkout@v2
with:
fetch-depth: 0
- uses: pawan-59/[email protected]
with:
# Such as https://github.com/pawan-59/git-repo-sync.git
target-url: <target-url>
# Such as wangchucheng
target-username: <target-username>
# You can store token in your project's 'Setting > Secrets' and reference the name here. Such as ${{ secrets.ACCESS_TOKEN }}
target-token: <target-token>
```
| 22.555556 | 132 | 0.604269 | eng_Latn | 0.712178 |
935d7fb1920576211a46af5039bf474332b4f5f7 | 6,657 | md | Markdown | articles/key-vault/secrets/quick-create-portal.md | KreizIT/azure-docs.fr-fr | dfe0cb93ebc98e9ca8eb2f3030127b4970911a06 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2021-03-12T23:37:08.000Z | 2021-03-12T23:37:08.000Z | articles/key-vault/secrets/quick-create-portal.md | KreizIT/azure-docs.fr-fr | dfe0cb93ebc98e9ca8eb2f3030127b4970911a06 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/key-vault/secrets/quick-create-portal.md | KreizIT/azure-docs.fr-fr | dfe0cb93ebc98e9ca8eb2f3030127b4970911a06 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Démarrage rapide Azure - Définir et récupérer un secret depuis Key Vault à l’aide du portail Azure | Microsoft Docs
description: Ce guide de démarrage rapide vous montre comment définir et récupérer un secret depuis Azure Key Vault à l’aide du portail Azure.
services: key-vault
author: msmbaldwin
tags: azure-resource-manager
ms.service: key-vault
ms.subservice: secrets
ms.topic: quickstart
ms.custom: mvc
ms.date: 09/03/2019
ms.author: mbaldwin
ms.openlocfilehash: f18174b1d3c38e571d2efaedfc41debec29f40ac
ms.sourcegitcommit: 692382974e1ac868a2672b67af2d33e593c91d60
ms.translationtype: HT
ms.contentlocale: fr-FR
ms.lasthandoff: 10/22/2021
ms.locfileid: "130263469"
---
# <a name="quickstart-set-and-retrieve-a-secret-from-azure-key-vault-using-the-azure-portal"></a>Démarrage rapide : Définir et récupérer un secret depuis Azure Key Vault à l’aide du portail Azure
Azure Key Vault est un service cloud qui fonctionne comme un magasin sécurisé contenant des secrets. Vous pouvez stocker des clés, des mots de passe, des certificats et d’autres secrets en toute sécurité. Vous pouvez créer et gérer des coffres de clés Azure grâce au portail Azure. Dans ce démarrage rapide, vous créez un coffre de clés, puis l’utilisez pour stocker un secret.
Pour plus d’informations, consultez
- [Vue d’ensemble du coffre de clés](../general/overview.md)
- [Vue d’ensemble des secrets](about-secrets.md).
## <a name="prerequisites"></a>Prérequis
Pour accéder à Azure Key Vault, vous avez besoin d’un abonnement Azure. Si vous n’avez pas d’abonnement, vous pouvez créer un [compte gratuit](https://azure.microsoft.com/free/?WT.mc_id=A261C142F) avant de commencer.
Tout accès aux secrets s’effectue par le biais d’Azure Key Vault. Pour ce guide de démarrage rapide, créez un coffre de clés à l’aide du [portail Azure](../general/quick-create-portal.md), de l’interface [Azure CLI](../general/quick-create-cli.md) ou d’[Azure PowerShell](../general/quick-create-powershell.md).
## <a name="sign-in-to-azure"></a>Connexion à Azure
Connectez-vous au portail Azure sur https://portal.azure.com.
## <a name="add-a-secret-to-key-vault"></a>Ajouter un secret au coffre de clés
Pour ajouter un secret au coffre, procédez comme suit :
1. Accédez à votre nouveau coffre de clés dans le portail Azure.
1. Dans les pages des paramètres de coffre de clés, sélectionnez **Secrets**.
1. Cliquez sur **Generate/Import (Générer/Importer)** .
1. Dans l’écran **Create a secret (Créer un secret)** , choisissez les valeurs suivantes :
- **Options de chargement** : Manuel.
- **Nom** : Entrez un nom pour le secret. Le nom du secret doit être unique dans le coffre de clés. Le nom doit être une chaîne comprise entre 1 et 127 caractères, commençant par une lettre et qui doit contenir uniquement des chiffres, des lettres et des tirets (0-9, a-z, A-Z et -). Pour plus d’informations sur le nommage, consultez [Objets, identificateurs et gestion de versions Key Vault](../general/about-keys-secrets-certificates.md#objects-identifiers-and-versioning).
- **Valeur** : Entrez une valeur pour le secret. Les API Key Vault acceptent et retournent des valeurs de secret sous forme de chaînes.
- Conservez les valeurs par défaut des autres options. Cliquez sur **Créer**.
Lorsque vous recevez le message confirmant la création du secret, cliquez dessus dans la liste.
Pour plus d’informations sur les attributs des secrets, consultez [À propos des secrets Azure Key Vault](./about-secrets.md).
## <a name="retrieve-a-secret-from-key-vault"></a>Récupérer un secret à partir de Key Vault
Si vous cliquez sur la version actuelle, vous voyez la valeur que vous avez spécifiée à l’étape précédente.
:::image type="content" source="../media/quick-create-portal/current-version-hidden.png" alt-text="Propriétés de secret":::
Vous pouvez afficher la valeur masquée en cliquant sur le bouton « Afficher la valeur secrète » dans le volet de droit.
:::image type="content" source="../media/quick-create-portal/current-version-shown.png" alt-text="Valeur secrète affichée":::
Vous pouvez également utiliser l’interface [Azure CLI]() ou [Azure PowerShell]() pour récupérer le secret créé précédemment.
## <a name="clean-up-resources"></a>Nettoyer les ressources
D’autres démarrages rapides et didacticiels sur les coffres de clés reposent sur ce démarrage rapide. Si vous prévoyez d’utiliser d’autres démarrages rapides et didacticiels, il peut être utile de conserver ces ressources.
Si vous n’en avez plus besoin, supprimez le groupe de ressources. Ce faisant, vous supprimez le coffre de clés et les ressources associées. Pour supprimer le groupe de ressources à l’aide du portail :
1. Entrez le nom de votre groupe de ressources dans la zone Recherche en haut du portail. Lorsque vous voyez le groupe de ressources utilisé dans ce démarrage rapide dans les résultats de recherche, sélectionnez-le.
2. Sélectionnez **Supprimer le groupe de ressources**.
3. Dans le champ **TYPE THE RESOURCE GROUP NAME: (TAPER LE NOM DU GROUPE DE RESSOURCES :)** , tapez le nom du groupe de ressources et sélectionnez **Supprimer**.
> [!NOTE]
> Il est important de noter qu’après la suppression d’un secret, d’une clé, d’un certificat ou d’un coffre de clés, ces derniers restent récupérables pendant une période configurable de 7 à 90 jours civils. Si aucune configuration n’est spécifiée, la période de récupération par défaut est définie sur 90 jours. Les utilisateurs disposent ainsi de suffisamment de temps pour remarquer la suppression accidentelle d’un secret et y remédier. Pour plus d’informations sur la suppression et la récupération des coffres de clés et des objets de coffre de clés, consultez [Vue d’ensemble de la suppression réversible d’Azure Key Vault](../general/soft-delete-overview.md).
## <a name="next-steps"></a>Étapes suivantes
Dans ce guide de démarrage rapide, vous avez créé un coffre de clés et vous y avez stocké un secret. Pour en savoir plus sur Key Vault et sur la manière de l’intégrer à vos applications, consultez les articles ci-dessous.
- Lire la [vue d’ensemble Azure Key Vault](../general/overview.md)
- Lire [Sécuriser l’accès à un coffre de clés](../general/security-features.md)
- Consulter [Utiliser Key Vault avec une application web App Service](../general/tutorial-net-create-vault-azure-web-app.md)
- Consulter [Utiliser Key Vault avec une application déployée sur une machine virtuelle](../general/tutorial-net-virtual-machine.md)
- Consulter le [Guide du développeur Azure Key Vault](../general/developers-guide.md)
- Passer en revue la [Vue d’ensemble de la sécurité de Key Vault](../general/security-features.md) | 75.647727 | 666 | 0.773772 | fra_Latn | 0.97565 |
935deb6c99e1abf34b0d6df3070caee94f5f9073 | 3,336 | md | Markdown | README.md | Jaxelr/Nancy.Template.AspNetCore | 3b5807ed2c81071bc1df9c356a13e454c5722154 | [
"MIT"
] | 2 | 2019-07-02T09:19:54.000Z | 2019-12-25T07:34:01.000Z | README.md | Jaxelr/Nancy.Template.AspNetCore | 3b5807ed2c81071bc1df9c356a13e454c5722154 | [
"MIT"
] | 38 | 2019-07-16T10:18:19.000Z | 2020-05-18T15:04:03.000Z | README.md | Jaxelr/Nancy.Template.AspNetCore | 3b5807ed2c81071bc1df9c356a13e454c5722154 | [
"MIT"
] | null | null | null | # Nancy.Template.Webservice [![Mit License][mit-img]][mit]
Dotnet template library used to create web services using the Nancy web Framework.
__Note:__ Since Nancyfx is no longer being maintained i will be deprioritizing this library, please check [here for details](https://github.com/NancyFx/Nancy/issues/3010)
## Builds
| Appveyor |
| :---: |
| [![Build status][build-img]][build] |
## Packages
| NuGet (Stable) | MyGet (Prerelease) |
| :---: | :---: |
| [![NuGet][nuget-img]][nuget] | [![MyGet][myget-img]][myget] |
### Purpose
A quick scaffolder for web services build using Nancyfx as the web framework. It provides the basic tools for validations, endpoint creation and database usage. As is the case with Nancy, it is relatively customizable.
It can be altered to use:
1. multiple types of logs (as based on the serilog sinks).
1. different types of caches (as based on the rapid cache lib).
1. other types of ORMs (i. e. Dapper) by replacing the Repository class with byo.
1. alternatively move to swagger (legacy version) from openapi replacing the nancy.metadata.openapi with nancy.metadata.swagger library from nuget.
### Install
For installation via the dotnet install command:
`dotnet new -i "Nancy.Template.Webservice::*"`
For myget installations you can specify the source on the dotnet command:
`dotnet new -i "Nancy.Template.Webservice::*" --nuget-source https://www.myget.org/F/nancy-template-webservice/api/v3/index.json`
Then you can freely use it by executing the following dotnet command:
`dotnet new nancyws -o MySampleWs`
### Uninstall
To uninstall simply execute:
`dotnet new -u "Nancy.Template.Webservice"`
### Dependencies
This template targets dotnet core 3.1. The following libraries are included as part of the projects:
* [Nancy](https://github.com/NancyFx/Nancy)
* [Nancy.Metadata.OpenApi](https://github.com/Jaxelr/Nancy.Metadata.OpenApi)
* [Nancy.RapidCache](https://github.com/Jaxelr/Nancy.RapidCache)
* [Nancy.Serilog](https://github.com/Zaid-Ajaj/Nancy.Serilog)
* [Serilog](https://github.com/serilog/serilog)
* [Microsoft.Extensions.Diagnostics.HealthChecks](https://github.com/dotnet/aspnetcore/tree/master/src/HealthChecks/HealthChecks)
* [Insight.Database](https://github.com/jonwagner/Insight.Database)
* [Xunit](https://github.com/xunit/xunit)
* [NSubstitute](https://github.com/nsubstitute/NSubstitute)
* [Altcover](https://github.com/SteveGilham/altcover)
* [ReportGenerator](https://github.com/danielpalme/ReportGenerator)
For further information on custom templates, refer to the [Microsoft documentation][docs].
[mit-img]: http://img.shields.io/badge/License-MIT-blue.svg
[mit]: https://github.com/Jaxelr/Nancy.Template.Webservice/blob/master/LICENSE
[build-img]: https://ci.appveyor.com/api/projects/status/4q831j12p00mkeij/branch/master?svg=true
[build]: https://ci.appveyor.com/project/Jaxelr/nancy-template-webservice/branch/master
[nuget-img]: https://img.shields.io/nuget/v/Nancy.Template.Webservice.svg
[nuget]: https://www.nuget.org/packages/Nancy.Template.Webservice/
[myget-img]: https://img.shields.io/myget/nancy-template-webservice/v/Nancy.Template.Webservice.svg
[myget]: https://www.myget.org/feed/nancy-template-webservice/package/nuget/Nancy.Template.Webservice
[docs]: https://docs.microsoft.com/en-us/dotnet/core/tools/custom-templates
*(Source: de/book/aliase.md from hustcer/nushell.github.io, MIT license)*
# Aliases
Aliases in Nushell offer a simple way to substitute text. This makes it possible to define a short name for a longer command, including its arguments.
For example, an alias named `ll` can be defined that replaces the longer command `ls -l`:
```
> alias ll = ls -l
```
Now the alias can be called:
```
> ll
```
When this is done, it acts as if `ls -l` had been called. This also offers the possibility of passing additional parameters, so you can also write:
```
> ll -a
```
This is equivalent to `ls -l -a`. Much shorter.
## Persistence
For information on storing aliases permanently, so that they are always available in Nushell, take a look at the [configuration chapter](konfiguration.md).
*(Source: dev/sharing/applying.md from Cheshire92/Intersect-Documentation, MIT license)*
# Applying Patches
GitHub patches are very easy to apply, and they allow you to check compatibility before doing so.
Please note that applying git patches modifies your engine's source code. Patches can impact performance and introduce bugs into your game. Only install patches from trusted developers!
Download your patch and move it to the root of your Intersect repository. Then open a command prompt window and navigate to your local Intersect repo. This can be done easily by clicking Repository -> Open In Command Prompt within GitHub Desktop.

In order to check for compatibility, enter the following command, replacing the patch filename with the one you downloaded. If there are compatibility issues, the patch author may need to remake the patch using your version of Intersect, or you may be applying multiple patches that conflict and are not compatible with each other.
```
git apply --check patchName.patch
```

If no errors appear, the patch is compatible; go ahead and run the following command to apply it. You might see some warnings, but as long as there are no errors you are good to go!
```
git am --signoff < patchName.patch
```

If you need to remove a patch, use the `git apply -R` command as shown below:
```
git apply -R patchName.patch
```
 | 49.352941 | 327 | 0.801549 | eng_Latn | 0.992529 |
935fc3f6bdc47f45835980f3c8a2937faf08d473 | 1,190 | md | Markdown | Packs/PAN-OS/Playbooks/playbook-PAN-OS_-_Delete_Static_Routes_README.md | diCagri/content | c532c50b213e6dddb8ae6a378d6d09198e08fc9f | [
"MIT"
] | 799 | 2016-08-02T06:43:14.000Z | 2022-03-31T11:10:11.000Z | Packs/PAN-OS/Playbooks/playbook-PAN-OS_-_Delete_Static_Routes_README.md | diCagri/content | c532c50b213e6dddb8ae6a378d6d09198e08fc9f | [
"MIT"
] | 9,317 | 2016-08-07T19:00:51.000Z | 2022-03-31T21:56:04.000Z | Packs/PAN-OS/Playbooks/playbook-PAN-OS_-_Delete_Static_Routes_README.md | diCagri/content | c532c50b213e6dddb8ae6a378d6d09198e08fc9f | [
"MIT"
] | 1,297 | 2016-08-04T13:59:00.000Z | 2022-03-31T23:43:06.000Z | Deletes a PAN-OS static route from the PAN-OS instance.
## Dependencies
This playbook uses the following sub-playbooks, integrations, and scripts.
### Sub-playbooks
* PanoramaCommitConfiguration
### Integrations
This playbook does not use any integrations.
### Scripts
This playbook does not use any scripts.
### Commands
* panorama-list-static-routes
* panorama-delete-static-route
## Playbook Inputs
---
| **Name** | **Description** | **Default Value** | **Required** |
| --- | --- | --- | --- |
| virtual_router_name | The name of the virtual router to configure the static routes for. | - | Required |
| route_name | The name that identifies the static route (up to 31 characters). The name is case-sensitive and must be unique. Can include letters, numbers, spaces, hyphens, and underscores. | - | Required |
| AutoCommit | Whether to auto-commit the configuration to PAN-OS. Can be "Yes" or "No". | No | Optional |
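For illustration, a manual run of the commands this playbook uses might look like the following from the XSOAR command line. The values `vr1` and `my_route` are hypothetical, and the exact argument names should be confirmed against the command reference of your integration version:

```
!panorama-list-static-routes virtual_router=vr1
!panorama-delete-static-route route_name=my_route virtual_router=vr1
```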
## Playbook Outputs
---
There are no outputs for this playbook.
## Playbook Image
---

*(Source: mfa-auth/NOTES.md from piddubnyi/mirth-plugins, Apache-2.0 license)*
# Mirth Connect authentication workflow notes
This was a little difficult to reverse-engineer because Mirth's classes and
interfaces aren't documented and there aren't really any guides to developing
authentication plug-ins.
## First Leg
When the client attempts a login, the server routes the attempt to
`com.mirth.connect.server.api.servlets.UserServlet.login(String, String)`
which calls `com.mirth.connect.server.controllers.UserController.authorizeUser(String, String)`
to check the username/password against the local user database or some other
authentication plug-in. Usually, this is handled by
`com.mirth.connect.server.controllers.DefaultUserController`.
Once `com.mirth.connect.server.controllers.DefaultUserController.authorizeUser(String,String)`
checks the username+password, it calls
`com.mirth.connect.server.controllers.DefaultUserController.handleSecondaryAuthentication(String, LoginStatus, LoginRequirementsChecker)`.
If the primary authentication was successful and there is an MFA plug-in of
any kind, that plug-in's
`com.mirth.connect.plugins.MultiFactorAuthenticationPlugin.authenticate(String, LoginStatus)`
method is called with the "primary" `LoginStatus` object, whose status will likely be SUCCESS.
Now, it's time for the MFA plug-in to decide what to do. It has two choices:
1. Return the `LoginStatus` unchanged; this allows the user to proceed and the workflow is done.
2. Return an `ExtendedLoginStatus` object which specifies the client-side
plug-in class to invoke for the next leg of the process. The status
should be FAIL, and can change the user's username for the next stage if
necessary. A "message" can also be sent along with the response to the
client.
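The two choices above can be sketched as follows. This is a simplified, hypothetical Python stand-in for Mirth's Java `LoginStatus`/`ExtendedLoginStatus` types, meant only to illustrate the control flow; a real plug-in is written in Java against Mirth's API, and the plug-in class name below is invented:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class LoginStatus:
    """Simplified stand-in for Mirth's LoginStatus."""
    status: str            # "SUCCESS" or "FAIL"
    message: str = ""


@dataclass
class ExtendedLoginStatus(LoginStatus):
    """Stand-in for ExtendedLoginStatus: names the client-side plug-in to invoke."""
    client_plugin_class: str = ""
    updated_username: Optional[str] = None


def authenticate(username: str, primary: LoginStatus,
                 mfa_required: bool) -> LoginStatus:
    # Choice 1: no second factor needed -> return the primary status unchanged.
    if not mfa_required:
        return primary
    # Choice 2: force the second leg. Status must be FAIL so the login does not
    # complete yet; the named client-side plug-in is invoked with this object.
    return ExtendedLoginStatus(
        status="FAIL",
        message="MFA token required",                    # sent to the client
        client_plugin_class="example.MFAClientPlugin",   # hypothetical name
        updated_username=username,
    )


result = authenticate("alice", LoginStatus("SUCCESS"), mfa_required=True)
print(result.status)  # FAIL -> the client must now complete the second leg
```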
## Second Leg
The client receives this error:
```
com.mirth.connect.client.core.UnauthorizedException: HTTP/1.1 401 Unauthorized
    at com.mirth.connect.client.core.ServerConnection.handleResponse(ServerConnection.java:477)
    at com.mirth.connect.client.core.ServerConnection.executeSync(ServerConnection.java:256)
    at com.mirth.connect.client.core.ServerConnection.apply(ServerConnection.java:166)
```
... and the client's plug-in (specified above in the `ExtendedLoginStatus`
returned to the client) has its
`net.christopherschultz.mirth.plugins.auth.mfa.MFAAuthenticationClientPlugin.authenticate(Window, Client, String, LoginStatus)`
method invoked. This includes the user's username and the `ExtendedLoginStatus`
that came from the server. The status will be FAIL and the "message" will be
whatever the server sent as a part of that `ExtendedLoginStatus`.
Now is the chance for the client-side plug-in to take whatever action makes
sense. For MFA, we ask the user for a token. The thread calling `authenticate`
isn't on the AWT event thread, so it's okay to e.g. show a pop-up window,
make web-service calls, etc.
Once the client-side plug-in is ready to proceed with the next step, it should
re-authenticate with the server by calling
`com.mirth.connect.client.core.api.servlets.UserServletInterface.login(String, String)`
with the user's username and a `null` password. But you should include an HTTP
header in the call with the name `X-Mirth-Login-Data` (defined in
`com.mirth.connect.client.core.api.servlets.UserServletInterface.LOGIN_DATA_HEADER`)
which contains "additional" authentication information bound for your
server-side plug-in.
You can pack whatever data you need in this plug-in. You may find that you
need quite a bit more than you originally thought, since the second login
call, here, (a) doesn't have access to the user's password from the previous
leg and (b) must provide 100% of the information necessary to authenticate
the user on the server.
**Security Note**: Remember that any client can call
`com.mirth.connect.client.core.api.servlets.UserServletInterface.login(String, String)`
with this special HTTP header. If the server-side relies solely upon the
contents of that header for authentication, you'd better make sure it can't
be forged, replayed, etc. by a malicious client.
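One way to make such a header tamper-evident is for the server to sign the first-leg result and verify the signature on the third leg. The sketch below is illustrative only: it is not Mirth's API, the key and field names are assumptions, and a real implementation also needs replay protection beyond the freshness window shown here:

```python
import base64
import hashlib
import hmac
import json
import time
from typing import Optional

SECRET = b"server-side-secret-key"  # hypothetical key; never sent to the client


def make_login_data(username: str) -> str:
    """Build a signed X-Mirth-Login-Data-style value after a successful first leg."""
    payload = json.dumps({"user": username, "ts": int(time.time())}).encode()
    sig = hmac.new(SECRET, payload, hashlib.sha256).digest()
    return base64.b64encode(payload).decode() + "." + base64.b64encode(sig).decode()


def verify_login_data(token: str, max_age_s: int = 300) -> Optional[str]:
    """Return the username if the token is authentic and fresh, otherwise None."""
    try:
        payload_b64, sig_b64 = token.split(".")
        payload = base64.b64decode(payload_b64, validate=True)
        sig = base64.b64decode(sig_b64, validate=True)
    except ValueError:
        return None
    expected = hmac.new(SECRET, payload, hashlib.sha256).digest()
    if not hmac.compare_digest(sig, expected):  # constant-time comparison
        return None
    data = json.loads(payload)
    if time.time() - data["ts"] > max_age_s:    # stale token
        return None
    return data["user"]


token = make_login_data("alice")
print(verify_login_data(token))            # alice
print(verify_login_data("x" + token[1:]))  # None (signature no longer matches)
```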
## Third Leg
Similar to the first leg, the `UserServlet` and `DefaultUserController` are
involved. But this time, because of the presence of the `X-Mirth-Login-Data`
HTTP header, the `DefaultUserController` sends the username and that header
value directly to the MFA plug-in's
`com.mirth.connect.plugins.MultiFactorAuthenticationPlugin.authenticate(String)`
method. There is no re-check of the username+password, so your plug-in
needs to make sure that the information about the username+password check
being successful is somehow carried-through the header. (See my MFA plug-in
implementation for how I solved that particular problem.)
Here, the plug-in does whatever it wants and returns a `LoginStatus`. Return
FAIL to cause the login to fail, or SUCCESS to cause it to succeed.
## Conclusion
Even going back through the code to firm up these notes caused me some
confusion due to the two different behaviors when calling `login()`, with
or without the `X-Mirth-Login-Data` header. A close read of the code is
necessary to follow it along, and it really helped to have folks from the
Mirth community give me some pointers and help me navigate through all
this. I've put their names in the README.md file at the top-level of this
repository.
*(Source: glosalist/854_raw.md from fiasinstitute/glosa, MIT license)*
---
authorName: ernobe
canDelete: false
contentTrasformed: false
from: '"ernobe" <ernobe@...>'
headers.inReplyToHeader: PGVpOHA0OSthbzQwQGVHcm91cHMuY29tPg==
headers.messageIdInHeader: PGVpYnNoZitjOGZjQGVHcm91cHMuY29tPg==
headers.referencesHeader: .nan
layout: email
msgId: 854
msgSnippet: '... A newer version has been uploaded, that fixes this and makes several
improvements. Several words previously not recognized now are. For example,'
nextInTime: 855
nextInTopic: 858
numMessagesInTopic: 12
postDate: '1162442095'
prevInTime: 853
prevInTopic: 853
profile: ernobe
replyTo: LIST
senderId: i-idQ0JY874MF1pBc3u-OP-5t6sogG7FUxbSqrgb42RmRtY1RIoPA54YEbMyV7SwzUJqFoq5hY5pNd1xqLShCXG2
spamInfo.isSpam: false
spamInfo.reason: '6'
systemMessage: false
title: 'Re: English-Glosa web page translator'
topicId: 852
userId: 80863808
---
--- In [email protected], "ernobe" <ernobe@...> wrote:
>
> I've discovered a small error in the program, and uploaded the
> corrected version. In assigning the plural to words, it would leave
> out a hyphen, which would indicate that the 'plu' part was part of the
> conjectured word. For example, 'endings' would return '?plu akro'
> instead of '?plu-akro'. If you have the version that does this,
> perhaps you might want to download the corrected version.
>
A newer version has been uploaded, that fixes this and makes several
improvements. Several words previously not recognized now are. For
example, implementors or implementers now returns '?plu-instrumenta-pe',
whereas the previous version would leave out the 'pe'; physician or
mathematician now produce correct results, whereas previously they'd go
unrecognized. Generally, more correspondences are attempted, which
perhaps gives place to more errors, but errors are better than failures.
Previously, I had thought of not translating 'the' and 'a' since these
words are often left out in Glosa. This version translates them and 'an'
as 'u'. Some definitions in the dictionary consist of several Glosa
words, as for example 'gene ma boni' for 'reform' and 'pa grafo' for
'wrote'. Only the first word of these will appear in the results.
Finally, the program now cleans up after itself and removes the other
file that is produced during processing.
*(Source: README.md from ppknUWr/backend-bbz, Apache-2.0 license)*
# backend-bbz
# Code convention
### General:
Use full, understandable names.
Leave two blank lines between class and function definitions.
Leave one blank line between method definitions.
Separate function and method parameters, as well as list elements, with a comma and a space.
```python
def sum_two_numbers(number1, number2):
result = number1 + number2
return result
class ConnectionManager:
def __init__(self, connection_string):
self.connection_string = connection_string
def connect_to_database(self):
        # do something
```
When a function or method is long and performs several steps, separate the steps with blank lines.
```python
def calculate_variance(number_list):
sum_list = 0
for number in number_list:
sum_list = sum_list + number
mean = sum_list / len(number_list)
sum_squares = 0
for number in number_list:
sum_squares = sum_squares + number**2
mean_squares = sum_squares / len(number_list)
return mean_squares - mean**2
```
Separate operations with spaces:
```python
# Recommended
y = x**2 + 5
z = (x+y) * (x-y)
# Not Recommended
y = x ** 2 + 5
z = (x + y) * (x - y)
```
Wrap code lines that are too long (over 79 characters):
```python
def function(arg_one, arg_two,
arg_three, arg_four):
return arg_one
total = (first_variable
+ second_variable
- third_variable)
list_of_numbers = [
1, 2, 3
4, 5, 6
7, 8, 9
]
```
Use comments for every function and method:
```python
"""
calculator_add
Function that adds two numbers
@param number1 INT/FLOAT - Number one to add
@param number2 INT/FLOAT - Number two to add
@return INT/FLOAT - Result of adding two numbers
"""
def calculator_add(number1, number2):
return number1 + number2
```
Use single-line comments for problematic lines of code:
```python
x = b**2 - 4*a*c # delta for quadratic equation
```
### 1. Functions and variables:
Write function and variable names in lowercase.
Write constants in uppercase.
Words should be separated with underscores:
```python
first_name = "Janusz"
last_name = "Kowalski"
id = "532137420"
PI = 3.14
CONSTANT_NUMBER = 5
def my_function():
```
### 2. Classes and methods:
Write class names with a capital letter for each word,
and do not use underscores.
Write method names in lowercase, with words separated by underscores.
```python
class Class:
def my_function(self, variable, second_variable):
def multiply_by_two(self, number):
class CarClass:
def __init__(self, name, production_year, id_number="0000000"):
self.name = name
self.production_year = production_year
self.id_number = id_number
def __str__(self):
return f"Name: {self.name}, Year: {self.production_year}, ID: {self.id_number}"
```
### 3. Files:
Write file names in lowercase, with words separated by underscores:
```python
module.py
my_module.py
student_list.txt
```
### Sources:
https://realpython.com/python-pep8/#code-layout
https://www.python.org/dev/peps/pep-0008/
*(Source: AlchemyInsights/microsoft-graph-restapi-interface.md from isabella232/OfficeDocs-AlchemyInsights-pr.pt-BR, CC-BY-4.0 and MIT licenses)*
---
title: Microsoft Graph REST API interface
ms.author: pebaum
author: pebaum
manager: mnirke
ms.date: 04/21/2020
ms.audience: Admin
ms.topic: article
ms.service: o365-administration
ROBOTS: NOINDEX, NOFOLLOW
localization_priority: Normal
ms.collection: Adm_O365
ms.custom:
- "7071"
- "9004013"
ms.openlocfilehash: d8ef9f22e495feba26ecc1d3e21b996b199cbe16c6d3fdbf8e2e50893fe15942
ms.sourcegitcommit: b5f7da89a650d2915dc652449623c78be6247175
ms.translationtype: MT
ms.contentlocale: pt-BR
ms.lasthandoff: 08/05/2021
ms.locfileid: "54027182"
---
# <a name="microsoft-graph-rest-api-interface"></a>Microsoft Graph REST API interface
The Graph API sets on the v1.0 endpoint are in general availability (GA) status and have gone through a rigorous review and feedback process with customers to meet practical production needs.
- For more information about Graph API v1.0, see the [Microsoft Graph REST API v1.0 reference](https://docs.microsoft.com/graph/api/overview?toc=.%2Fref%2Ftoc.json&view=graph-rest-1.0).
- For more information about the Graph API beta version, see the [Microsoft Graph beta endpoint reference](https://docs.microsoft.com/graph/api/overview?toc=.%2Fref%2Ftoc.json&view=graph-rest-beta).
For more information about Microsoft Graph, see the [Microsoft Graph documentation](https://docs.microsoft.com/graph/).
*(Source: content/reference/datatypes/SoftLayer_Notification_Subscriber_Customer.md from BrianSantivanez/githubio_source, Apache-2.0 license)*
---
title: "SoftLayer_Notification_Subscriber_Customer"
description: ""
date: "2018-02-12"
tags:
- "datatype"
- "sldn"
- "Notification"
classes:
- "SoftLayer_Notification_Subscriber_Customer"
type: "reference"
layout: "datatype"
mainService: "SoftLayer_Notification_Subscriber_Customer"
---
*(Source: docs/odbc/reference/develop-app/effect-of-transactions-on-cursors-and-prepared-statements.md from kirabr/sql-docs.ru-ru, CC-BY-4.0 and MIT licenses)*
---
title: Effect of transactions on cursors and prepared statements | Microsoft Docs
ms.custom: ''
ms.date: 01/19/2017
ms.prod: sql
ms.prod_service: connectivity
ms.reviewer: ''
ms.suite: sql
ms.technology: connectivity
ms.tgt_pltfrm: ''
ms.topic: conceptual
helpviewer_keywords:
- rolling back transactions [ODBC]
- committing transactions [ODBC]
- transactions [ODBC], rolling back
- cursors [ODBC], transaction commits or roll backs
- prepared statements [ODBC]
- transactions [ODBC], cursors
ms.assetid: 523e22a2-7b53-4c25-97c1-ef0284aec76e
caps.latest.revision: 6
author: MightyPen
ms.author: genemi
manager: craigg
ms.openlocfilehash: ceb7aaae6967376c454d32633cda6043bc281825
ms.sourcegitcommit: 1740f3090b168c0e809611a7aa6fd514075616bf
ms.translationtype: MT
ms.contentlocale: ru-RU
ms.lasthandoff: 05/03/2018
ms.locfileid: "32911799"
---
# <a name="effect-of-transactions-on-cursors-and-prepared-statements"></a>Effect of transactions on cursors and prepared statements
Committing or rolling back a transaction has one of the following effects on cursors and access plans:
- All cursors are closed, and the access plans for prepared statements on the connection are deleted.
- All cursors are closed, and the access plans for prepared statements on the connection remain intact.
- All cursors remain open, and the access plans for prepared statements on the connection remain intact.
For example, suppose a data source exhibits the first behavior in this list, the most restrictive of the three. Now suppose an application does the following:
1. Sets the commit mode to manual-commit.
2. Creates a result set of sales orders on statement 1.
3. Creates a result set of the lines in a sales order on statement 2 when the user highlights that order.
4. Calls **SQLExecute** to execute a positioned update statement, prepared on statement 3, when the user updates a line.
5. Calls **SQLEndTran** to commit the positioned update statement.
Because of the data source's behavior, the call to **SQLEndTran** in step 5 closes the cursors on statements 1 and 2 and deletes the access plans for all statements. The application must re-execute statements 1 and 2 to re-create the result sets, and reprepare the statement on statement 3.
In auto-commit mode, functions other than **SQLEndTran** can commit a transaction:
- **SQLExecute** or **SQLExecDirect**. In the preceding example, the call to **SQLExecute** in step 4 commits the transaction. This causes the data source to close the cursors on statements 1 and 2 and delete the access plans for all statements on the connection.
- **SQLBulkOperations** or **SQLSetPos**. In the preceding example, suppose that in step 4 the application calls **SQLSetPos** with the SQL_UPDATE option on statement 2 instead of executing the positioned update statement on statement 3. This commits the transaction, which causes the data source to close the cursors on statements 1 and 2 and discard all access plans for the connection.
- **SQLCloseCursor**. In the preceding example, suppose that when the user highlights a different sales order, the application calls **SQLCloseCursor** on statement 2 before creating the result set of lines for the new sales order. The call to **SQLCloseCursor** commits the **SELECT** statement that created the result set of lines, and causes the data source to close the cursor on statement 1 and discard all access plans for the connection.
Applications, especially screen-based applications in which the user scrolls through a result set and updates or deletes rows, must be written to handle this behavior.
To determine how a data source behaves when a transaction is committed or rolled back, an application calls **SQLGetInfo** with the SQL_CURSOR_COMMIT_BEHAVIOR and SQL_CURSOR_ROLLBACK_BEHAVIOR options.
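As a sketch of that final check, the codes returned by **SQLGetInfo** can be interpreted with a small helper. The SQL_CB_* values below follow the ODBC headers; the pyodbc call shown in the comment is one way to obtain the code, but confirm it against your driver:

```python
# ODBC cursor-behavior codes returned by SQLGetInfo for
# SQL_CURSOR_COMMIT_BEHAVIOR / SQL_CURSOR_ROLLBACK_BEHAVIOR.
SQL_CB_DELETE = 0    # close cursors, delete prepared statements
SQL_CB_CLOSE = 1     # close cursors, keep prepared statements
SQL_CB_PRESERVE = 2  # keep cursors open, keep prepared statements


def describe_cursor_behavior(code: int) -> str:
    """Translate an SQL_CB_* code into a human-readable description."""
    return {
        SQL_CB_DELETE: "Close cursors; delete access plans for prepared statements.",
        SQL_CB_CLOSE: "Close cursors; keep access plans for prepared statements.",
        SQL_CB_PRESERVE: "Keep cursors open and keep access plans.",
    }.get(code, "Unknown behavior code")


# With a live connection you might do something like (not run here):
#   import pyodbc
#   conn = pyodbc.connect(dsn)
#   code = conn.getinfo(pyodbc.SQL_CURSOR_COMMIT_BEHAVIOR)
print(describe_cursor_behavior(SQL_CB_DELETE))
```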
*(Source: README.md from 7onn/gh-webhook, MIT license)*
# starting with clojure
The project uses [Midje](https://github.com/marick/Midje/).
## How to run the tests
- `lein midje` will run all tests.
- `lein midje :autotest` will run all the tests indefinitely. It sets up a
watcher on the code files. If they change, only the relevant tests will be
run again.
## How to start the program
- `lein run` will run the core namespace set at `project.clj`
## Available on docker
- `docker build -t gh-webhook:latest .`
- `docker run -p 3000:3000 gh-webhook:latest`
## Http Availability
- http://localhost:3000/
*(Source: README.md from lavenderofme/MySina, Apache-2.0 license)*
# MySina
Sina Weibo (新浪微博)
*(Source: generated/resources/microsoft.resources/2019-06-01-preview/types.md from anthony-c-martin/bicep-types-az, MIT license)*
# Microsoft.Resources @ 2019-06-01-preview
## Resource Microsoft.Resources/templateSpecs@2019-06-01-preview
* **Valid Scope(s)**: ResourceGroup
### Properties
* **apiVersion**: '2019-06-01-preview' (ReadOnly, DeployTimeConstant)
* **id**: string (ReadOnly, DeployTimeConstant)
* **location**: string (Required)
* **name**: string (Required, DeployTimeConstant)
* **properties**: TemplateSpecProperties
* **systemData**: systemData (ReadOnly)
* **tags**: Dictionary<string,String>
* **type**: 'Microsoft.Resources/templateSpecs' (ReadOnly, DeployTimeConstant)
## Resource Microsoft.Resources/templateSpecs/versions@2019-06-01-preview
* **Valid Scope(s)**: ResourceGroup
### Properties
* **apiVersion**: '2019-06-01-preview' (ReadOnly, DeployTimeConstant)
* **id**: string (ReadOnly, DeployTimeConstant)
* **location**: string (Required)
* **name**: string (Required, DeployTimeConstant)
* **properties**: TemplateSpecVersionProperties (Required)
* **systemData**: systemData (ReadOnly)
* **tags**: Dictionary<string,String>
* **type**: 'Microsoft.Resources/templateSpecs/versions' (ReadOnly, DeployTimeConstant)
## TemplateSpecProperties
### Properties
* **description**: string
* **displayName**: string
* **versions**: Dictionary<string,TemplateSpecVersionInfo> (ReadOnly)
## Dictionary<string,TemplateSpecVersionInfo>
### Properties
### Additional Properties
* **Additional Properties Type**: TemplateSpecVersionInfo
## TemplateSpecVersionInfo
### Properties
* **description**: string (ReadOnly)
* **timeCreated**: string (ReadOnly)
* **timeModified**: string (ReadOnly)
## systemData
### Properties
* **createdAt**: string
* **createdBy**: string
* **createdByType**: 'Application' | 'Key' | 'ManagedIdentity' | 'User'
* **lastModifiedAt**: string
* **lastModifiedBy**: string
* **lastModifiedByType**: 'Application' | 'Key' | 'ManagedIdentity' | 'User'
## Dictionary<string,String>
### Properties
### Additional Properties
* **Additional Properties Type**: string
## TemplateSpecVersionProperties
### Properties
* **artifacts**: TemplateSpecArtifact[]
* **description**: string
* **template**: any
## TemplateSpecArtifact
* **Discriminator**: kind
### Base Properties
* **path**: string (Required)
### template
#### Properties
* **kind**: 'template' (Required)
* **template**: any (Required)
## template
### Properties
* **kind**: 'template' (Required)
* **template**: any (Required)
## Dictionary<string,String>
### Properties
### Additional Properties
* **Additional Properties Type**: string
*(Source: README.md from whoisglover/dgtipcalculator, Apache-2.0 license)*
# dgtipcalculator
Tip Calculator - Pre Work for Codepath iOS Bootcamp 2017
# Pre-work - TipCalculator
TipCalculator is a tip calculator application for iOS.
Submitted by: Danny Glover
Time spent: 12 hours in total
## User Stories
The following **required** functionality is complete:
* [X] User can enter a bill amount, choose a tip percentage, and see the tip and total values.
* [X] Settings page to change the default tip percentage.
The following **optional** features are implemented:
* [ ] UI animations
* [X] UPDATE 2: Remembering the bill amount across app restarts (if <10mins)
* [X] Using locale-specific currency and currency thousands separators.
* [X] Making sure the keyboard is always visible and the bill amount is always the first responder. This way the user doesn't have to tap anywhere to use this app. Just launch the app and start typing.
The following **additional** features are implemented:
- [X] Autolayout constraints used to ensure proper display in portrait and landscape modes
- UPDATE 1: limited app to portrait mode to ensure best user experience
- [X] UPDATE 1: Display amount for splitting the total between 1-4 people (lions). Implemented using stack views and auto layout for alignment
- [X] UPDATE 1: Use a UISlider to allow the user to adjust the tip amount (as well as default tip amount)
- [X] UPDATE 2: Add a dark mode setting, this is remembered across app restarts
## Video Walkthrough
Here's a walkthrough of implemented user stories:
<img src='http://i.imgur.com/sZKJZOc.gif' title='Video Walkthrough' width='' alt='Video Walkthrough' />
GIF created with [LiceCap](http://www.cockos.com/licecap/).
## Notes
Had some trouble with the amount displaying properly when I removed the border on the input textfield. I'm not sure if this was an issue with the code or just a display issue in the simulator. I will test further with a real device in the future.
## License
Copyright 2017 Danny Glover
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
| 40.206349 | 247 | 0.757994 | eng_Latn | 0.994216 |
9365d4c29e921fbdaaaa832d0764f97361674abc | 1,113 | md | Markdown | DataSources/McAfee/McAfee_DLP/Ps/pC_nforwardedcefmcafeeepodlp.md | avu2002/Content-Doc | 4208691b3874ac71e102f2b24ccbfa76deee297a | [
"MIT"
] | null | null | null | DataSources/McAfee/McAfee_DLP/Ps/pC_nforwardedcefmcafeeepodlp.md | avu2002/Content-Doc | 4208691b3874ac71e102f2b24ccbfa76deee297a | [
"MIT"
] | null | null | null | DataSources/McAfee/McAfee_DLP/Ps/pC_nforwardedcefmcafeeepodlp.md | avu2002/Content-Doc | 4208691b3874ac71e102f2b24ccbfa76deee297a | [
"MIT"
] | null | null | null | #### Parser Content
```Java
{
Name = n-forwarded-cef-mcafee-epo-dlp
Vendor = McAfee
Product = McAfee DLP
Lms = NitroCefSyslog
DataType = "dlp-alert"
TimeFormat = "epoch"
Conditions = [ """|McAfee|ESM|""", """|359-""", """act=alert""" ]
Fields = [
"""\send=({time}\d{1,100})""",
"""\sdeviceTranslatedAddress=({host}\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})""",
"""\ssrc=({src_ip}\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})""",
"""\sshost=({src_host}[^\s]{1,2000})""",
"""\|McAfee\|ESM\|[^\|]{1,2000}\|359-({signature_id}[^\|]{1,2000})""",
"""\seventId=({alert_id}[^\s]{1,2000})\s[^\s]{1,2000}?=""",
"""\snitroThreat_Name =({alert_name}.+?)\s[^\s]{1,2000}?=""",
"""\snitroObject_Type=({alert_type}.+?)\s[^\s]{1,2000}?=""",
"""\sduser=([^\\=]{1,2000}?\\)?({user}.+?)\s[^\s]{1,2000}?=""",
"""\sdst=({dest_ip}\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})""",
"""\snitroProcess_Name =({process}({directory}(?:[^=]{1,2000})?[\\\/])?({process_name}[^\\\/=]{1,2000}))\s[^\s]{1,2000}="""
]
DupFields = ["directory->process_directory"]
}
``` | 39.75 | 129 | 0.473495 | yue_Hant | 0.162882 |
9366d65eb7f000e7b8d295eb7d0257a15e3809bb | 412 | md | Markdown | CHANGELOG.md | TobiasAppD/Appdynamics-linux-monitoring-extension- | dafe90cf44d51b398e891140e6bed48ce2c8516a | [
"Apache-2.0"
] | 3 | 2015-12-15T14:34:23.000Z | 2021-12-16T10:31:39.000Z | CHANGELOG.md | TobiasAppD/Appdynamics-linux-monitoring-extension- | dafe90cf44d51b398e891140e6bed48ce2c8516a | [
"Apache-2.0"
] | 1 | 2019-02-26T09:25:29.000Z | 2019-02-26T09:25:29.000Z | CHANGELOG.md | TobiasAppD/Appdynamics-linux-monitoring-extension- | dafe90cf44d51b398e891140e6bed48ce2c8516a | [
"Apache-2.0"
] | 6 | 2015-07-31T18:03:09.000Z | 2020-07-08T10:32:46.000Z | # AppDynamics Extensions Linux Monitor CHANGELOG
## 2.1.5 - Jan 8, 2021
1. Updated to appd-exts-commons 2.2.4
## 2.1.4 - Jun 5, 2020
1. Disk usage bug fix
## 2.1.3 - May 8, 2020
1. Moved Linux monitor to commons 2.2.3 framework
## 2.1.1 - Jan 10, 2019
1. Moved Linux monitor to commons 2.0.0 framework
2. Moved metrics configurations from config.yml to metrics.xml
3. Added filter for CPU, Disks and NFSMounts. | 27.466667 | 62 | 0.718447 | kor_Hang | 0.454446 |
93678eaab9c1deba9495866f6d85487e4fc5041a | 34 | md | Markdown | README.md | mtianyan/Python3Database | 9e40d3f9f5cfe3a34c40ee6e2c0cf4b636236e05 | [
"Apache-2.0"
] | 8 | 2018-11-14T16:15:45.000Z | 2022-01-02T14:24:45.000Z | README.md | mtianyan/Python3Database | 9e40d3f9f5cfe3a34c40ee6e2c0cf4b636236e05 | [
"Apache-2.0"
] | null | null | null | README.md | mtianyan/Python3Database | 9e40d3f9f5cfe3a34c40ee6e2c0cf4b636236e05 | [
"Apache-2.0"
] | 4 | 2019-04-23T03:24:16.000Z | 2021-11-19T02:45:58.000Z | # Python3Database
Learning to work with the three major databases in Python
| 11.333333 | 17 | 0.882353 | zul_Latn | 0.246206 |
9367a730312b9cb777f13eb4c44080b180840ea2 | 106 | md | Markdown | .github/pull_request_template.md | cahlchang/sound-of-cthulhu | c848caa8a32a6e2913f1317e571c8506dddce8e9 | [
"MIT"
] | null | null | null | .github/pull_request_template.md | cahlchang/sound-of-cthulhu | c848caa8a32a6e2913f1317e571c8506dddce8e9 | [
"MIT"
] | 1 | 2020-07-09T10:08:41.000Z | 2020-07-09T10:08:41.000Z | .github/pull_request_template.md | cahlchang/sound-of-cthulhu | c848caa8a32a6e2913f1317e571c8506dddce8e9 | [
"MIT"
] | null | null | null | Fixed # .(issue ID)
## Purpose
- a
- b
- c
## Extra Messages
- x
- y
- z
@gimKondo (or some reviewers)
| 7.571429 | 29 | 0.575472 | eng_Latn | 0.960433 |
93685b4c43177887486713ef21a0f85a7c46c69d | 140 | md | Markdown | README.md | meziaris/myriad-claimer | 03ad5c19d4b64e7b879e5791b1fa85bf21801437 | [
"MIT"
] | null | null | null | README.md | meziaris/myriad-claimer | 03ad5c19d4b64e7b879e5791b1fa85bf21801437 | [
"MIT"
] | null | null | null | README.md | meziaris/myriad-claimer | 03ad5c19d4b64e7b879e5791b1fa85bf21801437 | [
"MIT"
] | 2 | 2021-09-01T01:54:42.000Z | 2021-11-15T02:13:25.000Z | # myriad-scraper
Run a peer in a Docker container so the app has something to connect to
`docker run -d -p 8765:8765 --name gundb gundb/gun` | 46.666667 | 71 | 0.757143 | eng_Latn | 0.989569 |
936875eebe00e86f2b3d1fc694b0520ed98bfc91 | 720 | md | Markdown | 1-machine-learning-foundations/README.md | gilbertosg/machine-learning | 5faee3b89f9100747e5188584ec4e4ce0f70ed98 | [
"MIT"
] | null | null | null | 1-machine-learning-foundations/README.md | gilbertosg/machine-learning | 5faee3b89f9100747e5188584ec4e4ce0f70ed98 | [
"MIT"
] | null | null | null | 1-machine-learning-foundations/README.md | gilbertosg/machine-learning | 5faee3b89f9100747e5188584ec4e4ce0f70ed98 | [
"MIT"
] | 2 | 2017-09-06T01:31:42.000Z | 2019-05-16T13:34:23.000Z | Machine Learning Foundations
---
### Lecture Overview
| Week | Description |
|--------------------------------------------------------------------------------------------------------------|-------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| Week 1 | Welcome |
| [Week 2](https://github.com/gilbertosg/machine-learning/tree/master/1-machine-learning-foundations/Week-2) | Regression: Predicting House Prices |
| Week 3 | Classification: Analyzing Sentiment |
| Week 4 | Clustering and Similarity: Retrieving documents |
| Week 5 | Recommending Products |
| Week 6 | Deep Learning: Searching for Images |
| 51.428571 | 276 | 0.456944 | yue_Hant | 0.149268 |
936983e6d60c05709334f2a058d0ad40112093e7 | 2,289 | md | Markdown | README.md | emertonom/Daydreamify | de140d5bbe722ec6763845eca67dc4006169279e | [
"MIT"
] | 4 | 2018-03-08T21:55:42.000Z | 2019-06-06T00:09:23.000Z | README.md | emertonom/Daydreamify | de140d5bbe722ec6763845eca67dc4006169279e | [
"MIT"
] | null | null | null | README.md | emertonom/Daydreamify | de140d5bbe722ec6763845eca67dc4006169279e | [
"MIT"
] | 1 | 2019-08-05T19:04:12.000Z | 2019-08-05T19:04:12.000Z | This project is intended to convert a "current_device_params" file
from the Cardboard format to the Daydream format, so that the headset
will be marked as supporting the Daydream controller and keyboard,
but will retain its settings for lenses and physical configuration.
The conversion is extremely simple: we append a fixed set of bytes to
the end of the file, and then add that number of bytes to the VarInt
stored starting at the 8th byte of the file (which indicates the file length).
(This may shift the contents back a bit.)
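To make the patch concrete, here is a rough Python sketch of the idea. The varint helpers follow the standard base-128 encoding; the `extra` argument is only a placeholder for the fixed Daydream byte sequence, which is not reproduced here.

```python
def read_varint(buf, pos):
    """Decode a little-endian base-128 varint starting at buf[pos]."""
    result = shift = 0
    while True:
        byte = buf[pos]
        result |= (byte & 0x7F) << shift
        pos += 1
        if not (byte & 0x80):
            return result, pos  # (value, index just past the varint)
        shift += 7

def write_varint(value):
    """Encode a non-negative integer as a varint byte string."""
    out = bytearray()
    while True:
        low = value & 0x7F
        value >>= 7
        out.append(low | 0x80 if value else low)
        if not value:
            return bytes(out)

def daydreamify(params, extra, varint_offset=7):
    """Append `extra` and grow the length varint at the 8th byte.

    `extra` is a stand-in here for the fixed byte sequence the real
    tool appends; the offset semantics are as described above.
    """
    old_len, after = read_varint(params, varint_offset)
    new_varint = write_varint(old_len + len(extra))
    # If the re-encoded varint grows, the rest of the file shifts back.
    return params[:varint_offset] + new_varint + params[after:] + extra
```

If the new length needs an extra varint byte, the tail of the file shifts back accordingly, which matches the note above.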
Instructions for use:
Prerequisites:
ADB (from the Android SDK)
Step 1:
Scan a suitable QR code for your headset, to set up lens correction
and so forth. If your device doesn't have a suitable code, you
may need to generate one using the instructions at
http://wwgc.firebaseapp.com/
Step 2:
Create or navigate to a directory you want to use for this purpose,
and open a command line there.
In the command line, execute this command:
adb pull /storage/emulated/0/Cardboard/current_device_params
This will copy the parameters for your headset onto your computer.
Step 3:
Run the daydreamify app.
Step 4:
Run this command:
adb push new_device_params /storage/emulated/0/Cardboard/current_device_params
Step 5:
Go into the phone settings, and open Apps settings.
Scroll down to Google VR Services, and open it.
Click on Storage.
On the screen that comes up, choose Clear Cache, then Clear Data.
Both the Cache line and the Data line should now show 0.
Step 6:
Run Daydream again, and proceed through the setup screens. When it gets
to the screen asking you to set up the controller, if you don't have a
controller, just hit "back" and then relaunch the Daydream app, and
it'll take you to the home Daydream screen (with app choices and such).
Step 7:
When you get to the home Daydream screen, use the hamburger menu to
bring up Settings. The headset should show as "Default."
Click on this setting. Daydream will ask for permission to access files,
then permission to record photos. Give it both. It will bring up a screen
asking you to scan a QR code. DO NOT SCAN THE QR CODE AGAIN.
Instead, hit the "back" button.
Now Daydream should still list your custom headset, but the
Controller and Keyboard options should no longer be grayed out!
| 35.215385 | 78 | 0.784622 | eng_Latn | 0.997401 |
93698f85a50bc2d9452b4024727f8e74bcfbdaa8 | 288 | md | Markdown | _posts/1974-01-04-bomb-threats-made-against-bahamian.md | MiamiMaritime/miamimaritime.github.io | d087ae8c104ca00d78813b5a974c154dfd9f3630 | [
"MIT"
] | null | null | null | _posts/1974-01-04-bomb-threats-made-against-bahamian.md | MiamiMaritime/miamimaritime.github.io | d087ae8c104ca00d78813b5a974c154dfd9f3630 | [
"MIT"
] | null | null | null | _posts/1974-01-04-bomb-threats-made-against-bahamian.md | MiamiMaritime/miamimaritime.github.io | d087ae8c104ca00d78813b5a974c154dfd9f3630 | [
"MIT"
] | null | null | null | ---
title: Bomb threats made against Bahamian
tags:
- Jan 1974
---
Bomb threats made against Bahamian ships in the Miami River cause the planned cleanup to be postponed. [Photo]
Newspapers: **Miami Morning News or The Miami Herald**
Page: **1**, Section: **B**
| 24 | 113 | 0.666667 | eng_Latn | 0.99058 |
936a0165dee7424a78d4a46c41ce71343e8b8907 | 573 | md | Markdown | _posts/2007/2007-08-07-laserjet-memory-upgrade.md | thingles/thingles.github.io | 46b8d98959bcae7ddd6e89d13070fe3e7644d40b | [
"MIT"
] | 2 | 2016-08-23T03:14:22.000Z | 2017-05-04T03:48:16.000Z | _posts/2007/2007-08-07-laserjet-memory-upgrade.md | jthingelstad/thingles.github.io | 46b8d98959bcae7ddd6e89d13070fe3e7644d40b | [
"MIT"
] | 29 | 2017-05-09T01:10:31.000Z | 2017-11-04T20:29:56.000Z | _posts/2007/2007-08-07-laserjet-memory-upgrade.md | jthingelstad/thingles.github.io | 46b8d98959bcae7ddd6e89d13070fe3e7644d40b | [
"MIT"
] | null | null | null | ---
title: LaserJet Memory Upgrade
categories:
- Techie
---
I think I'm like most people and never really think about my printer. Last week though I thought about it a lot as I waited forever to have it print some really complicated documents. I decided to take action and ordered some more memory. I just got done slapping a 128MB DIMM in my HP LaserJet 1320. Why? Because you can and it costs so very little. And I expect I'll have this printer for a good long while. It sure prints faster.

| 52.090909 | 432 | 0.755672 | eng_Latn | 0.998887 |
936a7d5d0b7e8bbdd7a53ee00c893ea08ea2b8f9 | 34 | md | Markdown | README.md | KentuckyFriedTofu/kentuckyfriedtofu.github.io | 658fcaf73970512c6d5616e3d6da36f71f1e5ef8 | [
"MIT"
] | null | null | null | README.md | KentuckyFriedTofu/kentuckyfriedtofu.github.io | 658fcaf73970512c6d5616e3d6da36f71f1e5ef8 | [
"MIT"
] | null | null | null | README.md | KentuckyFriedTofu/kentuckyfriedtofu.github.io | 658fcaf73970512c6d5616e3d6da36f71f1e5ef8 | [
"MIT"
] | null | null | null | Personal website of Junior Tidal
| 17 | 33 | 0.823529 | eng_Latn | 0.97421 |
936a7e552ef7f0907491626bcad0f62007c35cbe | 97 | md | Markdown | _pages/project.md | pymass/pymass.github.io | 8b8d780dc2bdf8a4c0ece7a714963697c559cbf4 | [
"MIT"
] | null | null | null | _pages/project.md | pymass/pymass.github.io | 8b8d780dc2bdf8a4c0ece7a714963697c559cbf4 | [
"MIT"
] | null | null | null | _pages/project.md | pymass/pymass.github.io | 8b8d780dc2bdf8a4c0ece7a714963697c559cbf4 | [
"MIT"
] | null | null | null | ---
title: "Project"
layout: category-project
permalink: /project/
author_profile: false
--- | 16.166667 | 25 | 0.701031 | eng_Latn | 0.520442 |
936aaba048704c54e9df4f839bfcc2996966c00e | 1,885 | md | Markdown | README.md | msubero/agile-engine-test | f4726ce4b0dbd90c24442c5001bb41682518ec3a | [
"MIT"
] | null | null | null | README.md | msubero/agile-engine-test | f4726ce4b0dbd90c24442c5001bb41682518ec3a | [
"MIT"
] | 3 | 2021-03-11T07:23:32.000Z | 2022-02-27T11:16:39.000Z | README.md | msubero/agile-engine-test | f4726ce4b0dbd90c24442c5001bb41682518ec3a | [
"MIT"
] | null | null | null | # Accounting notebook
Node.js and React.js money account system
## About
A web service application that emulates the logic of financial transactions (debit and credit). No real "transactional" work is done here.
## Getting Started
1. Start the server in the `projects/api` folder:
```bash
npm run dev
```
Once started, it can be accessed on port `4040`.
2. Run the app in the `projects/app` folder:
```bash
npm run start
```
Open port `3000` in the browser to view it.
## Tests
Run in the `projects/api` folder:
```bash
npm test
```
## Notes
**Considerations:**
The system emulates debit and credit operations for a single user. We will always have only one financial account.
No security is required.
No real persistence is expected. No DB integrations were made.
**Must have:**
1. Service must store the account value of the single user.
2. The service must be able to accept credit and debit financial transactions and update the account value accordingly.
3. Any transaction that results in a negative amount within the system must be rejected.
4. The application must store the transaction history.
5. Storage must be able to handle multiple transactions at the same time, where read transactions must not lock storage and write transactions must lock read and write operations.
6. Minimal scope of operations required: `GET /transactions`, `POST /transactions` and `GET /transactions/:id`.
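Requirement 5 above describes a classic readers-writer lock. As an illustration of that locking discipline only (sketched in Python for brevity; the project itself is Node.js, so this is not the project's code):

```python
import threading

class RWLock:
    """Readers-writer lock: reads run concurrently, while a write
    excludes both readers and other writers."""

    def __init__(self):
        self._cond = threading.Condition()
        self._readers = 0
        self._writing = False

    def acquire_read(self):
        with self._cond:
            while self._writing:          # reads only wait on writers
                self._cond.wait()
            self._readers += 1

    def release_read(self):
        with self._cond:
            self._readers -= 1
            if self._readers == 0:
                self._cond.notify_all()   # wake a pending writer

    def acquire_write(self):
        with self._cond:
            while self._writing or self._readers:
                self._cond.wait()
            self._writing = True          # exclusive access

    def release_write(self):
        with self._cond:
            self._writing = False
            self._cond.notify_all()
```

A read transaction would wrap its lookup in `acquire_read`/`release_read`, while a credit or debit would use the write side.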
**UI/UX requirements:**
1. Simple UI app for this web service
2. The UI app should only display the transaction history list. No other operation is required.
3. The list of transactions should be presented in an accordion manner. By default, the list shows a short summary (type and amount) of each transaction. The complete information for a transaction must be shown when the user clicks on it.
4. Credit and debit transactions must be differentiated by color.
## License
MIT
| 29.453125 | 232 | 0.766048 | eng_Latn | 0.993541 |
936abc0ebd4dbb970ad0e930961df87f02be87cd | 553 | md | Markdown | docs/API/Link.TypeExtension.md | rikedyp/link | 98f3a98188633a108e325b5d62f83cabd50b42e1 | [
"MIT"
] | 7 | 2018-04-28T18:46:00.000Z | 2022-01-24T10:43:18.000Z | docs/API/Link.TypeExtension.md | rikedyp/link | 98f3a98188633a108e325b5d62f83cabd50b42e1 | [
"MIT"
] | 348 | 2018-03-28T13:02:11.000Z | 2022-03-19T08:43:54.000Z | docs/API/Link.TypeExtension.md | rikedyp/link | 98f3a98188633a108e325b5d62f83cabd50b42e1 | [
"MIT"
] | 23 | 2018-03-28T13:10:36.000Z | 2022-03-06T11:14:20.000Z | # Link.TypeExtension
ext ← opts ⎕SE.Link.TypeExtension nc
This function is used internally by Link, but is also available for use when implementing extensions to Link using exits like `beforeRead` and `beforeWrite`.
#### Right Argument
- nameclass of item
#### Left argument
- link options namespace used as left argument of [Link.Create](Link.Create.md)
#### Result
- character vector of the extension (without leading `'.'`)\
Note that extension will be (`,'/'`) for unscripted namespaces (name class `¯9`) because they map to directories | 30.722222 | 157 | 0.735986 | eng_Latn | 0.997264 |
936b21e72b8304080c2758f94fed1679bd73f556 | 1,533 | md | Markdown | biztalk/core/role-roles-node.md | SicongLiuSimon/biztalk-docs | 85394b436d277504d9e759c655608888123785bd | [
"CC-BY-4.0",
"MIT"
] | 1 | 2019-06-14T19:45:26.000Z | 2019-06-14T19:45:26.000Z | biztalk/core/role-roles-node.md | AzureMentor/biztalk-docs | 16b211f29ad233c26d5511475c7e621760908af3 | [
"CC-BY-4.0",
"MIT"
] | 7 | 2020-01-09T22:34:58.000Z | 2020-02-18T19:42:16.000Z | biztalk/core/role-roles-node.md | AzureMentor/biztalk-docs | 16b211f29ad233c26d5511475c7e621760908af3 | [
"CC-BY-4.0",
"MIT"
] | 2 | 2017-06-23T18:30:28.000Z | 2017-11-28T01:11:25.000Z | ---
title: "Role (Roles Node) | Microsoft Docs"
ms.custom: ""
ms.date: "06/08/2017"
ms.prod: "biztalk-server"
ms.reviewer: ""
ms.suite: ""
ms.tgt_pltfrm: ""
ms.topic: "article"
helpviewer_keywords:
- "Role node [binding file]"
ms.assetid: dfe2a579-7090-4d85-87e5-d627598c4ee8
caps.latest.revision: 6
author: "MandiOhlinger"
ms.author: "mandia"
manager: "anneta"
---
# Role (Roles Node)
The Role node of the Roles node of a binding file specifies information about a role that is bound to a service that is exported with the binding file.
## Nodes in the Role node
The following table lists the properties that can be set for this node of a binding file:
|**Name**|**Node Type**|**Data Type**|**Description**|**Restrictions**|**Comments**|
|--------------|-------------------|-------------------|---------------------|----------------------|------------------|
|Name|Attribute|xs:string|Specifies the name of the role.|Not required|Default value: empty|
|RoleLinkTypeName|Attribute|xs:string|Specifies the name of the role link type associated with the role|Not required|Default value: empty|
|RoleType|Attribute|RoleRefType (SimpleType)|Specifies the role type associated with the role.|Required|Default value: none<br /><br /> Possible values include:<br /><br /> - Unknown<br />- Implements<br />- Uses|
|[Enlisted Parties](../core/enlisted-parties-role-node.md)|Record|ArrayOfEnlistedParty (ComplexType)|Container node for the enlisted parties bound to this role.|Not required|Default value: none| | 51.1 | 221 | 0.677104 | eng_Latn | 0.783133 |
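As an illustration only, a Role node carrying these properties might appear in a binding file roughly as follows (the element layout, names, and values here are hypothetical, shown to relate the table's attributes to the XML):

```xml
<Roles>
  <Role Name="ShippingProvider"
        RoleLinkTypeName="MyAssembly.ShippingRoleLinkType"
        RoleType="Uses">
    <EnlistedParties />
  </Role>
</Roles>
```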
936cc26121b46f759e179e54467b230732c68938 | 3,400 | markdown | Markdown | go/src/github.com/hashicorp/terraform/website/source/docs/providers/google/index.html.markdown | ezgikaysi/koding | 5956d6ca5f804e58452fe2aea548328fdb8acff8 | [
"Apache-2.0"
] | 1 | 2021-09-30T18:44:10.000Z | 2021-09-30T18:44:10.000Z | go/src/github.com/hashicorp/terraform/website/source/docs/providers/google/index.html.markdown | dashersw/koding | b17fe8ac8b7930b7f4b1007bfb75d12e68a24e5a | [
"Apache-2.0"
] | 176 | 2019-12-27T09:01:42.000Z | 2021-08-03T06:19:39.000Z | go/src/github.com/hashicorp/terraform/website/source/docs/providers/google/index.html.markdown | dashersw/koding | b17fe8ac8b7930b7f4b1007bfb75d12e68a24e5a | [
"Apache-2.0"
] | null | null | null | ---
layout: "google"
page_title: "Provider: Google Cloud"
sidebar_current: "docs-google-index"
description: |-
The Google Cloud provider is used to interact with Google Cloud services. The provider needs to be configured with the proper credentials before it can be used.
---
# Google Cloud Provider
The Google Cloud provider is used to interact with
[Google Cloud services](https://cloud.google.com/). The provider needs
to be configured with the proper credentials before it can be used.
Use the navigation to the left to read about the available resources.
## Example Usage
```js
// Configure the Google Cloud provider
provider "google" {
credentials = "${file("account.json")}"
project = "my-gce-project"
region = "us-central1"
}
// Create a new instance
resource "google_compute_instance" "default" {
// ...
}
```
## Configuration Reference
The following keys can be used to configure the provider.
* `credentials` - (Optional) Contents of the JSON file used to describe your
account credentials, downloaded from Google Cloud Console. More details on
retrieving this file are below. Credentials may be blank if you are running
Terraform from a GCE instance with a properly-configured [Compute Engine
Service Account](https://cloud.google.com/compute/docs/authentication). This
can also be specified using any of the following environment variables
(listed in order of precedence):
* `GOOGLE_CREDENTIALS`
* `GOOGLE_CLOUD_KEYFILE_JSON`
* `GCLOUD_KEYFILE_JSON`
* `project` - (Required) The ID of the project to apply any resources to. This
can be specified using any of the following environment variables (listed in
order of precedence):
* `GOOGLE_PROJECT`
* `GCLOUD_PROJECT`
* `CLOUDSDK_CORE_PROJECT`
* `region` - (Required) The region to operate under. This can also be specified
using any of the following environment variables (listed in order of
precedence):
* `GOOGLE_REGION`
* `GCLOUD_REGION`
* `CLOUDSDK_COMPUTE_REGION`
The following keys are supported for backwards compatibility, and may be
removed in a future version:
* `account_file` - __Deprecated: please use `credentials` instead.__
Path to or contents of the JSON file used to describe your
account credentials, downloaded from Google Cloud Console. More details on
retrieving this file are below. The `account file` can be "" if you are running
terraform from a GCE instance with a properly-configured [Compute Engine
Service Account](https://cloud.google.com/compute/docs/authentication). This
can also be specified with the `GOOGLE_ACCOUNT_FILE` shell environment
variable.
## Authentication JSON File
Authenticating with Google Cloud services requires a JSON
file which we call the _account file_.
This file is downloaded directly from the
[Google Developers Console](https://console.developers.google.com). To make
the process more straightforward, it is documented here:
1. Log into the [Google Developers Console](https://console.developers.google.com)
and select a project.
2. Click the menu button in the top left corner, and navigate to "Permissions",
then "Service accounts", and finally "Create service account".
3. Provide a name and ID in the corresponding fields, select
"Furnish a new private key", and select "JSON" as the key type.
4. Clicking "Create" will download your `credentials`.
| 35.051546 | 162 | 0.757059 | eng_Latn | 0.987839 |
936d6cc9ccb7623d369d9793f9ffea140361afa4 | 2,398 | md | Markdown | docs/about.md | xotl-uoo/scripts | 7f8dcb9ef64b5d098bec25b99b2cbc7c989a1a26 | [
"MIT"
] | null | null | null | docs/about.md | xotl-uoo/scripts | 7f8dcb9ef64b5d098bec25b99b2cbc7c989a1a26 | [
"MIT"
] | null | null | null | docs/about.md | xotl-uoo/scripts | 7f8dcb9ef64b5d098bec25b99b2cbc7c989a1a26 | [
"MIT"
] | null | null | null |
Hello, my name is **Xotl**. I joined UO Outlands in December 2020 - yes, I am that new! I don't even have a house yet! Like most players new to the server, I started looking for scripts and found several github repositories forked over and over, many in a state of limbo and certainly not maintained. Being an avid fan of Ultima Online, an experienced UOSteam scripter, and a professional programmer, I decided to start my own repository.
I am a scripting *enthusiast* and, like you, always learning how to do things better. If you see anything that needs improvement or have ideas, please send them along.
***
# Goals
## update
I am combing through all the abandoned GitHub repositories available, then auditing, testing, and publishing new versions. Since the Razor client now supports more commands, I will also provide Razor versions of these scripts when possible.
## streamline
After doing this for a few years, I have realized there is no script that will fit everyone 100%. Therefore, I try to provide versions that *most* people can use. Feel free to further customize to fit your needs. Much of my work is intended to teach you how to do something, and be a template for customization.
## test
Testing is *huge*, and every script I post will have been tested by me. To help re-testing efforts, as server code can break a script, I record the test dates with each script. Then, after a few months, I can revisit them.
## easier copying
Typical scripts on GitHub require users to go to the [RAW] view, select all the lines of the script, then copy them to the clipboard. By publishing them as code snippets, the website offers a [copy] button for you. With one click, the script is in your clipboard, ready to paste.
***
# Attribution
Attribution with these scripts is difficult, since most people just copy-paste snippets they find somewhere else, and the "true author" has probably been lost. For this reason, I do not claim to be the original author of every line of code. I will claim, however, to have audited and tested every script. You do not need to give me any attribution for scripts contained herein - please use them as you see fit and enjoy. If you send me a script, please include your name so I can at least give credit where it is due.
***
# Contact
You can send me a private message at **Xotl** in the UO Outlands Discord server. I always appreciate requests, support, and jokes.
| 109 | 517 | 0.775229 | eng_Latn | 0.999873 |
936da29e4fccb22da7db57d7a59bd39af9b01be0 | 1,467 | md | Markdown | _posts/2019-01-21-Skeleton-based-Action-Recognition-of-People-Handling-Objects.md | AMDS123/papers | 80ccfe8c852685e4829848229b22ba4736c65a7c | [
"MIT"
] | 7 | 2018-02-11T01:50:19.000Z | 2020-01-14T02:07:17.000Z | _posts/2019-01-21-Skeleton-based-Action-Recognition-of-People-Handling-Objects.md | AMDS123/papers | 80ccfe8c852685e4829848229b22ba4736c65a7c | [
"MIT"
] | null | null | null | _posts/2019-01-21-Skeleton-based-Action-Recognition-of-People-Handling-Objects.md | AMDS123/papers | 80ccfe8c852685e4829848229b22ba4736c65a7c | [
"MIT"
] | 4 | 2018-02-04T15:58:04.000Z | 2019-08-29T14:54:14.000Z | ---
layout: post
title: "Skeleton-based Action Recognition of People Handling Objects"
date: 2019-01-21 11:20:11
categories: arXiv_CV
tags: arXiv_CV Pose_Estimation Action_Recognition CNN Recognition
author: Sunoh Kim, Kimin Yun, Jongyoul Park, Jin Young Choi
mathjax: true
---
* content
{:toc}
##### Abstract
In visual surveillance systems, it is necessary to recognize the behavior of people handling objects such as a phone, a cup, or a plastic bag. In this paper, to address this problem, we propose a new framework for recognizing object-related human actions by graph convolutional networks using human and object poses. In this framework, we construct skeletal graphs of reliable human poses by selectively sampling the informative frames in a video, which include human joints with high confidence scores obtained in pose estimation. The skeletal graphs generated from the sampled frames represent human poses related to the object position in both the spatial and temporal domains, and these graphs are used as inputs to the graph convolutional networks. Through experiments over an open benchmark and our own data sets, we verify the validity of our framework in that our method outperforms the state-of-the-art method for skeleton-based action recognition.
##### Abstract (translated by Google)
##### URL
[http://arxiv.org/abs/1901.06882](http://arxiv.org/abs/1901.06882)
##### PDF
[http://arxiv.org/pdf/1901.06882](http://arxiv.org/pdf/1901.06882)
| 56.423077 | 957 | 0.788684 | eng_Latn | 0.988237 |
936dbb7ddec2346702024c77583d803d54522bcf | 122 | md | Markdown | README.md | friskycodeur/tell-your-story | 92c89417a3e12c0b95f174ff74768aa924a45d4a | [
"Apache-2.0"
] | null | null | null | README.md | friskycodeur/tell-your-story | 92c89417a3e12c0b95f174ff74768aa924a45d4a | [
"Apache-2.0"
] | null | null | null | README.md | friskycodeur/tell-your-story | 92c89417a3e12c0b95f174ff74768aa924a45d4a | [
"Apache-2.0"
] | null | null | null | # tell-your-story
A platform to tell your story to the world about the time you were at your lowest in life, and yet you're still here.
| 40.666667 | 103 | 0.762295 | eng_Latn | 0.99994 |
936df88852c886a2272ccb7c1d05dd07059f627c | 1,384 | md | Markdown | packages/babel-plugin-relay/README.md | lpalmes/relay | 70bf47751a61ad14798f6dea14fe59e2c8a90aa3 | [
"MIT"
] | null | null | null | packages/babel-plugin-relay/README.md | lpalmes/relay | 70bf47751a61ad14798f6dea14fe59e2c8a90aa3 | [
"MIT"
] | null | null | null | packages/babel-plugin-relay/README.md | lpalmes/relay | 70bf47751a61ad14798f6dea14fe59e2c8a90aa3 | [
"MIT"
] | null | null | null | ## babel-plugin-relay
Relay requires a Babel plugin to convert GraphQL tags to runtime artifacts.
A _very_ simplified example of what this plugin is doing:
```js
// It converts this code
const fragment = graphql`
fragment User_fragment on User {
name
}
`;
// To require generated ASTs for fragments and queries
const fragment = require('__generated__/User_fragment.graphql');
```
## Plugin Configuration
`babel-plugin-relay` will discover the config if:
- There is a `relay.config.json`, `relay.config.js` file at the root of the
project (i.e. in the same folder as the `package.json` file).
- The `package.json` file contains a `"relay"` key.
### Supported configuration options for `babel-plugin-relay`
- `artifactDirectory` A specific directory to output all artifacts to. When
  enabling this, the compiler and the babel plugin must use the same
  `artifactDirectory` value. [string]
- `eagerEsModules` This option enables emitting ES modules artifacts.
[boolean][default: false]
- `codegenCommand` The command to run to compile Relay files. [string]
- `isDevVariableName` Name of the global variable for dev mode (e.g. `__DEV__`).
[string]
- `jsModuleFormat` Formatting style for generated files. `commonjs` or `haste`.
Default is `commonjs`. [string]
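Putting these together, a `relay.config.json` exercising the options above might look like this (values are illustrative, and a real config usually also carries compiler settings such as `src` and `schema`):

```json
{
  "artifactDirectory": "./src/__generated__",
  "eagerEsModules": false,
  "codegenCommand": "relay-compiler",
  "isDevVariableName": "__DEV__",
  "jsModuleFormat": "commonjs"
}
```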
[Configuration Instructions](https://relay.dev/docs/getting-started/installation-and-setup/#set-up-babel-plugin-relay)
| 33.756098 | 118 | 0.749277 | eng_Latn | 0.928378 |
936e3bab61311309773ca59d0cdf3ad73a0aeaa7 | 1,903 | md | Markdown | README.md | pop-os/progress-streams | 8d6a6bbc7780f9ead55f241439ddd0116db4a50c | [
"MIT"
] | 19 | 2018-10-30T02:01:31.000Z | 2021-07-09T12:37:55.000Z | README.md | pop-os/progress-streams | 8d6a6bbc7780f9ead55f241439ddd0116db4a50c | [
"MIT"
] | 1 | 2019-06-06T18:59:00.000Z | 2019-07-25T17:30:08.000Z | README.md | pop-os/progress-streams | 8d6a6bbc7780f9ead55f241439ddd0116db4a50c | [
"MIT"
] | 5 | 2018-12-02T19:15:32.000Z | 2021-11-11T08:07:18.000Z | # progress-streams
Rust crate to provide progress callbacks for types which implement `io::Read` or `io::Write`.
## Examples
### Reader
```rust
extern crate progress_streams;
use progress_streams::ProgressReader;
use std::fs::File;
use std::io::Read;
use std::sync::Arc;
use std::sync::atomic::{AtomicUsize, Ordering};
use std::thread;
use std::time::Duration;
fn main() {
let total = Arc::new(AtomicUsize::new(0));
let mut file = File::open("/dev/urandom").unwrap();
let mut reader = ProgressReader::new(&mut file, |progress: usize| {
total.fetch_add(progress, Ordering::SeqCst);
});
{
let total = total.clone();
thread::spawn(move || {
loop {
println!("Read {} KiB", total.load(Ordering::SeqCst) / 1024);
thread::sleep(Duration::from_millis(16));
}
});
}
let mut buffer = [0u8; 8192];
while total.load(Ordering::SeqCst) < 100 * 1024 * 1024 {
reader.read(&mut buffer).unwrap();
}
}
```
### Writer
```rust
extern crate progress_streams;
use progress_streams::ProgressWriter;
use std::io::{Cursor, Write};
use std::sync::Arc;
use std::sync::atomic::{AtomicUsize, Ordering};
use std::thread;
use std::time::Duration;
fn main() {
let total = Arc::new(AtomicUsize::new(0));
let mut file = Cursor::new(Vec::new());
let mut writer = ProgressWriter::new(&mut file, |progress: usize| {
total.fetch_add(progress, Ordering::SeqCst);
});
{
let total = total.clone();
thread::spawn(move || {
loop {
                println!("Written {} KiB", total.load(Ordering::SeqCst) / 1024);
thread::sleep(Duration::from_millis(16));
}
});
}
let buffer = [0u8; 8192];
while total.load(Ordering::SeqCst) < 1000 * 1024 * 1024 {
writer.write(&buffer).unwrap();
}
}
```
936e3cf4d682e6a6db271a5bf249431e8a304fab | 864 | md | Markdown | README.md | Benji377/Z80Emu | Apache-2.0 | 1 star | 7 issues | 2 forks

# Z80Emu
## YAZ80Emu - Yet another Z80 emulator!
Yes, yet another Z80 emulator.\
It's more of a passion project: we are writing a Z80 emulator just for fun.
## Limitations
Currently there are some limitations:
- The "RST 0x.." instruction is not fully tested
- The "EX .." and "EXX" instructions are not validated but are considered working (hopefully)
- In the lower part of the main instruction set, a lot of instructions are still unimplemented
- The extended instructions cannot be accessed currently
- The IX instructions cannot be accessed currently, and consequently neither can the IXCB instructions
- The IY instructions cannot be accessed currently, and consequently neither can the IYCB instructions
### We are working on testing, researching, and validating every instruction (some just take time)
If someone is interested in supporting us, validating instructions would be a great way to help!
936ec982cfb7ce2dc68c55cdb9155699a520aeb4 | 2,623 | md | Markdown | index.md | hfpostma/hfpostma.github.io | MIT

---
layout: front
title: Home
tagline: Supporting tagline
published: true
---
{% include JB/setup %}
<a href="https://www.flickr.com/photos/desiitaly/2304874364" title="View photo on Flickr" target="_blank"><img src="https://farm3.staticflickr.com/2372/2304874364_b7ea60191e_o.jpg" style="width: 800px;"></a><br />
<h5><a href="https://www.flickr.com/people/desiitaly/" title="View user on Flickr" target="_blank">Credit</a></h5>
## Instructions on how to publish to your [Jekyll](http://jekyllrb.com/) website hosted on [GitHub](https://github.com/) using [Prose](http://prose.io/#hfpostma) with [Markdown](http://daringfireball.net/projects/markdown/syntax).
For your convenience these instructions can also be found at [http://www.manfred.postma.website/instructions.html](http://www.manfred.postma.website/instructions.html). Make sure you bookmark it, as no menu entry exists to reach it otherwise.
Described is just a user friendly way to edit and publish pages on your website. There are other ways to do the same thing.
* After logging in on GitHub, go to [http://prose.io/#hfpostma/hfpostma.github.io](http://prose.io/#hfpostma/hfpostma.github.io).
* All files that end with **.md** or **.html** are the texts of your site’s pages. **md** stands for [Markdown](http://www.google.com/url?q=http%3A%2F%2Fdaringfireball.net%2Fprojects%2Fmarkdown%2Fsyntax&sa=D&sntz=1&usg=AFQjCNEptifCNdy4hoJdAyCKUye3PfRngA) which is an easy-to-learn-and-read formatting language that can be interpreted by GitHub and many other applications as a popular standard for web publishers. The nice thing about it is that you can use normal HTML inside of it that will work as well. Thats nice for developers. See also [https://help.github.com/articles/markdown-basics](https://help.github.com/articles/markdown-basics/) and [https://guides.github.com/features/mastering-markdown/](https://guides.github.com/features/mastering-markdown/).
* To preview, click the eye icon at the top right.
* Once you are done, click the _Save_ (disk) icon on the right and proceed with _Commit_.
## Feel free to contact me
<a href="http://www.mousewheel.net/contact" target="_blank" title="My contact form on mousewheel.net"><span class="signs">✍</span> Leave a message</a> or <a href="tel:+201008951369">call me directly<br />
<span class="signs">☎</span> +20 100 89 51 369</a> <a href="https://www.timeanddate.com/worldclock/italy/milan" target="_blank"><span class="signs">⌚</span> CET (UTC+02:00)</a>
<a href="/terms.html#top" title="My Terms"><b><< PREV</b></a> | <a href="/data.html#top" title="Personal info"><b>NEXT >></b></a>
936eef9c6494d3ad24e4a7b7e1a74729674400cf | 15,463 | md | Markdown | README.md | jluisestrada/UniversalFormsToolkit | MIT | 1 star

# UniversalFormsToolkit
This small UWP framework for Windows 10 builds input forms that can be used in your views, based only on the classes defined in your models folder.
This framework also has a small set of useful controls to help developers build UWP applications with the PRISM framework.
## Featured Controls
Modal Dialog (Completed)
Window Manager (In Progress)
More details about the controls here.
https://github.com/RikardoPons/UniversalFormsToolkit/blob/master/CONTROLS_README.md
## Nuget Package
https://www.nuget.org/packages/UniversalFormsToolkit.Prism.Controls/
## Features:
- Create an input form from a business object defined in your models.
- You can set the order of the controls using attributes.
- This framework creates common controls like text boxes, combo boxes, DateTimePicker, NumericUpDown and many more.
- Make your forms read-only.
- Add simple validations to your form.
See also:
- UniversalFormsToolkit works better with validations through MVVM Validation Helpers.
- You can add your own logic to validate your business objects.
## Nuget:
Available on NuGet:
https://www.nuget.org/packages/UniversalFormsToolkit/
Install-Package UniversalFormsToolkit
## Example:
#### 1 - Let's start with a basic class *Student*
```csharp
public class Student
{
    private int ID { get; set; }
public string FirstName { get; set; }
public string LastName{ get; set; }
public DateTime Birthday{ get; set; }
public int Semester{ get; set; }
}
```
#### 2 - After installing the NuGet package, we must use the `AutoGenerateForm.Uwp` namespace and, finally, add "annotations" to the properties of our class that we want to display in the form.
```csharp
using AutoGenerateForm.Uwp;
public class Student
{
//If we don't want the property to be in
//the form we just don't add an annotation
    private int ID { get; set; }
[AutoGenerateProperty]
    [Display("First Name")]//These two annotations create one text box called "First Name"
public string FirstName { get; set; }
[AutoGenerateProperty]
[Display("Last Name")]
public string LastName{ get; set; }
[AutoGenerateProperty]
[Display("Birthday")]
public DateTime Birthday{ get; set; }
[AutoGenerateProperty]
[Display("Semester")]
public int Semester{ get; set; }
}
```
#### 3 - In the MainPage, we can use our class like this:
```csharp
public sealed partial class MainPage : Page
{
public MainPage()
{
this.InitializeComponent();
this.MyStudent = new Student()
{
FirstName = "Brus",
LastName = "Medina",
                Birthday = new DateTime(),
Semester = 6
};
        autogenerator.CurrentDataContext = MyStudent; // bind the object to the generated form
}
#region Property
private Student myStudent;
public Student MyStudent
{
get { return myStudent; }
set { myStudent = value; }
}
#endregion
protected override void OnNavigatedTo(NavigationEventArgs e)
{
base.OnNavigatedTo(e);
}
}
```
#### 4 - Finally, we set up our XAML as follows:
```xaml
<Grid Background="{ThemeResource ApplicationPageBackgroundThemeBrush}">
<autogenerator:AutoGenerator Margin="24" x:Name="autogenerator" TitleForm="My Student"/>
</Grid>
```
#### 5 - Our view will look like this

#### 6 - Master-Details & Validations using MVVM Validation Helpers (https://www.nuget.org/packages/MvvmValidation/2.0.2)
###### You can use your own validation logic.
##### 7 - We need to implement IValidatable and INotifyDataErrorInfo in our model.
```csharp
public class Student : PropertyChangeBase, IValidatable, INotifyDataErrorInfo
{
private string name;
[AutoGenerateProperty]
[Display("Name")]
public string Name
{
get { return name; }
set
{
name = value;
NotifyPropertyChanged();
Validator.Validate(() => Name);
}
}
private string lastName;
[AutoGenerateProperty]
[Display("Last Name")]
public string LastName
{
get { return lastName; }
set
{
lastName = value;
NotifyPropertyChanged();
                Validator.Validate(() => LastName);
}
}
private int age;
[IsNumeric]
[Range(0,1000)]
[DecimalCount(2)]
[AutoIncrement(2)]
[AutoGenerateProperty]
[Display("Age")]
public int Age
{
get { return age; }
set
{
age = value;
NotifyPropertyChanged();
}
}
private string phoneNumber;
[AutoGenerateProperty]
[Display("Phone Number")]
public string PhoneNumber
{
get
{
return phoneNumber;
}
set
{
phoneNumber = value;
NotifyPropertyChanged();
}
}
private ObservableCollection<Course> courses;
[AutoGenerateProperty]
[DisplayMemberPathCollection("Name")]
[SelectedItemCollection("SelectedCourse")]
public ObservableCollection<Course> Courses
{
get { return courses; }
set
{
courses = value;
NotifyPropertyChanged();
}
}
private Course selectedCourse;
[Display("Courses")]
public Course SelectedCourse
{
get { return selectedCourse; }
set
{
selectedCourse = value;
NotifyPropertyChanged();
}
}
private DateTime birthDay;
[AutoGenerateProperty]
[Display("Birthday")]
public DateTime BirthDay
{
get { return birthDay; }
set
{
birthDay = value;
NotifyPropertyChanged();
}
}
#region Validations
private ObservableCollection<AutoGenerateForm.Uwp.Models.ValidationModel> validations;
public ObservableCollection<AutoGenerateForm.Uwp.Models.ValidationModel> Validations
{
get { return validations; }
set
{
if (validations == value)
return;
validations = value;
NotifyPropertyChanged();
}
}
public Student()
{
Validator = new ValidationHelper();
this.PropertyChanged += Student_PropertyChanged;
NotifyDataErrorInfoAdapter = new NotifyDataErrorInfoAdapter(Validator);
OnCreated();
ConfigureRules();
}
private async void Student_PropertyChanged(object sender, PropertyChangedEventArgs e)
{
if (e.PropertyName == "Validations")
return;
var result = await Validator.ValidateAllAsync();
if (result != null && result.IsValid)
{
Validations = new ObservableCollection<AutoGenerateForm.Uwp.Models.ValidationModel>();
return;
}
var errors = new List<AutoGenerateForm.Uwp.Models.ValidationModel>();
if (result != null && result.ErrorList!=null && !result.IsValid)
{
foreach (var item in result.ErrorList)
{
var error = new AutoGenerateForm.Uwp.Models.ValidationModel();
var property = this.GetType().GetProperty(item.Target.ToString());
if (property != null)
{
var displayAttribute = AttributeHelper<DisplayAttribute>.GetAttributeValue(property);
error.Label = displayAttribute?.Label;
}
error.ErrorMessage = item.ErrorText;
error.ParentPropertyName = "";
error.PropertyName = item.Target.ToString();
if (item.Target.Equals("SelectedCourse"))
{
error.PropertyName = "Courses";
error.ParentPropertyName = "";
}
errors.Add(error);
}
}
this.Validations = new ObservableCollection<AutoGenerateForm.Uwp.Models.ValidationModel>(errors);
}
private void ConfigureRules()
{
Validator.AddRequiredRule(() => Name, "Name is required field");
Validator.AddRequiredRule(() => LastName, "Last Name is required field");
validator.AddRequiredRule(() => SelectedCourse, "Please select a course");
}
void OnCreated()
{
}
Task<ValidationResult> IValidatable.Validate()
{
return Validator.ValidateAllAsync();
}
public IEnumerable GetErrors(string propertyName)
{
return NotifyDataErrorInfoAdapter.GetErrors(propertyName);
}
public bool HasErrors
{
get
{
return NotifyDataErrorInfoAdapter.HasErrors;
}
}
public event EventHandler<DataErrorsChangedEventArgs> ErrorsChanged
{
add
{
NotifyDataErrorInfoAdapter.ErrorsChanged += value;
}
remove
{
NotifyDataErrorInfoAdapter.ErrorsChanged -= value;
}
}
#endregion
#region EditableEntity Interface implementation
private NotifyDataErrorInfoAdapter notifyDataErrorInfoAdapter;
private ValidationHelper validator;
public ValidationHelper Validator
{
get
{
return validator;
}
set
{
validator = value;
}
}
public NotifyDataErrorInfoAdapter NotifyDataErrorInfoAdapter
{
get
{
return notifyDataErrorInfoAdapter;
}
set
{
notifyDataErrorInfoAdapter = value;
}
}
#endregion
}
```
##### 8 - Nothing special here: just a `Course` class that we use in our `ObservableCollection<Course>` to select the course for each Student
```csharp
public class Course:PropertyChangeBase
{
public string Name { get; set; }
}
```
##### 9 - The Group class that contains the collection of our Students.
```csharp
public class Group : PropertyChangeBase
{
private ObservableCollection<Student> students;
public ObservableCollection<Student> Students
{
get
{
return students;
}
set
{
students = value;
NotifyPropertyChanged();
}
}
}
```
##### 10 - Here is our ViewModel. As you can see, in our Master-Detail scenario we need to decorate the SelectedStudent property to render the form.
```csharp
public class MainPageViewModel: PropertyChangeBase
{
private Models.Group myGroup;
public Models.Group MyGroup
{
get { return myGroup; }
set
{
myGroup = value;
NotifyPropertyChanged();
}
}
private Student selectedStudent;
[AutoGenerateProperty]
public Student SelectedStudent
{
get { return selectedStudent; }
set
{
selectedStudent = value;
NotifyPropertyChanged();
}
}
public MainPageViewModel()
{
var list = new List<Course>()
{
new Course
{
Name= "Programming",
},
new Course
{
Name="Design"
}
};
MyGroup = new Models.Group();
MyGroup.Students = new ObservableCollection<Student>()
{
new Student
{
Age=20,
Name="Ricardo",
BirthDay= new System.DateTime(1996,12,20),
LastName="Pons",
Courses= new ObservableCollection<Course>(list)
},
new Student
{
Age=22,
Name="Bruno",
LastName="Medina",
Courses= new ObservableCollection<Course>(list)
},
new Student
{
Age=21,
LastName="Ronald",
Name="Becker",
Courses= new ObservableCollection<Course>(list)
}
};
}
}
```
##### 11 - Finally, we create our View.
```xaml
<Grid Background="{ThemeResource ApplicationPageBackgroundThemeBrush}">
<Grid.ColumnDefinitions>
<ColumnDefinition Width="277*" />
<ColumnDefinition Width="652*" />
<ColumnDefinition Width="351*" />
</Grid.ColumnDefinitions>
<Grid>
<Grid.RowDefinitions>
<RowDefinition Height="31*" />
<RowDefinition Height="329*" />
</Grid.RowDefinitions>
<TextBlock FontSize="26">My Group</TextBlock>
<ListView
Grid.Row="1"
ItemsSource="{Binding MyGroup.Students, Mode=TwoWay}"
SelectedItem="{Binding SelectedStudent, Mode=TwoWay}">
<ListView.ItemTemplate>
<DataTemplate>
<TextBlock Text="{Binding Name, Mode=TwoWay}" />
</DataTemplate>
</ListView.ItemTemplate>
</ListView>
</Grid>
<autogenerator:AutoGenerator
Grid.Column="1"
Margin="24"
CurrentDataContext="{Binding SelectedStudent, Mode=TwoWay}"
ValidationCollection="{Binding SelectedStudent.Validations, Mode=TwoWay}" />
<ListView Grid.Column="2" Margin="12" ItemsSource="{Binding SelectedStudent.Validations,Mode=TwoWay,UpdateSourceTrigger=PropertyChanged}">
<ListView.ItemTemplate>
<DataTemplate>
<StackPanel>
<TextBlock Text="{Binding Label, Mode=TwoWay}" Foreground="Red" />
<TextBlock Text="{Binding ErrorMessage, Mode=TwoWay}" Margin="0,8,0,0"/>
</StackPanel>
</DataTemplate>
</ListView.ItemTemplate>
</ListView>
</Grid>
```
#### 12 - Our view will look like this

936f7eab42f7e2ebe7bfd6eeeadef0b05187782e | 8,131 | md | Markdown | docs/policy/rdt.md | isabella232/cri-resource-manager | Apache-2.0 | 1 star | 1 issue

# RDT (Intel® Resource Director Technology)
## Background
Intel® RDT provides capabilities for cache and memory allocation and
monitoring. In Linux systems the functionality is exposed to the user space via
the [resctrl](https://www.kernel.org/doc/Documentation/x86/intel_rdt_ui.txt)
filesystem. Cache and memory allocation in RDT is handled by using resource
control groups. Resource allocation is specified on the group level and each
task (process/thread) is assigned to one group. In the context of CRI Resource
Manager, we use the term 'RDT class' instead of 'resource control group'.
CRI Resource Manager supports all available RDT technologies, i.e. L3 Cache
Allocation (CAT) with Code and Data Prioritization (CDP) and Memory Bandwidth
Allocation (MBA) plus Cache Monitoring (CMT) and Memory Bandwidth Monitoring
(MBM).
## Overview
RDT configuration in CRI-RM is class-based. Each container gets assigned to an
RDT class. In turn, all processes of the container will be assigned to the RDT
CLOS (under `/sys/fs/resctrl`) corresponding to the RDT class. CRI-RM will configure
the CLOSes according to its configuration at startup or whenever the
configuration changes.
By default there is a direct mapping between Pod QoS classes and RDT classes:
the containers of the Pod get an RDT class with the same name as its QoS class
(Guaranteed, Burstable or Besteffort). However, that can be overridden with the
`rdtclass.cri-resource-manager.intel.com` Pod annotation. You can also specify
RDT classes other than Guaranteed, Burstable or Besteffort. In this case, the
Pod can only be assigned to these classes with the Pod annotation, though.
The default behavior can also be overridden by a policy but currently none of
the builtin policies do that.
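As a sketch, opting a Pod into a non-default RDT class with the annotation mentioned above could look like this (the class name `LowLatency` and the Pod details are illustrative assumptions; `LowLatency` would have to exist in the class configuration described below):

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: low-latency-app
  annotations:
    # Overrides the default QoS-class-to-RDT-class mapping for this Pod.
    rdtclass.cri-resource-manager.intel.com: LowLatency
spec:
  containers:
    - name: app
      image: example/app:latest
```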
## Configuration
### Operating Modes
The RDT controller supports three operating modes, controlled by the
`rdt.options.mode` configuration option.
- Disabled: RDT controller is effectively disabled and containers will not be
assigned and no monitoring groups will be created. Upon activation of this
mode all CRI-RM specific control and monitoring groups from the resctrl
filesystem are removed.
- Discovery: RDT controller detects existing non-CRI-RM specific classes from
the resctrl filesystem and uses these. The configuration of the discovered
classes is considered read-only and it will not be altered. Upon activation
of this mode all CRI-RM specific control groups from the resctrl filesystem
are removed.
- Full: Full operating mode. The controller manages the configuration of the
resctrl filesystem according to the rdt class definitions in the CRI-RM
  configuration. This is the default operating mode.
### RDT Classes
The RDT class configuration in CRI-RM is a two-level hierarchy consisting of
partitions and classes. It specifies a set of partitions each having a set of
classes.
#### Partitions
Partitions represent a logical grouping of the underlying classes, each
partition specifying a portion of the available resources (L3/MB) which will be
shared by the classes under it. Partitions guarantee non-overlapping exclusive
cache allocation - i.e. no overlap on the cache ways between partitions is
allowed. However, by technology, MB allocations are not exclusive. Thus, it is
possible to assign all partitions 100% of memory bandwidth, for example.
#### Classes
Classes represent the actual RDT classes containers are assigned to. In
contrast to partitions, cache allocation between classes under a specific
partition may overlap (and they usually do).
The set of RDT classes can be freely specified, but, it should be ensured that
classes corresponding to the Pod QoS classes are specified. Also, the maximum
number of classes (CLOSes) supported by the underlying hardware must not be
exceeded.
### Example
Below is a config snippet that would allocate (ca.) 60% of the cache lines
exclusively to the Guaranteed class. The remaining 40% is for Burstable and
Besteffort, Besteffort getting only 50% of this. Guaranteed class gets full
memory bandwidth whereas the other classes are throttled to 50%.
```yaml
metadata:
name: cri-resmgr-config.default
namespace: kube-system
data:
...
rdt: |+
# Common options
options:
# One of Full, Discovery or Disabled
mode: Full
# Set to true to disable creation of monitoring groups
monitoringDisabled: false
l3:
# Make this false if CAT must be available
optional: true
mb:
# Make this false if MBA must be available
optional: true
partitions:
exclusive:
# Allocate 60% of all cache IDs to the "exclusive" partition
l3Allocation: "60%"
mbAllocation: ["100%"]
classes:
Guaranteed:
            # Allocate all of the partition's cache lines to "Guaranteed"
l3Schema: "100%"
shared:
# Allocate 40% of all cache IDs to the "shared" partition
# These will NOT overlap with the cache lines allocated for "exclusive" partition
l3Allocation: "40%"
mbAllocation: ["50%"]
classes:
Burstable:
# Allow "Burstable" to use all cache lines of the "shared" partition
l3Schema: "100%"
BestEffort:
# Allow "Besteffort" to use half of the cache lines of the "shared" partition
# These will overlap with those used by "Burstable"
l3Schema: "50%"
```
The configuration also supports far more fine-grained control, e.g. per
cache-ID configuration (i.e. different sockets having different allocation) and
Code and Data Prioritization (CDP) allowing different cache allocation for code
and data paths.
```yaml
...
partitions:
exclusive:
l3Allocation: "60%"
mbAllocation: ["100%"]
classes:
# Automatically gets 100% of what was allocated for the partition
Guaranteed:
shared:
l3Allocation:
# 'all' denotes the default and must be specified
all: "40%"
        # Specific cache allocation for cache-ids 2 and 3
2-3: "20%"
mbAllocation: ["100%"]
classes:
Burstable:
l3Schema:
all:
unified: "100%"
code: "100%"
data: "80%"
mbSchema:
all: ["80%"]
2-3: ["50%"]
...
...
```
In addition, if the hardware details are known, raw bitmasks or bit numbers
("0x1f" or 0-4) can be used instead of percentages in order to be able to
configure cache allocations exactly as required. The bits in this case
correspond to those in /sys/fs/resctrl/ bitmasks. You can also mix relative
(percentage) and absolute (bitmask) allocations. For cases where the resctrl
filesystem is mounted with `-o mba_MBps`, memory bandwidth must be specified in
MBps.
```yaml
...
partitions:
exclusive:
# Specify bitmask in bit numbers
l3Allocation: "8-19"
# MBps value takes effect when resctrl mount option mba_MBps is used
mbAllocation: ["100%", "100000MBps"]
classes:
# Automatically gets 100% of what was allocated for the partition
Guaranteed:
shared:
# Explicit bitmask
l3Allocation: "0xff"
mbAllocation: ["50%", "2000MBps"]
classes:
# Burstable gets 100% of what was allocated for the partition
Burstable:
BestEffort:
l3Schema: "50%"
# Besteffort gets 50% of the 50% (i.e. 25% of total) or 1000MBps
mbSchema: ["50%", "1000MBps"]
```
See `rdt` in the [example ConfigMap spec](/sample-configs/cri-resmgr-configmap.example.yaml)
for another example configuration.
### Dynamic Configuration
RDT supports dynamic configuration, i.e. the resctrl filesystem is reconfigured
whenever a configuration update (e.g. via the [Node Agent](../node-agent.md)) is
received. However, the configuration update is rejected if it is incompatible
with the set of currently running containers - e.g. the new config is missing a
class that a running container has been assigned to.
9371b5bf25cc3d6e11fe1a17d9bcfd836a0243f1 | 6,990 | md | Markdown | README.md | rhyanbridle/cordova-plugin-sms-retriever | MIT | 1 star

| License | Platform | Contribute |
| --- | --- | --- |
|  |  | [](https://www.paypal.com/cgi-bin/webscr?cmd=_s-xclick&hosted_button_id=G33QACCVKYD7U) |
# cordova-plugin-sms-retriever
Cordova plugin to receive verification SMS in Android using the [SMS Retriever API](https://developers.google.com/identity/sms-retriever/overview).
## Installation
Add the plugin with Cordova CLI (v6 or later):
```bash
cordova plugin add cordova-plugin-sms-retriever
```
Add the dependent [cordova-support-google-services](https://github.com/chemerisuk/cordova-support-google-services "cordova-support-google-services") plugin:
```bash
cordova plugin add cordova-support-google-services
```
Create your project and Android app in [Firebase Console](https://console.firebase.google.com/ "Firebase Console"), then download the **google-services.json** file into your `platforms/android` folder.
[Sign your APK](https://cordova.apache.org/docs/en/latest/guide/platforms/android/#signing-an-app "sign your APK") with a keystore file if you haven't done it already.
## Methods
### SMSRetriever.startWatch(successCallback, failureCallback)
Start listening for a single incoming [verification SMS](https://developers.google.com/identity/sms-retriever/verify#1_construct_a_verification_message "verification SMS"). This will later raise the **onSMSArrive** event when a valid SMS arrives. Example usage:
```javascript
SMSRetriever.startWatch(function(msg) {
// Wait for incoming SMS
console.log(msg);
}, function(err) {
// Failed to start watching for SMS
console.error(err);
});
```
**Notice:** The API will time out **5 minutes** after starting if no valid SMS has arrived. Also, the API stops listening for SMS after a valid SMS has been detected.
### SMSRetriever.getHashString(successCallback, failureCallback)
Get the 11-character hash string for your app using the [AppSignatureHelper](https://github.com/googlesamples/android-credentials/blob/master/sms-verification/android/app/src/main/java/com/google/samples/smartlock/sms_verify/AppSignatureHelper.java "AppSignatureHelper") class.
```javascript
SMSRetriever.getHashString(function(hash) {
// Hash string returned OK
console.log(hash);
}, function(err) {
// Error retrieving hash string
console.error(err);
});
```
**Warning:** Google advises against dynamically retrieving your hash code before sending the SMS:
> Do not use hash strings dynamically computed on the client in your verification messages.
Therefore, do **not** invoke this method from the published app. The hash is the same for all users, and bound to your keystore signing keys, so you can get it once and never again call the `getHashString()` method.
## Events
### onSMSArrive
Triggered when a [verification SMS](https://developers.google.com/identity/sms-retriever/verify#1_construct_a_verification_message "verification SMS") with the proper 11-character hash string has arrived. You need to call **startWatch()** first. Example usage:
```javascript
document.addEventListener('onSMSArrive', function(args) {
// SMS arrived, get its contents
console.info(args.message);
// To Do: Extract the received one-time code and verify it on your server
});
```
## Construct a verification SMS
The verification SMS message you send to the user must:
- Be no longer than 140 bytes
- Begin with the prefix `<#>`
- Contain a one-time code that the client sends back to your server to complete the verification flow
- End with an 11-character hash string that identifies your app
Otherwise, the contents of the verification message can be whatever you choose. It is helpful to create a message from which you can easily extract the one-time code later on. For example, a valid verification message might look like the following:
    <#> 123ABC is your ExampleApp code. FAw9qCX9VSu
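Inside the **onSMSArrive** handler you still have to pull the one-time code out of `args.message` yourself. As a rough sketch (the regex assumes the code is the first alphanumeric token after the `<#>` prefix, which is a formatting choice of your message, not something the plugin enforces):

```javascript
// Extract the one-time code from a verification SMS such as
// "<#> 123ABC is your ExampleApp code. FAw9qCX9VSu".
function extractOneTimeCode(message) {
    var match = /<#>\s*([A-Za-z0-9]+)/.exec(message);
    return match ? match[1] : null; // null when the message does not match
}

console.log(extractOneTimeCode('<#> 123ABC is your ExampleApp code. FAw9qCX9VSu')); // → 123ABC
```

The extracted code is what you would then send to your server to complete the verification flow.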
## Demo App
To test this plugin in a Cordova app using the provided sample:
1. Create a blank cordova app as you regularly do.
2. Install it following the previous instructions.
3. Replace your `www` folder with the one provided here at the `demo` folder
4. Start the app in your emulator or device and test the plugin.
5. When you are satisfied, kindly send a donation using the PayPal button on this page.
## Screenshots
Here are some screens from the **SMSReceiverDemo** sample app included in the demo folder. Feel free to try this demo in whatever device you find.


## About this Plugin
### Prerequisites
This plugin requires Google Play Services version 10.2 or newer in order to work properly. You must also install the Google Play Services support plugin as explained in the **Installation** section.
### Does the plugin work in the Android emulator?
The plugin will work in your emulator as long as you are using a **Google Play** ABI or System Image, instead of the regular Google APIs ones. This is because these images include the Google Play Store and Google Play Services.
### Does the plugin still work with the app minimized?
When the app is sent to the background, as long as Android has not unloaded it to recover memory, SMS watching will remain active and working correctly for 5 minutes.
### Does the plugin require SMS permissions?
**No**, the plugin does not require any kind of permissions because it relies on the [SMS Retriever API](https://developers.google.com/identity/sms-retriever/overview "SMS Retriever API") created by Google.
### How has this plugin been tested?
I have tested this plugin with success on:
- Android 5.1.1 emulator
- Android 6.0 emulator
- Android 7.1.1 emulator
- BLU Energy Mini (Android 5.1 Lollipop)
- BLU Vivo 5 Mini (Android 6.0 Marshmallow)
- Samsung Galaxy I9190 (Android 4.4.2 KitKat)
## Contributing
Please consider contributing with a small **donation** using the PayPal button if you like this plugin and it works as expected. No PayPal account is needed.
[](https://www.paypal.com/cgi-bin/webscr?cmd=_s-xclick&hosted_button_id=G33QACCVKYD7U)
For support, you may post in the **GitHub Issues** section. Before reporting that *X does not work*, please compare the plugin behavior across different devices and emulators in order to locate the exact source of the problem.
## How to post Issues
If you are convinced that the plugin needs to be fixed / updated, kindly **post your issue in full detail**, including Android version, device brand and name, Cordova and cordova-android versions.
Please don't expect me to instantly reply with a magical solution or a new plugin version, but I'll try to help in whatever way I can. I'm interested in maintaining this plugin in a working condition, so try to send useful, constructive feedback whenever possible.
9371c0513b56185751918dce51c3770cab2377fe | 392 | md | Markdown | _pages/CS2311/chap4.md | chebil/chebil.github.io | MIT

---
title: Stacks
layout: single
classes: wide
permalink: /DataStructure/chap4
author_profile: false
# toc: True
# toc_label: "On this page"
# toc_icon: "cog"
# toc_sticky: False
sidebar:
nav: "DataStructure"
---
# Slides
<iframe height="400px" width="100%" src="https://drive.google.com/file/d/15lwKkjZY5lQzk2yIrkBJwgWtWF5KN37q/preview" frameborder="0" allowfullscreen="true"></iframe>
| 20.631579 | 164 | 0.739796 | eng_Latn | 0.211709 |
937384d3636f2ed0dccab4324be25bca11aeea6d | 3,013 | md | Markdown | docs/kubernetes/gpu.md | arodus/acs-engine | 372d59a6b45aa1fc9342eba3d5818f3ec1a24d57 | ["MIT"] | null | null | null | docs/kubernetes/gpu.md | arodus/acs-engine | 372d59a6b45aa1fc9342eba3d5818f3ec1a24d57 | ["MIT"] | null | null | null | docs/kubernetes/gpu.md | arodus/acs-engine | 372d59a6b45aa1fc9342eba3d5818f3ec1a24d57 | ["MIT"] | null | null | null
# Microsoft Azure Container Service Engine - Using GPUs with Kubernetes
If you created a Kubernetes cluster with one or more agent pools whose VM size is `Standard_NC*` or `Standard_NV*`, you can schedule GPU workloads on your cluster.
The NVIDIA drivers are automatically installed on every GPU agent in your cluster, so you don't need to do that manually, unless you require a specific version of the drivers. Currently, the installed driver is version 378.13.
To make sure everything is fine, run `kubectl describe node <name-of-a-gpu-node>`. You should see the correct number of GPUs reported (this example shows 2 GPUs for an NC12 VM):
```
[...]
Capacity:
alpha.kubernetes.io/nvidia-gpu: 2
cpu: 12
[...]
```
If `alpha.kubernetes.io/nvidia-gpu` is `0` and you just created the cluster, you might have to wait a little bit. The driver installation takes about 12 minutes, and the node might join the cluster before the installation is completed. After a few minutes the node should restart, and report the correct number of GPUs.
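To avoid eyeballing the `describe` output, the GPU count can be pulled out with standard shell tools. The snippet below is only a sketch — it parses a captured sample of the output rather than querying a live cluster (on a real cluster, pipe `kubectl describe node <name>` into the same `awk` call):

```shell
# Captured sample of the relevant `kubectl describe node <name>` output:
describe_output='Capacity:
 alpha.kubernetes.io/nvidia-gpu: 2
 cpu: 12'

# Extract the reported GPU count from the Capacity section.
gpu_count=$(printf '%s\n' "$describe_output" | awk '/alpha.kubernetes.io\/nvidia-gpu:/ {print $2}')
echo "$gpu_count"
```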
## Running a GPU-enabled container
When running a GPU container, you will need to specify how many GPUs you want to use. If you don't specify a GPU count, Kubernetes will assume you don't require any, and will not map the device into the container.
You will also need to mount the drivers from the host (the kubernetes agent) into the container.
On the host, the drivers are installed under `/usr/lib/nvidia-378`.
Here is an example template running TensorFlow:
```
apiVersion: extensions/v1beta1
kind: Deployment
metadata:
labels:
app: tensorflow
name: tensorflow
spec:
template:
metadata:
labels:
app: tensorflow
spec:
containers:
- name: tensorflow
image: tensorflow/tensorflow:latest-gpu
command: ["python main.py"]
imagePullPolicy: IfNotPresent
env:
- name: LD_LIBRARY_PATH
value: /usr/lib/nvidia:/usr/lib/x86_64-linux-gnu
resources:
requests:
alpha.kubernetes.io/nvidia-gpu: 2
volumeMounts:
- mountPath: /usr/local/nvidia/bin
name: bin
- mountPath: /usr/lib/nvidia
name: lib
- mountPath: /usr/lib/x86_64-linux-gnu/libcuda.so.1
name: libcuda
volumes:
- name: bin
hostPath:
path: /usr/lib/nvidia-378/bin
- name: lib
hostPath:
path: /usr/lib/nvidia-378
- name: libcuda
hostPath:
path: /usr/lib/x86_64-linux-gnu/libcuda.so.1
```
We specify `alpha.kubernetes.io/nvidia-gpu: 2` in the resource requests, matching the deployment above, and we mount the drivers from the host into the container.

Note that we also modify the `LD_LIBRARY_PATH` environment variable to let Python know where to find the driver's libraries.

Some libraries, such as `libcuda.so`, are installed under `/usr/lib/x86_64-linux-gnu` on the host; you might need to mount them separately, as shown above, based on your needs.
| 39.644737 | 318 | 0.693661 | eng_Latn | 0.992787 |
9373daebf01cc7a0ea019c49ecb7e7a465640391 | 169 | md | Markdown | README.md | CrystalVulpine/CSTL | b02d084ee1914d8448dda21a84ac0339305fdb78 | ["Unlicense"] | null | null | null | README.md | CrystalVulpine/CSTL | b02d084ee1914d8448dda21a84ac0339305fdb78 | ["Unlicense"] | null | null | null | README.md | CrystalVulpine/CSTL | b02d084ee1914d8448dda21a84ac0339305fdb78 | ["Unlicense"] | null | null | null
# CSTL
CSTL is a C library that provides some features of the C++ STL in C. It was built purely for fun and convenience, and is not a standardized project by any means.
| 56.333333 | 161 | 0.763314 | eng_Latn | 1.000007 |
937423827cbbb88a466451dbeb76f83708ce703c | 11,234 | md | Markdown | wdk-ddi-src/content/ndis/nc-ndis-miniport_halt.md | aktsuda/windows-driver-docs-ddi | a7b832e82cc99f77dbde72349c0a61670d8765d3 | ["CC-BY-4.0", "MIT"] | null | null | null | wdk-ddi-src/content/ndis/nc-ndis-miniport_halt.md | aktsuda/windows-driver-docs-ddi | a7b832e82cc99f77dbde72349c0a61670d8765d3 | ["CC-BY-4.0", "MIT"] | null | null | null | wdk-ddi-src/content/ndis/nc-ndis-miniport_halt.md | aktsuda/windows-driver-docs-ddi | a7b832e82cc99f77dbde72349c0a61670d8765d3 | ["CC-BY-4.0", "MIT"] | null | null | null
---
UID: NC:ndis.MINIPORT_HALT
title: MINIPORT_HALT
author: windows-driver-content
description: NDIS calls a miniport driver's MiniportHaltEx function to free resources when a miniport adapter is removed, and to stop the hardware.
old-location: netvista\miniporthaltex.htm
old-project: netvista
ms.assetid: b8d452b4-bef3-4991-87cf-fac15bedfde4
ms.author: windowsdriverdev
ms.date: 4/25/2018
ms.keywords: MINIPORT_HALT, MINIPORT_HALT callback, MiniportHaltEx, MiniportHaltEx callback function [Network Drivers Starting with Windows Vista], miniport_functions_ref_aa826b59-f204-43ea-81b6-f1bab84a7a23.xml, ndis/MiniportHaltEx, netvista.miniporthaltex
ms.prod: windows-hardware
ms.technology: windows-devices
ms.topic: callback
req.header: ndis.h
req.include-header: Ndis.h
req.target-type: Windows
req.target-min-winverclnt: Supported in NDIS 6.0 and later.
req.target-min-winversvr:
req.kmdf-ver:
req.umdf-ver:
req.ddi-compliance: WlanAssociation, WlanConnectionRoaming, WlanDisassociation, WlanTimedAssociation, WlanTimedConnectionRoaming, WlanTimedConnectRequest, WlanTimedLinkQuality, WlanTimedScan
req.unicode-ansi:
req.idl:
req.max-support:
req.namespace:
req.assembly:
req.type-library:
req.lib:
req.dll:
req.irql: PASSIVE_LEVEL
topic_type:
- APIRef
- kbSyntax
api_type:
- UserDefined
api_location:
- Ndis.h
api_name:
- MiniportHaltEx
product:
- Windows
targetos: Windows
req.typenames:
---
# MINIPORT_HALT callback function
## -description
NDIS calls a miniport driver's
<i>MiniportHaltEx</i> function to free resources when a miniport adapter is
removed, and to stop the hardware. This function puts the miniport into the Halted state, where no other callback can occur (including <a href="https://msdn.microsoft.com/7c88ff02-e791-4642-ad40-78f2ef2cba7d">MiniportShutdownEx</a>). For more information about miniport driver states, see <a href="https://msdn.microsoft.com/b47e2cbe-9da3-4600-9afe-b082e60b87fb">Miniport Adapter States and Operations</a>.
<div class="alert"><b>Note</b> You must declare the function by using the <b>MINIPORT_HALT</b> type. For more information,
see the following Examples section.</div><div> </div>
## -parameters
### -param MiniportAdapterContext [in]
A handle to a context area that the miniport driver allocated in its
<a href="https://msdn.microsoft.com/b146fa81-005b-4a6c-962d-4cb023ea790e">MiniportInitializeEx</a> function.
The miniport driver uses this context area to maintain state information for a miniport adapter.
### -param HaltAction [in]
The reason for halting the miniport adapter. It can be one of the following values:
#### NdisHaltDeviceDisabled
NDIS is halting the miniport adapter in response to a Plug and Play (PnP) remove message.
#### NdisHaltDeviceInstanceDeInitialized
NDIS is halting the miniport adapter in response to an intermediate driver calling the
<a href="https://msdn.microsoft.com/badfab43-ba58-4711-a181-af87dcfeba4d">
NdisIMDeInitializeDeviceInstance</a> function.
#### NdisHaltDevicePoweredDown
NDIS is halting the miniport adapter because the system is going to a sleeping state.
#### NdisHaltDeviceSurpriseRemoved
The miniport adapter has been surprise removed and the hardware is not present.
#### NdisHaltDeviceFailed
The miniport adapter is being removed because of a hardware failure. Either the miniport driver
called the
<a href="https://msdn.microsoft.com/library/windows/hardware/ff563661">NdisMRemoveMiniport</a> function or a
bus driver did not power up the NIC on resume.
#### NdisHaltDeviceInitializationFailed
NDIS could not initialize the miniport adapter for an unknown reason after the
<a href="https://msdn.microsoft.com/b146fa81-005b-4a6c-962d-4cb023ea790e">MiniportInitializeEx</a> function completed successfully.
#### NdisHaltDeviceStopped
NDIS is halting the miniport adapter in response to a PnP stop device message.
## -returns
None
## -remarks
A driver specifies the
<i>MiniportHaltEx</i> entry point when it calls the
<a href="https://msdn.microsoft.com/bed68aa8-499d-41fd-997b-a46316913cc8">
NdisMRegisterMiniportDriver</a> function.
NDIS can call
<i>MiniportHaltEx</i> at any time after a driver's
<a href="https://msdn.microsoft.com/b146fa81-005b-4a6c-962d-4cb023ea790e">MiniportInitializeEx</a> function
returns successfully. If the driver controls a physical NIC,
<i>MiniportHaltEx</i> should stop the NIC. If an NDIS intermediate driver calls the
<a href="https://msdn.microsoft.com/badfab43-ba58-4711-a181-af87dcfeba4d">
NdisIMDeInitializeDeviceInstance</a> function, NDIS calls the
<i>MiniportHaltEx</i> function for the driver's virtual device.
<i>MiniportHaltEx</i> must free all resources that were allocated in
<a href="https://msdn.microsoft.com/b146fa81-005b-4a6c-962d-4cb023ea790e">MiniportInitializeEx</a> for a device.
<i>MiniportHaltEx</i> also frees any other resources that the driver allocated in
subsequent operations for that device. The driver must call the reciprocals of the
<b>Ndis<i>Xxx</i></b> functions with which it originally allocated the resources. As a general
rule, a
<i>MiniportHaltEx</i> function should call the reciprocal
<b>Ndis<i>Xxx</i></b> functions in reverse order to the calls the driver made from
<i>MiniportInitializeEx</i>.
If a NIC generates interrupts, a miniport driver's
<i>MiniportHaltEx</i> function can be preempted by the driver's
<a href="https://msdn.microsoft.com/810503b9-75cd-4b38-ab1f-de240968ded6">MiniportInterrupt</a> function until the
<i>MiniportHaltEx</i> call to the
<a href="https://msdn.microsoft.com/bc0718b6-4c71-41a8-bab6-a52991b284d9">
NdisMDeregisterInterruptEx</a> function returns. Such a driver's
<i>MiniportHaltEx</i> function should disable interrupts, and call
<b>
NdisMDeregisterInterruptEx</b> as soon as possible. Note that a driver can keep getting interrupts
until
<b>
NdisMDeregisterInterruptEx</b> returns.
<b>
NdisMDeregisterInterruptEx</b> does not return until the driver finishes all the scheduled DPCs (see
the
<a href="https://msdn.microsoft.com/345715fb-878c-44d8-bf78-f3add10dd02b">MiniportInterruptDPC</a> function for
more information).
If the driver has a
<a href="https://msdn.microsoft.com/76e59376-58a4-4e35-bac4-ec5938c88cd7">NetTimerCallback</a> function that is
associated with a timer object that could be in the system timer queue,
<i>MiniportHaltEx</i> should call the
<a href="https://msdn.microsoft.com/library/windows/hardware/ff561624">NdisCancelTimerObject</a> function. If
<b>NdisCancelTimerObject</b> fails, the timer could have already fired. In this case, the driver should
wait for the timer function to complete before the driver returns from
<i>MiniportHaltEx</i>.
NDIS does not call
<i>MiniportHaltEx</i> if there are outstanding OID requests or send requests. NDIS
submits no further requests for the affected device after NDIS calls
<i>MiniportHaltEx</i>.
If the driver must wait for any operation to complete,
<i>MiniportHaltEx</i> can use the
<a href="https://msdn.microsoft.com/library/windows/hardware/ff564651">NdisWaitEvent</a> function or the
<a href="https://msdn.microsoft.com/library/windows/hardware/ff563677">NdisMSleep</a> function.
NDIS calls
<i>MiniportHaltEx</i> at IRQL = PASSIVE_LEVEL.
<h3><a id="Examples"></a><a id="examples"></a><a id="EXAMPLES"></a>Examples</h3>
To define a <i>MiniportHaltEx</i> function, you must first provide a function declaration that identifies the type of function you're defining. Windows provides a set of function types for drivers. Declaring a function using the function types helps <a href="https://msdn.microsoft.com/2F3549EF-B50F-455A-BDC7-1F67782B8DCA">Code Analysis for Drivers</a>, <a href="https://msdn.microsoft.com/74feeb16-387c-4796-987a-aff3fb79b556">Static Driver Verifier</a> (SDV), and other verification tools find errors, and it's a requirement for writing drivers for the Windows operating system.
For example, to define a <i>MiniportHaltEx</i> function that is named "MyHaltEx", use the <b>MINIPORT_HALT</b> type as shown in this code example:
<div class="code"><span codelanguage=""><table>
<tr>
<th></th>
</tr>
<tr>
<td>
<pre>MINIPORT_HALT MyHaltEx;</pre>
</td>
</tr>
</table></span></div>
Then, implement your function as follows:
<div class="code"><span codelanguage=""><table>
<tr>
<th></th>
</tr>
<tr>
<td>
<pre>_Use_decl_annotations_
VOID
MyHaltEx(
NDIS_HANDLE MiniportAdapterContext,
NDIS_HALT_ACTION HaltAction
)
{...}</pre>
</td>
</tr>
</table></span></div>
The <b>MINIPORT_HALT</b> function type is defined in the Ndis.h header file. To more accurately identify errors when you run the code analysis tools, be sure to add the _Use_decl_annotations_ annotation to your function definition. The _Use_decl_annotations_ annotation ensures that the annotations that are applied to the <b>MINIPORT_HALT</b> function type in the header file are used. For more information about the requirements for function declarations, see <a href="https://msdn.microsoft.com/232c4272-0bf0-4a4e-9560-3bceeca8a3e3">Declaring Functions by Using Function Role Types for NDIS Drivers</a>.
For information about _Use_decl_annotations_, see <a href="http://go.microsoft.com/fwlink/p/?linkid=286697">Annotating Function Behavior</a>.
## -see-also
<a href="https://msdn.microsoft.com/3ca03511-a912-4ee3-bd9f-1bd8e6996c48">Adapter States of a Miniport Driver</a>
<a href="https://msdn.microsoft.com/fd57a2b1-593d-412b-96b5-eabd3ea392e0">Halting a Miniport Adapter</a>
<a href="https://msdn.microsoft.com/b47e2cbe-9da3-4600-9afe-b082e60b87fb">Miniport Adapter States and Operations</a>
<a href="https://msdn.microsoft.com/20047ee2-ba37-47c2-858f-36e31ae19154">Miniport Driver Reset and Halt Functions</a>
<a href="https://msdn.microsoft.com/b146fa81-005b-4a6c-962d-4cb023ea790e">MiniportInitializeEx</a>
<a href="https://msdn.microsoft.com/810503b9-75cd-4b38-ab1f-de240968ded6">MiniportInterrupt</a>
<a href="https://msdn.microsoft.com/345715fb-878c-44d8-bf78-f3add10dd02b">MiniportInterruptDPC</a>
<a href="https://msdn.microsoft.com/0f33ae87-164e-40dc-a915-28211a0d74b7">
MiniportReturnNetBufferLists</a>
<a href="https://msdn.microsoft.com/library/windows/hardware/ff561624">NdisCancelTimerObject</a>
<a href="https://msdn.microsoft.com/badfab43-ba58-4711-a181-af87dcfeba4d">
NdisIMDeInitializeDeviceInstance</a>
<a href="https://msdn.microsoft.com/library/windows/hardware/ff563575">NdisMDeregisterInterruptEx</a>
<a href="https://msdn.microsoft.com/library/windows/hardware/ff563654">NdisMRegisterMiniportDriver</a>
<a href="https://msdn.microsoft.com/library/windows/hardware/ff563661">NdisMRemoveMiniport</a>
<a href="https://msdn.microsoft.com/library/windows/hardware/ff563677">NdisMSleep</a>
<a href="https://msdn.microsoft.com/library/windows/hardware/ff564651">NdisWaitEvent</a>
<a href="https://msdn.microsoft.com/76e59376-58a4-4e35-bac4-ec5938c88cd7">NetTimerCallback</a>
| 35.550633 | 608 | 0.759836 | eng_Latn | 0.623118 |
9374f369eabb811e3d6575af010b457787c3a239 | 8,492 | md | Markdown | _posts/2020/2/2020-02-09-more-than-just-api-docs.md | Day-Shift-Jobs/day-shift-jobs.github.io | 14760d0f4f3abee6f03eb1ca9eaf5561d35b6462 | [
"BSD-3-Clause"
] | 1 | 2022-01-01T19:25:48.000Z | 2022-01-01T19:25:48.000Z | _posts/2020/2/2020-02-09-more-than-just-api-docs.md | Day-Shift-Jobs/day-shift-jobs.github.io | 14760d0f4f3abee6f03eb1ca9eaf5561d35b6462 | [
"BSD-3-Clause"
] | 2 | 2021-05-20T20:59:25.000Z | 2022-02-26T09:45:36.000Z | _posts/2020/2/2020-02-09-more-than-just-api-docs.md | Day-Shift-Jobs/day-shift-jobs.github.io | 14760d0f4f3abee6f03eb1ca9eaf5561d35b6462 | [
"BSD-3-Clause"
] | null | null | null | ---
title: "From API docs to developer portals"
permalink: /blog/from-api-docs-to-developer-portals/
categories:
- api-doc
keywords:
rebrandly: https://idratherbewriting.site/apidocstodevportals
categories:
- api-doc
- podcasts
- stitcher
keywords: usability, API, design web
bitlink: https://idratherbewriting.site/devportalsintro
podcast_link: https://dts.podtrac.com/redirect.mp3/s3.us-west-1.wasabisys.com/idbwmedia.com/podcasts/devportalsintro.mp3
podcast_file_size: "3.04 MB"
podcast_duration: "32:29"
podcast_length: "29943785"
description: "One comment I often hear from API workshop participants and other readers is that they want a more advanced API course. I've been thinking about what that more advanced course would involve, in addition to what might be involved in leveling up at my work, and I've come to a realization that I need to transition more from API documentation to developer portal strategies. Developer portal strategies includes API documentation but also encompasses broader concerns as well, not too different from content strategy. "
---
I also recorded this post as a conversational podcast (rather than a narrated one).
{% include audio.html %}
## Parallels with content strategy
The transition from API documentation to developer portals mirrors the same transition that took place in the tech comm domain from documentation to content strategy. When that transition happened, I resisted calling myself a content strategist because I felt like much of the tech writer's role already involves making many decisions about content strategy. But that's not always the case, and probably much less often than I assumed. The meaning of "content strategy" forked into several different connotations depending on whether you're in marketing, tech comm, or running an SEO business, which further complicated the direction.
Regardless of parallels, in many circles related to API docs, I keep seeing an emphasis on "developer portals." Sometimes anything that isn't strictly "API reference documentation" is considered part of the developer portal, such as a Getting Started tutorial, how you authorize your calls, your publishing platform, your contributor workflows with Git, etc. In many ways, a developer portal is similar to a documentation portal, but the developer portal has some unique traits and considerations given the developer audience.
In this post, I'll outline a few details of what's involved in running and managing a developer portal that goes beyond mere content development. There are many topics to cover, but I'll limit the focus here to brief bullet points. I do elaborate a little more in the podcast, but not much.
## Developer portal topics
The following topics are roughly grouped into four categories: tools, policies and procedures, high-level strategies, and user flows. If you have feedback on any of these topics or directions, I'd love to hear it.
### Tools
* Authoring and publishing toolchain selection and implementation
* Federated search and findability
* Git workflows and permissions
* Review and monitoring of Git commits and contributors
* Implementation of style/grammar checkers at the platform level
* The design and style of the docs (the theme)
* Security tickets and the developer portal
* Configuration of PDF output and generation
* Understanding how the doc toolchain builds and publishes content from end to end
* Management of the cloud console and resources for distributing files and storing assets
* Verification scripts to perform automated checking (such as looking for broken links or style inconsistencies)
* Search engine optimization and discoverability in search engine results
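As an example of the verification-script bullet above, a broken-link checker can start as a small Markdown link extractor. Everything below is illustrative (the regex and sample text are assumptions, not part of any particular toolchain); a real checker would then issue an HTTP request per URL and report failures:

```python
import re

# Hypothetical sample page; in practice you would read your Markdown sources.
SAMPLE = "See [docs](https://example.com/docs) and [guide](https://example.com/guide)."

# Matches inline Markdown links of the form [text](http...).
LINK_RE = re.compile(r"\[([^\]]+)\]\((https?://[^)\s]+)\)")

def extract_links(markdown_text):
    """Return (link text, URL) pairs found in a Markdown string."""
    return LINK_RE.findall(markdown_text)

print(extract_links(SAMPLE))
# → [('docs', 'https://example.com/docs'), ('guide', 'https://example.com/guide')]
```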
### Policies and procedures
* Release process for SDKs, sample apps, and other code artifacts
* Contributor processes (e.g., pull requests or other strategies for outside contributors)
* Localization processes, priorities, and workflows
* Style guide development and implementation
* Processes for whitelisting beta partners
* Processes for collecting feedback from beta partners
* Onboarding guide for contributors, including training
* Monitoring and handling incoming tickets from other groups
* Content audit of the entire site, with a current list of owners and contact points
* Ensuring all docs meet common guidelines, including release notes for changes, etc.
* Intake processes for product launches that span across field engineers, marketing, and support, etc.
* Deprecation old content and establishing correct processes
* Defining the review process for docs (this might include various levels, such as doc team, product team, field engineers, legal team, and beta partners) prior to publication
* Defining the internal team's workflow of tickets via a Scrum or agile methodology
* Defining policies for when PDFs are delivered
* Establishing standards for REST API reference content such as using OpenAPI
* Reviewing auto-generated reference docs from library-based APIs (e.g., Javadoc, Doxygen) to ensure proper standards and tagging
* Defining templates for docs that provide standards for content such as CLI docs, schemas, and other structured information that doesn't have an industry standard
* Defining and implementation the display of specification information for different products — the attributes, display approach, and data storage approach
* Awareness and understanding of legal red flags and danger zones, as well as the legal review process and resolution
* Strategies for versioning content
* Managing sprint planning for docs and maintaining team momentum of priority items
* Understanding product naming and branding, and then enforcing this against individual teams that might launch new feature names in unapproved ways
* Publicizing updates to mailing list to communicating changes to all relevant parties
### High-level strategies
* Aligning the team's efforts and priorities with larger org's priorities
* Coordination and partnerships with other doc teams, including aligning on similar directions, tools, or processes
* Analyzing trending support tickets and products (even when not filed by these teams against docs), and syncing with engineering teams on resolutions
* Syncing with field engineers and other groups on a regular basis to gather input points from customers and product roadmaps
* Assessing incoming requests and deciding how to approach them, whether to work on them or not
* Understanding your team's correct fit in the organizational chart (e.g., Engineering, Marketing) and vying for correct placement
* Reporting upwards with weekly, monthly, and yearly reporting cadences
* Understanding larger initiatives in executive strategies/reports and connecting the dots with all of these products
* Developing strategies for funneling information from developers back to product teams
* Providing input on developer satisfaction surveys and then taking action on the results
* Building rapport with key stakeholders and providing regular updates about the performance of their docs, etc.
### User flows
* User journeys from marketing landing pages to docs
* How content across the portal all fits and flows together — docs, marketing, support, console
* Arrangement of multiple doc sets into a master index or starting point
* Movement from marketing pages into documentation
* Integrating the support path from the docs
* Developer console logins and flows, including in-app help
* Ensuring the homepage and other marketing pages on the site properly match messaging in the documentation
* Building trust with developers
* Understanding the customer side of each developer product
* Awareness and review of pages outside the docs that include the post-login console, marketing pages, Stack Overflow, GitHub, etc.
* Understanding analytics and regularly investigating trending pages, then prioritizing updates based on analytics
* Understanding how every product fits together as a whole across the developer portal (rather than only understanding the docs you work on)
* Driving developers to sign up to newsletters and other forms of outreach from the docs
* Researching developer journeys on competitor sites
{% include random_ad.html %}
## Conclusion
In all of these points, I have barely mentioned anything about content development. I'm just trying to paint a picture of the broader concerns involved in managing a developer portal.
| 49.372093 | 615 | 0.801578 | eng_Latn | 0.999144 |
937575b8f0f24422c77e3809fab1e1758d880d6e | 1,642 | md | Markdown | results/crinacle/harman_in-ear_2017-1/Westone UM Pro 10/README.md | liqi1982/AutoEq | 24d82474d92a53ac70b5f20a8ab4ecf71539598f | [
"MIT"
] | 1 | 2019-05-06T03:16:21.000Z | 2019-05-06T03:16:21.000Z | results/crinacle/harman_in-ear_2017-1/Westone UM Pro 10/README.md | liqi1982/AutoEq | 24d82474d92a53ac70b5f20a8ab4ecf71539598f | [
"MIT"
] | null | null | null | results/crinacle/harman_in-ear_2017-1/Westone UM Pro 10/README.md | liqi1982/AutoEq | 24d82474d92a53ac70b5f20a8ab4ecf71539598f | [
"MIT"
] | null | null | null | # Westone UM Pro 10
See [usage instructions](https://github.com/jaakkopasanen/AutoEq#usage) for more options and info.
### Parametric EQs
When using a parametric equalizer, apply a preamp of **-7.1 dB** and build the filters manually
with these parameters. The first 5 filters can be used independently.
When using an independent subset of filters, apply a preamp of **-7.1 dB**.
| Type | Fc | Q | Gain |
|:--------|:---------|:-----|:---------|
| Peaking | 147 Hz | 0.83 | -2.3 dB |
| Peaking | 354 Hz | 0.47 | -4.6 dB |
| Peaking | 1460 Hz | 1.26 | -5.6 dB |
| Peaking | 3358 Hz | 0.5 | 7.6 dB |
| Peaking | 18913 Hz | 1.16 | -22.8 dB |
| Peaking | 23 Hz | 1.12 | 0.8 dB |
| Peaking | 6319 Hz | 3.3 | 3.2 dB |
| Peaking | 7630 Hz | 2.34 | -3.4 dB |
| Peaking | 14434 Hz | 3.15 | 5.0 dB |
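Each row of the table maps onto a standard peaking biquad. As a sketch — assuming the widely used RBJ audio-EQ-cookbook formulas and a 48 kHz sample rate, neither of which is stated here — the first filter can be realized and sanity-checked like this:

```python
import math

def peaking_coeffs(fc, q, gain_db, fs=48000):
    """RBJ cookbook peaking-EQ biquad, normalized so the first denominator coefficient is 1."""
    amp = 10 ** (gain_db / 40)
    w0 = 2 * math.pi * fc / fs
    alpha = math.sin(w0) / (2 * q)
    b = [1 + alpha * amp, -2 * math.cos(w0), 1 - alpha * amp]
    a = [1 + alpha / amp, -2 * math.cos(w0), 1 - alpha / amp]
    return [x / a[0] for x in b], [x / a[0] for x in a]

def gain_db_at(f, b, a, fs=48000):
    """Magnitude response of the biquad at frequency f, in dB."""
    z = complex(math.cos(2 * math.pi * f / fs), math.sin(2 * math.pi * f / fs))
    num = b[0] + b[1] / z + b[2] / z ** 2
    den = a[0] + a[1] / z + a[2] / z ** 2
    return 20 * math.log10(abs(num / den))

# First row of the table: Fc = 147 Hz, Q = 0.83, gain = -2.3 dB
b, a = peaking_coeffs(147, 0.83, -2.3)
print(round(gain_db_at(147, b, a), 2))  # → -2.3
```

By construction, a peaking biquad's magnitude response at its center frequency equals its nominal gain, which is what the final check confirms.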
### Fixed Band EQs
When using a fixed band (also called graphic) equalizer, apply a preamp of **-8.5 dB**
(if available) and set gains manually with these parameters.
| Type | Fc | Q | Gain |
|:--------|:---------|:-----|:--------|
| Peaking | 31 Hz | 1.41 | 0.6 dB |
| Peaking | 62 Hz | 1.41 | -0.6 dB |
| Peaking | 125 Hz | 1.41 | -3.4 dB |
| Peaking | 250 Hz | 1.41 | -4.8 dB |
| Peaking | 500 Hz | 1.41 | -3.0 dB |
| Peaking | 1000 Hz | 1.41 | -3.1 dB |
| Peaking | 2000 Hz | 1.41 | 0.7 dB |
| Peaking | 4000 Hz | 1.41 | 7.9 dB |
| Peaking | 8000 Hz | 1.41 | 1.3 dB |
| Peaking | 16000 Hz | 1.41 | -9.7 dB |
### Graphs
 | 42.102564 | 156 | 0.573082 | eng_Latn | 0.688004 |
93759ceae99295078c3e534e18d9e0a139417c35 | 1,597 | md | Markdown | README.md | czycha/unstated-subscribe-hoc | 28f74e2ab3a7ac62f6fc88076e416837687d8e71 | ["MIT"] | 1 | 2018-11-05T17:34:17.000Z | 2018-11-05T17:34:17.000Z | README.md | czycha/unstated-subscribe-hoc | 28f74e2ab3a7ac62f6fc88076e416837687d8e71 | ["MIT"] | 2 | 2021-05-09T18:05:39.000Z | 2021-09-01T14:19:55.000Z | README.md | czycha/unstated-subscribe-hoc | 28f74e2ab3a7ac62f6fc88076e416837687d8e71 | ["MIT"] | null | null | null
# unstated-subscribe-hoc
> Use [Unstated](https://npm.im/unstated)'s `<Subscribe />` component as a higher-order component.
## Install
```bash
yarn add unstated-subscribe-hoc
```
```bash
npm i unstated-subscribe-hoc
```
## Usage
```js
import subscribe from 'unstated-subscribe-hoc'
const SubscribedReactComponent = subscribe(
ReactComponent,
{
propName: UnstatedContainerComponent,
propName2: UnstatedContainerComponent2
}
)
```
## Example
```jsx
import React from 'react'
import subscribe from 'unstated-subscribe-hoc'
import CartContainer from './CartContainer' // Some Unstated.Container
import DisplayContainer from './DisplayContainer' // Another Unstated.Container
const Example = ({ cart, display }) => (
<div className={display.state.hideExample && 'hide'}>
{cart.state.items.length} item{cart.state.items.length === 1 ? '' : 's'} in your cart
</div>
)
// Subscribe Example to CartContainer (prop 'cart') and DisplayContainer (prop 'display')
export default subscribe(Example, { cart: CartContainer, display: DisplayContainer })
```
**Works with defined instances of `Container` also:**
```js
const cart = new CartContainer({ items: ['A super nice shirt'] })
export default subscribe(Example, { cart, display: DisplayContainer })
```
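For intuition, here is a stripped-down, framework-free sketch of what a subscribe-style HOC does. Plain functions stand in for React components and Unstated containers; none of this is the library's actual source:

```javascript
// A "component" here is just a function of props returning a string.
function subscribe(Component, containers) {
  return function Subscribed(props) {
    const injected = { ...props };
    for (const [name, c] of Object.entries(containers)) {
      // Accept either a container class or an already-constructed instance.
      injected[name] = typeof c === 'function' ? new c() : c;
    }
    return Component(injected);
  };
}

class CartContainer {
  constructor(state = { items: [] }) { this.state = state; }
}

const Example = ({ cart }) =>
  `${cart.state.items.length} item(s) in your cart`;

const SubscribedExample = subscribe(Example, {
  cart: new CartContainer({ items: ['A super nice shirt'] }),
});

console.log(SubscribedExample({})); // → 1 item(s) in your cart
```

The actual library wraps Unstated's `<Subscribe />` component; this sketch only mimics the prop-injection shape.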
## Why?
I found myself passing in the containers as props anyway and I figured I'd make a shortcut for it.
## Credits
This is a very small module that could not be made possible without [James Kyle](https://github.com/jamiebuilds)'s [Unstated](https://npm.im/unstated).
## License
MIT © James Anthony Bruno
| 24.569231 | 151 | 0.722605 | eng_Latn | 0.896068 |
937633c0c8f5cd48a27f3a4a0aabd678395e8eec | 71 | md | Markdown | README.md | viniciussz7/projeto-site | 53cc6f902c0778ce7f7df742545e1cd32010f079 | ["MIT"] | null | null | null | README.md | viniciussz7/projeto-site | 53cc6f902c0778ce7f7df742545e1cd32010f079 | ["MIT"] | null | null | null | README.md | viniciussz7/projeto-site | 53cc6f902c0778ce7f7df742545e1cd32010f079 | ["MIT"] | null | null | null
# projeto-site
Site project created during the git and GitHub course
| 23.666667 | 55 | 0.774648 | por_Latn | 0.999622 |
93767d61203df4e4d009f847f6b439add7945e03 | 3,887 | md | Markdown | README.md | machinaide/twinbase | ff2815c5f808977949757fde48ecc392ee154a87 | ["MIT"] | 3 | 2021-08-05T04:54:38.000Z | 2022-03-17T07:14:06.000Z | README.md | machinaide/twinbase | ff2815c5f808977949757fde48ecc392ee154a87 | ["MIT"] | 1 | 2021-02-16T10:29:19.000Z | 2021-02-16T10:29:19.000Z | README.md | machinaide/twinbase | ff2815c5f808977949757fde48ecc392ee154a87 | ["MIT"] | 1 | 2021-02-16T10:24:53.000Z | 2021-02-16T10:24:53.000Z
# Twinbase
Twinbase is an open source platform for managing and distributing [digital twin documents](https://doi.org/10.1109/ACCESS.2020.3045856).
It is built on git and can be hosted on free-of-charge GitHub services.
See an example server live at [dtw.twinbase.org](https://dtw.twinbase.org).
Twinbase is at the __*initial development*__ phase, and backwards-incompatible changes are expected frequently.
Update mechanisms are not yet implemented.
Twinbase is hosted by Aalto University where its development was initiated as a result of the experience from multiple digital twin related projects.
Twinbase is used as a tool for building the Digital Twin Web introduced in Section III of [this article](https://doi.org/10.1109/ACCESS.2020.3045856).
Experiences with Twinbase are used to develop further versions of the [digital twin document standard](https://github.com/AaltoIIC/dt-document).
## Using Twinbase
You can browse the web interface of this Twinbase from the URL shown on the `baseurl` field in the [/docs/index.yaml](/docs/index.yaml) file if this Twinbase is properly configured.
You can fetch twin documents in Python with the [dtweb-python](https://github.com/juusoautiosalo/dtweb-python) library. Available as `dtweb` from pip.
## To create your own Twinbase
1. Create a new repository with the "Use this template" button on the [twinbase/twinbase](https://github.com/twinbase/twinbase) page. (Sign in to GitHub if you can't see the button.)
2. In the newly created repository, activate GitHub Actions from the Actions tab if necessary, and manually run the "File modifier" workflow. (This will modify the files to match your GitHub account. Running the workflow several times will not cause any harm.)
3. Activate GitHub Pages from Settings > Pages > Source to `main` branch and `/docs` folder.
4. A link to Twinbase will be shown at the Pages page. If you have not made any domain customizations, it is in the form \<username\>.github.io/\<repository name\>.
5. Unfortunately any updates from the template repository must be made manually. But you can also just make another Twinbase and copy your twin folders and files there.
Forks can be used as well and might make updating easier, but their use has not been properly tested.
### Creating new twins to your Twinbase
The recommended method for creating new twins is to use the new-twin page found on the front page of each Twinbase.
After creating a twin, you need to activate its DT identifier with one of these methods:
- To activate the automatically generated dtid.org identifier, send the values of dt-id and hosting-iri of each twin to [this form](https://dtid.org/form).
- Or you can overwrite the dt-id with the URL given by any [URL shortener service](https://en.wikipedia.org/wiki/URL_shortening#Services) or the [perma-id](https://github.com/perma-id/w3id.org) service. The URL needs to redirect to the hosting-iri.
## To start developing Twinbase
Contribution guidelines are not yet established, but useful contributions are welcome! For development, you can try this:
1. Create your own Twinbase using the Template.
2. Modify your Twinbase as you wish in GitHub.
3. Fork [twinbase/twinbase](https://github.com/twinbase/twinbase). (Do not activate Actions to avoid unnecessary commits.)
4. Manually copy the useful modifications from the repository created with the Template.
5. Submit a pull request.
Local development is a bit tricky as Twinbase uses GitHub Actions as an integral part of the platform, but feel free to try!
## Support
There are currently no official support mechanisms for Twinbase, but [Juuso](https://juu.so) may be able to help.
## Thanks
Twinbase uses
- [mini.css](https://minicss.org/) to stylize web pages and
- [json-view](https://github.com/pgrabovets/json-view) to display digital twin documents.
Thanks to the developers for the nice pieces of software!
# Model Zoo on Alibaba PAI
## Introduction
[Alibaba PAI](https://www.alibabacloud.com/product/machine-learning) is an end-to-end platform that provides various machine learning algorithms to meet users' data mining and analysis requirements. PAI supports TensorFlow and many other machine learning frameworks. The SQLFlow model zoo also works on PAI. The [Model Zoo Design Doc](model_zoo.md) gives the high-level design of the SQLFlow model zoo. This document describes how to do model training, model prediction, and model analysis using the SQLFlow model zoo on PAI.
## Concepts
Besides what described in [Model Zoo Concepts](model_zoo.md#Concepts), there are several new concepts about Alibaba PAI.
1. The **PAI platform** or **PAI** for short is an end-to-end platform that provides various machine learning algorithms to meet your data mining and analysis requirements. PAI supports TensorFlow and many other machine learning frameworks. See [PAI Introduction](https://www.alibabacloud.com/help/doc-detail/67461.htm) for more details.
1. **PAI TensorFlow** is the deployment of TensorFlow on **PAI**, with necessary optimization and further development that makes it possible to cooperate efficiently with other [Alibaba Cloud](https://www.alibabacloud.com/) components such as MaxCompute.
1. A **PAI program** is a python program that is developed base on TensorFlow, MXNet or [other machine learning frameworks supported by PAI](https://www.alibabacloud.com/help/doc-detail/69688.htm).
1. A **PAI task** is an instance of a **PAI program** that is being executed by the **PAI platform**.
## Background
The background of SQLFlow has been discussed thoroughly in [Model Zoo Background](model_zoo.md#Background). We'll explain how to submit a *PAI task* to the *PAI platform* in this section.
PAI requires a *PAI program* to expose several command line options to communicate with [MaxCompute](https://www.alibabacloud.com/product/maxcompute) and OSS([Alibaba Cloud Object Storage Service](https://www.alibabacloud.com/product/oss)). Typically, a user can submit the PAI program by:
1. Use the MaxCompute CLI program [`odpscmd`](https://www.alibabacloud.com/help/doc-detail/27971.htm):
```bash
$ odpscmd -e "pai -name tensorflow -Dscript=file:///my/training/script.py -Dother_param=other_param_value..."
```
When a user runs the above command, the CLI will submit the task to PAI then print a logview URL and an instance ID to stdout, after which the user can safely terminate the CLI. The user can check the logview to get the running status of the task or run:
```bash
$ odpscmd -e "wait instanceID"
```
to attach to the running task.
1. Use the MaxCompute Python Package [`pyodps`](https://pyodps.readthedocs.io/en/latest/):
```python
from odps import ODPS
conn = ODPS(...)
inst = conn.run_xflow('tensorflow', parameters={'script': 'file:///my/training/script.py',
'other_param': 'other_param_value' ... })
print(instance.get_logview_address())
```
This code snippet asynchronously submits a task to PAI and prints the logview URL to stdout.
**Note** that the code snippets here are only for demonstration. In practice, there may be multiple python scripts in a PAI program. If so, we can pack all the python scripts/dependencies of the PAI program into a tarball that can be passed to the `-Dscript` option. PAI will automatically unpack the tarball. And PAI has another command line option to specify which python script is the entry of the program. For details, please refer to the user guide of [odpscmd](https://www.alibabacloud.com/help/doc-detail/27971.htm) and [pyodps](https://pyodps.readthedocs.io/en/latest/).
## The Design
### Versioning and Releasing
Versioning and releasing in the PAI model zoo are the same as described in [Model Zoo Design](model_zoo.md#Versioning-and-Releasing). The only requirement is that the base Docker image of the SQLFlow model zoo should incorporate both `odpscmd` (which is already in place) and `pyodps`. We propose to use `pyodps` at the moment because it is easier to integrate into the new workflow framework that is in progress.
### Submitter Programs of PAI Model Zoo
We propose to require the model source directory of a model zoo Docker image to be a valid Python package with the same name as the image itself; in addition, the package should expose all the model classes. In that case, whether or not a model is executed on PAI, its [submitter program](model_zoo.md#Submitter-Programs) can easily import the *model definition*s.
For example, if we refer to a *model definition* `MyAwesomeModel` by `sqlflow/my_awesome_model/MyAwesomeModel`, it implies that:
- the Docker image `sqlflow/my_awesome_model` has a directory `/my_awesome_model`
- the *model definition* is a python class `MyAwesomeModel` defined in a python script `/my_awesome_model/name_as_you_wish.py`
- the file `/my_awesome_model/__init__.py` has a line `from .name_as_you_wish import MyAwesomeModel`
Considering the above example, SQLFlow should:
1. For a model zoo model that is to be executed in a Docker container, SQLFlow generates **one** submitter program:
- `submitter.py` would run in a Docker container as described in [Submitter Programs](model_zoo.md#Submitter-Programs)
```python
import os
import sys
sys.path += ['/']
import my_awesome_model
# Do my stuff here ...
```
1. For a model zoo model that is to be executed on PAI, SQLFlow generates **two** programs, i.e. a submitter program and a PAI entry file:
- `pai_entry.py` would run on PAI as described in [Background](#Background)
```python
import my_awesome_model
# Define all the PAI required command line options
# Do my stuff here ...
```
- `submitter.py` would run in a Docker container as described in [Submitter Programs](model_zoo.md#Submitter-Programs)
```python
import os
import tarfile
from odps import ODPS
TARBALL = "/submitters/my_awesome_model.tar.gz"
archive = tarfile.open(TARBALL, "w:gz")
# '.' is always in sys.path, so place the entry file and
# requirements at the archive root
archive.add('/my_awesome_model')
archive.add('/submitters/pai_entry.py', arcname="pai_entry.py")
archive.add('/submitters/requirements.txt', arcname="requirements.txt")  # ** see below
archive.close()
# Submit the tarball to PAI
conn = ODPS(...)
inst = conn.run_xflow("tensorflow",
"my_project",
parameters={"script": "file://" + TARBALL,
"entry": "pai_entry.py",
"other_param": "other_param_value" ... })
```
** Considering that `MyAwesomeModel` may depend on third party python packages that are not in place on PAI, we propose to introduce a mechanism to analyze the model's dependencies automatically before submitting it to PAI. The mechanism may be simply implemented in the Docker container as a `pip list` command followed by a set subtraction command(e.g. [the comm utility](https://linux.die.net/man/1/comm)), the result file is packed into the tarball that is to be submitted to PAI.
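The pip-list-plus-set-subtraction mechanism could be sketched in Python as follows. This is a hedged sketch, not SQLFlow's actual implementation: the set of packages preinstalled on PAI is assumed to be provided to the container separately (e.g. as a text file), and `local_freeze` shells out to `pip`.

```python
import subprocess

def local_freeze() -> set:
    """Packages installed in the model zoo container, `pip freeze` style."""
    out = subprocess.check_output(["pip", "freeze"], text=True)
    return set(out.splitlines())

def extra_requirements(local: set, pai_preinstalled: set) -> list:
    """Set subtraction: packages the container has but PAI (assumed) lacks."""
    return sorted(local - pai_preinstalled)

# The resulting list would be written to /submitters/requirements.txt
# and packed into the tarball together with pai_entry.py.
```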
Now let's sum up. Currently, each deployment of SQLFlow is configured to use only one platform, so we assume that all tasks of a SQLFlow deployment on PAI will be submitted to PAI.
When a user submits a SELECT statement as described in [Model Zoo Background](model_zoo.md#Background), SQLFlow should take the following actions:
1. SQLFlow Checks whether the entity after `TO TRAIN/PREDICT/ANALYZE` is from a SQLFlow model zoo. For example, `"models.sqlflow.org/sqlflow/my_awesome_model/MyAwesomeModel"` implies that the model is from model zoo `models.sqlflow.org`, and a plain `DNNClassier` implies that the model is a premade estimator. The actual mechanism may be more complicated and is still under progress.
1. Case A: the model **is not** from a model zoo:
- The SQLFlow server generates a submitter program and a single-file PAI program.
- There are no model source files. SQLFlow calls the submitter program to submit the PAI program to PAI.
1. Case B: the model **is** from a model zoo:
- The SQLFlow server generates a submitter program and a single-file PAI program.
- SQLFlow pulls the Docker image and calls k8s API to launch it on a k8s cluster to execute the following command:
```bash
docker run --rm -it \
-v /var/sqlflow/submitters:/submitters sqlflow/my_awesome_model \
python /submitters/submitter.py
```
- There are model source files. The submitter program does the following in the Docker container, as in the code snippet above:
- Analyzes model dependencies.
- Packs all the dependencies as well as the model source files to a tarball.
- Submits the tarball to PAI.
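The model-identifier check in step 1 above could be sketched as follows. This is a hypothetical helper, not SQLFlow's actual parser: it only illustrates splitting a `TO TRAIN/PREDICT/ANALYZE` target into registry, image, and class, with a bare name meaning a premade estimator.

```python
def parse_model_id(model_id: str) -> dict:
    """Split a TO TRAIN/PREDICT/ANALYZE target into its parts.

    "models.sqlflow.org/sqlflow/my_awesome_model/MyAwesomeModel" is a
    model zoo model; a plain "DNNClassifier" is a premade estimator.
    """
    parts = model_id.split("/")
    if len(parts) == 1:
        return {"kind": "premade", "class": parts[0]}
    *registry_and_image, cls = parts
    return {
        "kind": "zoo",
        "registry": registry_and_image[0],
        "image": "/".join(registry_and_image[1:]),
        "class": cls,
    }
```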
### PAI Model Zoo Data Schema
For security reasons, we propose to leverage the existing user access control of MaxCompute. As a result of this consideration, the PAI model zoo table should be built on MaxCompute, which is typically the data source of a PAI training program. The model zoo table of PAI should contain the following fields:
| model ID | creator | model zoo release | model definition | submitter program | data converter | model parameter file path | metrics | datetime | logview | statement | SQLFlow version | name |
|----------|---------|-------------------|------------------|-------------------|----------------|---------------------------|---------|----------|---------|-----------|-----------------|------|
1. *model ID*
1. *creator*
1. *model zoo release*
1. *model definition*
1. *submitter program*
1. *data converter*
1. *model parameter file path*
1. *metrics*, the metrics that measure the training results, e.g. AUC, loss, F1 etc.
1. *datetime*, a timestamp when the user start training.
1. *logview*, logview URL of a PAI task.
1. *statement*, the SQL statement which submitted the training task.
1. *SQLFlow version*, the version of SQLFlow which generated the submitter program.
1. *name*, the same meaning as its namesake in `odpscmd -e "pai -name ...`, defaults to "tensorflow"
The 1st to 7th fields are consistent with the [Model Zoo Data Schema](model_zoo.md#Model-Zoo-Data-Schema) where they are introduced. The 8th to 13th fields are used to ease usage on PAI, because they are only useful for users of PAI model zoo, we can simply omit these extra fields when publishing a PAI model to `model.sqlflow.org`. Similarly, we can keep the extra fields unfilled when submitting a PAI task using a published model from `model.sqlflow.org`.
### Model Sharing in PAI Model Zoo
Model sharing in the PAI model zoo is nearly the same as [Model Zoo Model Sharing](model_zoo.md#Model-Sharing). The only difference is that, for security reasons, users can only access the models they are authorized to use.
For example, suppose there is a model `my_first_model` that is trained by `an_analyst`, if another analyst wants to use the trained model, she not only needs to use the full name `an_analyst/my_first_model`, but also needs to have access to the model. The access control mechanism is based on an SSO system or similar systems.
### Model Publication in PAI Model Zoo
For the same security reasons, in addition to `models.sqlflow.org`, we propose to deploy a private Docker registry with stricter access control for model publication. Each user can enjoy all the models authorized from both public and private repositories. Other than this, model publication in PAI model zoo is the same as [Model Zoo Model Publication](model_zoo.md#Model-Publication).
We encourage model developers to develop their models in high level TensorFlow APIs(tf.keras) to ensure TensorFlow version compatibility.
---
title: "Connecting to SQL Server with the JDBC Driver | Microsoft Docs"
ms.custom: ""
ms.date: "07/11/2018"
ms.prod: sql
ms.prod_service: connectivity
ms.reviewer: ""
ms.technology: connectivity
ms.topic: conceptual
ms.assetid: 94bcfbe3-f00e-4774-bda8-bb7577518fec
author: MightyPen
ms.author: genemi
manager: jroth
---
# Connecting to SQL Server with the JDBC Driver
[!INCLUDE[Driver_JDBC_Download](../../includes/driver_jdbc_download.md)]
One of the most fundamental things that you'll do with the [!INCLUDE[jdbcNoVersion](../../includes/jdbcnoversion_md.md)] is to make a connection to a [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] database. All interaction with the database occurs through the [SQLServerConnection](../../connect/jdbc/reference/sqlserverconnection-class.md) object, and because the JDBC driver has such a flat architecture, almost all interesting behavior touches the SQLServerConnection object.
If a [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] is only listening on an IPv6 port, set the java.net.preferIPv6Addresses system property to make sure that IPv6 is used instead of IPv4 to connect to the [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)]:
```java
System.setProperty("java.net.preferIPv6Addresses", "true");
```
The topics in this section describe how to make and work with a connection to a [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] database.
## In This Section
|Topic|Description|
|-----------|-----------------|
|[Building the Connection URL](../../connect/jdbc/building-the-connection-url.md)|Describes how to form a connection URL for connecting to a [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] database. Also describes connecting to named instances of a [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] database.|
|[Setting the Connection Properties](../../connect/jdbc/setting-the-connection-properties.md)|Describes the various connection properties and how they can be used when you connect to a [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] database.|
|[Setting the Data Source Properties](../../connect/jdbc/setting-the-data-source-properties.md)|Describes how to use data sources in a Java Platform, Enterprise Edition (Java EE) environment.|
|[Working with a Connection](../../connect/jdbc/working-with-a-connection.md)|Describes the various ways in which to create an instance of a connection to a [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] database.|
|[Using Connection Pooling](../../connect/jdbc/using-connection-pooling.md)|Describes how the JDBC driver supports the use of connection pooling.|
|[Using Database Mirroring (JDBC)](../../connect/jdbc/using-database-mirroring-jdbc.md)|Describes how the JDBC driver supports the use of database mirroring.|
|[JDBC Driver Support for High Availability, Disaster Recovery](../../connect/jdbc/jdbc-driver-support-for-high-availability-disaster-recovery.md)|Describes how to develop an application that will connect to an AlwaysOn availability group.|
|[Using Kerberos Integrated Authentication to Connect to SQL Server](../../connect/jdbc/using-kerberos-integrated-authentication-to-connect-to-sql-server.md)|Discusses a Java implementation for applications to connect to a [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] database using Kerberos integrated authentication.|
|[Connecting to an Azure SQL database](../../connect/jdbc/connecting-to-an-azure-sql-database.md)|Discusses connectivity issues for databases on SQL Azure.|
## See Also
[Overview of the JDBC Driver](../../connect/jdbc/overview-of-the-jdbc-driver.md)
---
title: Analyze performance tools data | Microsoft Docs
ms.date: 11/04/2016
ms.topic: conceptual
helpviewer_keywords:
- performance, viewing data
- performance reports, performance data
- profiling tools, performance data
- performance, analyzing
- performance tools, reports
- Profiling Tools,data views
- Profiling Tools,reports
ms.assetid: ae3e198a-b994-4ecb-a633-dec98bd4fd45
author: mikejo5000
ms.author: mikejo
manager: jillfra
ms.workload:
- multiple
ms.openlocfilehash: aa0b3cfa9ef43e6e93058958c27d0d2992de723c
ms.sourcegitcommit: 2193323efc608118e0ce6f6b2ff532f158245d56
ms.translationtype: HT
ms.contentlocale: es-ES
ms.lasthandoff: 01/25/2019
ms.locfileid: "55031657"
---
# <a name="analyzing-performance-tools-data"></a>Analyze performance tools data

The [!INCLUDE[vsprvs](../code-quality/includes/vsprvs_md.md)] Profiling Tools performance reports let you view and analyze your application's performance issues. This section provides an overview of the performance reports and the views you can use to examine performance data.

## <a name="common-tasks"></a>Common tasks

|Task|Related content|
|----------|---------------------|
|**Use performance rules to identify problems quickly:** The Profiling Tools performance rules identify common problems and help you navigate easily to the source code that contains the problem. The detailed Help topics can often suggest a fix.|- [Using performance rules to analyze data](../profiling/using-performance-rules-to-analyze-data.md)|
|**Understand the report view details:** The Profiling Tools report views provide aggregate performance data for the processes, threads, modules, and functions of a profiling run. The data that is displayed depends on the profiling method that was used to collect the data.|- [Performance report views](../profiling/performance-report-views.md)|
|**Configure, sort, and filter the report views:** You can specify and arrange the data columns to display in a report, sort the report rows, and filter the data to include only a time segment that you specify.|- [Customizing performance tools report views](../profiling/customizing-performance-tools-report-views.md)|

## <a name="related-sections"></a>Related sections

[Comparing performance data files](../profiling/comparing-performance-data-files.md)

[Saving and exporting performance tools data](../profiling/saving-and-exporting-performance-tools-data.md)

## <a name="see-also"></a>See also

[Performance Explorer](../profiling/performance-explorer.md)

[Profiling in Visual Studio](../profiling/index.md)

[First look at profiling tools](../profiling/profiling-feature-tour.md)
# Developers' Guild Website
**Routes**
WIP
---
_id: 5a88e1aebd6dca0d5f0d282c
title: "Visual Studio 2015 Features!"
url: 'http://geekswithblogs.net/TATWORTH/archive/2014/11/27/visual-studio-2015-features.aspx'
category: articles
slug: 'visual-studio-2015-features'
user_id: 5a83ce59d6eb0005c4ecda2c
username: 'bill-s'
createdOn: '2014-11-28T20:48:26.000Z'
tags: []
---
Microsoft have released some more information about Visual Studio 2015 and the excellent new features!