query (stringlengths 1-13.4k) | pos (stringlengths 1-61k) | neg (stringlengths 1-63.9k) | query_lang (stringclasses, 147 values) | __index_level_0__ (int64, 0-3.11M)
---|---|---|---|---|
Google Satellite imagery gone wild with QGIS 2.18 I'm using the OpenLayers plugin (version 1.4.3) for my satellite imagery. When I'm at 1:565 or less (the number on the right is bigger) everything is fine. See image below. When I'm at 1:564 or greater the satellite image is incorrectly aligned / scaled. In this case, I change the number is the bottom readout (circled in red) to avoid moving the map in any way. I am using an API key. And can zoom into 1:282 before the image goes away (what appears to be the plugin's way of letting you know the imagery isn't supported at that scale. I haven't noticed this behavior with Bing imagery, or Google Hybrid, but can't zoom in as far either either case. | Scale-dependent displaying of a layer using OpenLayers plugin? When I am creating a layer in QGIS the layer position/lines does not stay the same when I zoom in closer than, it seems like 1: 1000. I am adding 2 images to explain the problem. The moment I zoom in closer than 1: 1 000 then the original layer does not display correctly anymore. Please, can you assist in this matter? | Creating Ellipse in WGS84, but with metric parameters I have the function --Ellipse(x,y,rx,ry,rotation,#of segments in 1/4 of ellipse) SELECT ST_Translate( ST_Rotate( ST_Scale( ST_Buffer(ST_Point(0,0), 0.5, $6), $3, $4), $5), $1, $2) I have points in WGS84 (x,y), radiuses in meters (rx, ry), rotation angle in degrees. And I need a new WGS84 geometry as a result. As you can see I can't call this function without conversions between WGS84 and some metric SRID. But my points aren't in a definite region. So a SRID which is optimal for one point, throws an error for another point during conversion. Is there a common way to get an optimal metric projection (SRID) for a WGS84 point? Or at least to use some rough, but world-wide universal metric projection? Or maybe there's a trick for this case? | eng_Latn | 29,100 |
How to load a .osm file as a basemap tiles in openlayers 3? I am working on an app that needs to be used offline so I have downloaded a map in an .osm file, I need to load it as a basemap. I did some research and found this code I used but it shows the map as a vector layer: new ol.layer.Vector({ source: new ol.source.Vector({ format: new ol.format.OSMXML(), url: 'Maps/Tangier_Medina.osm', projection: 'EPSG:4326' }) }) I thought at first that specifying the format ol.format.OSMXML() will be enough but I need it to be a basemap with tiles. Anyone knows how please ? | Using local OpenStreetMap (*.OSM) map with Openlayers 3? What do I need to load a .OSM map in a client side javascript app? (Imagine an offline webapp). | Georeferencing old map in QGIS? I have an old map over a part of Laos and wondering how do I georeference the map with the help of the Grid lines and other Longitude and Latitude information on the scanned map? I have searched for information on the internet and tried many times. My scanned map wont overly correctly on the base map. Any base map would be fine. As long as I can check the result. Do I add X/East and Y/North values from the grid line crossings as marked and written in some parts of the map? How do I begin georeferencing this map? Do I have to add zeros after the two digits? For example if the Easting is 24 (its in KM right? Do I add like three/four zeros to convert it to meters? I georeferenced the map with the option "From Map Canvas" in the georeferencer tool and I succeeded but I want to know the coordinate of a point on my old scanned map and write them as X/East and Y/North to georeference the map. I hope I am not babbling and someone understands what I mean! We can set the CRS to WGS84, EPSG:4326. | eng_Latn | 29,101 |
Projecting INEGI shapefiles from Mexico - not lining up I have two INEGI shapefiles from Mexico (2000 and 2010) that I want to line up with one another. The 2010 file projects in the right location and I would like the 2000 file to match/line up with the 2010 one. The shapefiles contain various cities in Mexico - some line up almost identically (see example of Minatitlan) while others are dramatically different (see example below of Tetepango). I originally posted a question about getting data from 2000 to project in the correct location in Mexico, which made the cities are least be in the generally correction location! See the response here: What can I do to make the 2000 file match/line up with the 2010 one? 2010/2000 Data Source: Projected Coordinate System: Conica Conforme de Lambert Projection: Lambert_Conformal_Conic False_Easting: 2500000.00000000 False_Northing: 0.00000000 Central_Meridian: -102.00000000 Standard_Parallel_1: 17.50000000 Standard_Parallel_2: 29.50000000 Scale_Factor: 1.00000000 Latitude_Of_Origin: 12.00000000 Linear Unit: Meter Geographic Coordinate System: ITRF92 Datum: D_GRS_1980 Prime Meridian: Greenwich Angular Unit: Degree | How to make two feature layers congruent using ArcGIS for Desktop? I have two feature layers (polyline/polygon) which should be congruent but there's an offset of about 220 meters. The Georeferencing tool only works with raster layers. How do I do the equivalent with a feature layer? | Assurance that ST_Envelope works with geography as expected Using PostGIS, I have a table of data with geographic points defined as last_known_coordinates GEOGRAPHY(POINT,4326). I want to find all of these within a geographic bounding box with given top left and bottom right geographic coordinates, and be sure that the calculations are geographic, not geometric, e.g. that the results account for the curvature of the earth. The query I'm using contains the fragment: last_known_coordinates && ST_MakeEnvelope(?, ?, ?, ?, 4326)::geography As you can tell I'm a bit new to this, and I know what I want, but not necessarily how to do it correctly. Other answers hint that this is correct, but I'd like to know for sure: Will this point data and SQL fragment take into account the curvature of the Earth in the bounding box? | eng_Latn | 29,102 |
Why does XLAT mean 'translate'? It is said that XLAT is an abbr of translate. But I don't understand how come it ends up like that? There is no site on internet would explain it but they're all agree that XLAT is shorten of translate. I can get that LAT is in the end part of translate (LATe) but what about the X? | Why can "trans" be replaced with an x? I can't think of an example, so I may be wrong about this, but I think I've seen people replace the prefix "trans" as in transport with an x. "Cross" makes sense, as in "railroad crossing", and I guess "chris" is sort of a punny extension, the extension being like "chris cross" and the example being "xmas", but why trans? | Calculating extent of custom tiles? Problem The picture is just an example of my case, but I am left with blank space on the left when I load all the tiles and I get error message in the console: Failed to load resource: the server responded with a status of 404 (Not Found) http://path/to/the/tile//base/8/6/53.png etc. I could guess and limit the extent of the layer, but I would really like to know is there any way to calculate it and also be able to get rid of the error message. What I know about the tiles? Originally they are served as TMS They represent png images 256*256 They are in projection: EPSG:3857 In OpenLayers2 they were called like TMS layer In OpenLayers3 (v.3.8.2) they are called like XYZ layer (that includes -y): var baseLayer = new ol.layer.Tile({ source: new ol.source.XYZ({ url: 'http://path/to/the/tiles/base/{z}/{x}/{-y}.png' }) }); Thoughts and questions How can I calculate extent of the tiles? Or in another words how can I find out zmin zmax xmin ymin xmax ymax or whole range of x/y? What does x/y really represent? Are they coordinates of the left bottom corner or center of the tile? In which direction does x/y increase? How can I figure out x/y of at least one of the tile, for example left bottom one? (I have modified the question for better understanding of the problem) | eng_Latn | 29,103 |
Linear motion on surface of the earth I am looking for a formula to compute the following: Assume you are given a GPS coordinate $(\alpha, \beta)=(\text{latitude}, \text{longitude})$ in decimal format (not in the format involving minutes and seconds), a time $t$ and a speed vector $v=(v_x, v_y)$. Now a point moves from $(\alpha, \beta)$ with speed $v$ for the time $t$. The question is: Which GPS coordinate does it end up at (, assuming the earth is a ball with radius $R$)? I am sure there is a formula for that already but I can´t find it. The solution can be an approximation, since I need this for a real world problem. I know that this problem is not strictly well-defined since the speed vector is in Cartesian coordinates, so I am assuming that the earth is locally flat. Still, there should be some almost-solution to my problem, since I am dealing with rather short times and small speed vectors which makes the curvature of the earth almost irrelavant. Any help will be appreciated! Leon | How to calculate a heading on the earths surface? Given an initial position and a subsequent position, each given by latitude and longitude in the WGS-84 system. How do you determine the heading in degrees clockwise from true north of movement? | Projecting sp objects in R I have a number of shapefiles in different CRSs (mostly WGS84 lat/lon) that I'd like to transform into a common projection (likely Albers Equal Area Conic, but I may ask for help on choosing in another question once my problem gets better-defined). I spent a few months doing spatial stats stuff in R, but it was 5 years ago. For the life of me, I cannot remember how to transform an sp object (e.g. SpatialPolygonsDataFrame) from one projection to another. Example code: P4S.latlon <- CRS("+proj=longlat +datum=WGS84") hrr.shp <- readShapePoly("HRR_Bdry"), verbose=TRUE, proj4string=P4S.latlon) # Shapefile available at # http://www.dartmouthatlas.org/downloads/geography/hrr_bdry.zip # but you must rename all the filenames to have the same # capitalization for it to work in R Now I have a SpatialPolygonsDataFrame with appropriate projection information, but I'd like to transform it to the desired projection. I recall there being a somewhat unintuitively-named function for this, but I can't remember what it is. Note that I do not want just to change the CRS but to change the coordinates to match ("reproject", "transform", etc.). Edit Excluding AK/HI which are annoyingly placed in Mexico for this shapefile: library(taRifx.geo) hrr.shp <- subset(hrr.shp, !(grepl( "AK-" , hrr.shp@data$HRRCITY ) | grepl( "HI-" , hrr.shp@data$HRRCITY )) ) proj4string(hrr.shp) <- P4S.latlon | eng_Latn | 29,104 |
Trigonometry problem involving vectors An airplane is headed 123 degrees with an air speed of 310 mph with a wind of 40 mph on a bearing of 200 degrees. Find the ground speed to the nearest mph, and the true course of the airplane to the nearest degree. I tried to draw the vectors but I am unable to figure it out.Please help me. | Airplane ground speed and direction, given airspeed and wind An airplane heads northeast at an airspeed of 900 km/hr, but there is a wind blowing from the west at 60 km/hr. In what direction does the plane end up flying? (angle). What is its speed relative to the ground? I drew a picture. I think I am having a hard time understanding how to solve the problem. I want to add the wind vector to the plane vector - is that as simple as 840 km/hr? After really adding the vectors together, do I do: (added vectors)cos(45)? | What units does the rotation argument expect when creating objects? I want to create a lamp in blender using python. I want to give the lamp some rotation. As per the I can set the initial rotation using a tuple of numbers in the function call like this rotation=(0.0, 0.0, 0.0) this is the specifc documentation for the rotation parameter: rotation: (float array of 3 items in [-inf, inf], (optional)) – Rotation, Rotation for the newly added object What units is this parameter specified with? It does not seem to be in degrees or radian. When I try with my example code here where I use rotation=(0, 1, 0) to get 1 degree on the y axis. I end up getting 57.296 degrees after running the code. import bpy bpy.ops.object.lamp_add(type='AREA', view_align=False, location=(0, 0, 0), rotation=(0, 1, 0), layers=(True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False)) | eng_Latn | 29,105 |
Drawing appropriately coordinate grids I am having an issue while trying to show the coordinates in the print layout in QGIS. While the coordinate format is in decimal and suffix, it is ok. I need the format in degree, minutes, and suffix. But when I had selected that option, it was showing both north and south points in the latitude, and also the east and west in the longitude. How to solve this problem? There was a like this one, but could not solve my problem. | Adding Grids in Project Manager of QGIS I am using QGIS 3.12. I am new to QGIS after having worked with ArcGIS Desktop for a while. I am having issues with adding a grid to my project. I am using CRS ESPG:3587 WGS 84 / Pseudo-Mercator and noticed that the grid is showing the wrong coordinates and have included the parameters I have set for the grids. | Calculate distance between two latitude-longitude points? (Haversine formula) How do I calculate the distance between two points specified by latitude and longitude? For clarification, I'd like the distance in kilometers; the points use the WGS84 system and I'd like to understand the relative accuracies of the approaches available. | eng_Latn | 29,106 |
My axles kept on breaking Two weeks ago my rear axle break into two pieces and I replace it with a new one. Yeterday that axle broke while trying to lift the front wheel up. Do you know what might be causing this? | Reoccurring Bent Axles, any reasons or upgrades? Earlier this month I had a issue with a broken axle. It was the original that was sold with the bike and lasted around 5,000 miles. I went to my LBS and bought a new axle (unbranded, labeled, etc, paid $13, came with cones, spacers, and hardware). I did the install myself and had everything adjusted nicely. The next time I rode it was on my commute to work, but I didn't get 4 miles into it when I found that I bent the axle. I took the wheel to the shop but they wouldn't do anything about it since I didn't have them do the work. However, I figured this was a fluke and asked them for a new axle and to true the wheel. Once again I did the install of the axle and only got 15 miles out of it this time before bending it. Before I got home I felt the wheel get stiff, but it stayed true. The question is what will be a cause of bending 2 axles in 20 miles? I think the guy at the shop said it was a cro-moly axle (while talking to another person today at the shop, these may not be), and he couldn't believe that I bent it. If so, where would such an axle fall in the quality hierarchy? Or is my hub toast? While the wheel was being repaired, I swapped to a new original wheel for about 2 weeks. I have not had any issues with it, but I wanted to keep it as new as possible. Reading other questions I saw a comment that said not to use a hub where the bearing was catching. I did have one in this wheel but I figured since it is a cheap wheel I would just burn it up and move on to a better one. Besides, it has lasted 5k miles without issue to this point. The bike in question is a 2008 Schwinn High Timber with Joytech freewheel hubs. I do use a trailer with this bike using a Burley axle hitch. While I had the hitch installed on both axles I bent, I did not have either trailer hooked up to them. Something I would like to note is that when I removed the first bent axle (axle #2) that the cones on it had a wavy bearing track. I didn't think too much of it, and I put them on axle #3. Only having a few miles on them shouldn't have made a difference? | How can I stitch a panorama correctly if I moved the camera along the horizontal axis? Here in Argentina, we have . All the houses and walls on that street have some kind of mosaic stuck to it, and it's very cool. It was made by a local artist . Because to this piece of urban art is two blocks long, I've decided to make a panorama of it, by moving myself on an horizontal axis while taking photos. I mean, I took one photo, walked one step deeper along the street, took another photo, and so on. When I tried to stitch it in AutoPano, the following deformed thing came out: () And the other side of the block: () After this, I've learned about parallax error and why you have to avoid moving when making panoramas. I mean, there are a lot of connection errors on both images. Especially in the second one, the part with the corner is quite problematic to stitch because to as I moved, the perspective of the view changed a lot. So, is there any way to stitch this kind of panorama correctly? Would this only work on plain walls? | eng_Latn | 29,107 |
Buffering in QGIS3 is returning giant circles I'm trying to buffer some lines in QGIS. However, the result I'm returning are giant circles. I've tried several different CRS and made sure the project and layer had the same CRS but the result is still the same. The CRS is also not decimal but meters. Not sure what I'm missing. Any suggestions? | Understanding QGIS buffer tool units I have had no luck getting the buffer tool to accept anything but degrees as units of measure. I have found lots of stuff saying the layer needs to be reprojected and saved but it hasn't worked at all for me. Is there a way I could create a buffer without using ftools or at least force the units to meters somehow? As a workaround I converted meters to degrees (lat) and used that but the final product needs to be as close to reality as possible. Things I've tried: setting every unit option I could find to meters (where possible). setting everything to NAD83/Maryland (data is for Washington, DC) and saving it as such (as layers in ESRI shape files). reimporting the reprojected layers setting relevant layers to Google Mercator The was tried followed by creating a buffer. Many were tried in combination. QGIS 1.7.3 Slackware64 current (qgis from SBo-13.37 repo, tried on multilib and plain 64it with same results) | Local Coordinate to Geocentric The ultimate question, I need a transformation matrix to take a point in local space representing a roughly 500m x 500m place in New Mexico centered at -108.619987456 long 36.234600064 lat. The final output needs to be in geocentric coordinates and I can during initialization get any number of points on the map such as the corners, center, western most along the center etc. During run time I would like to have a transformation matrix that can be applied to a position in local space expressed in meters and get back an approximate GCC coordinate. The center of the point in GCC: -1645380 -4885138 3752889 The bottom left corner of the map in GCC: -1644552, -4881054, 3749220 During run time I need to be able to multiply (-250, -250, 0) with the transformation matrix and get something close to (-1644552, -4881054, 3749220). I've spent more than a week trying to research a solution for this and so far nothing. Given an identity matrix as our starting position and orientation. Then using geotrans and the known lat, long, and height of the starting position I get an x,y,z geocentric coordinate. A vector from the origin of (0,0,0) gives both the up and translation for the matrix. However, I need the forward and right so that I can pass a distance in meters from the origin into the transformation matrix and get a roughly accurate GCC. Do I have all of the inputs I need to calculate the right and forward? Inputs Origin: 0, 0, 0 Global: -1645380, -4885138, 3752889 Up (Normalized): Global - Origin Desired Outputs Right: ? ? ? Forward: ? ? ? So with the right and forward added to the up and translation I already calculated I would have a transformation matrix. I could then apply the matrix to a vector of say (50, 50, 0) and get something within about 0-3 cm of the output if I feed the lat/long back into geotrans. This matrix would only ever be used small maps of about 500m x 500m so the curvature of the earth is negligible. In reply to whuber, I don't know your level of experience so I will start with the very basics. An identity matrix is a 4x4 matrix that you can multiply by a vector and it returns the vector unchanged. Such as below. 
1 0 0 x=0 0 1 0 y=0 0 0 1 z=0 0 0 0 1 The x, y, and z of the matrix are how much to translate the vector by and the first 3 rows and columns are the rotation. Below is a tranformation matrix that does nothing to the orientation of a vector but does translate. This is what you would want for two axis aligned worlds in different locations. 1 0 0 13 0 1 0 50 0 0 1 -7 0 0 0 1 If you were to multiply a vector of (10, 10, 10) with the above transformation matrix you would get an output of (23, 60, 3). My problem is the axes are not aligned and the "plane" of the world I am trying to get the local coordinate converted to is projected on the ellipsoid of the earth. Geotrans is library that you can use to pass a coordinate from one system such as geodetic or GDC (lat, long, height) and get back another such as geocentric or GCC (x,y,z). For example: If my game world was representing a map centered exactly on the prime meridian and equator the transformation matrix would look something like below. I am just guesstimating the Y and Z rotations and might have them flipped and/or the wrong sign. 0 0 1 6378137 0 1 0 0 1 0 0 0 0 0 0 1 | eng_Latn | 29,108 |
Can I use AR view if my phone doesn't have a gyroscope? My 'AR' view in Pokemon Go doesn't work. Every time I switch it to on it gives me an orientation error. I found out that it could be because my phone lacks a Gyroscope. My question is, will I be able to play Pokemon Go in AR mode? | How do I get Pokémon Go to detect my phone's orientation? When initiating a battle it starts off in AR mode, but then warns me saying my phone's orientation is not detected. I am then forced to turn off AR mode. Has anybody experienced this? If so, what are the solutions? Per comments, I am unsure this is a permissions issue. I have everything turned on. | I'm rotating an object on two axes, so why does it keep twisting around the third axis? I see questions come up quite often that have this underlying issue, but they're all caught up in the particulars of a given feature or tool. Here's an attempt to create a canonical answer we can refer users to when this comes up - with lots of animated examples! :) Let's say we're making a first-person camera. The basic idea is it should yaw to look left & right, and pitch to look up & down. So we write a bit of code like this (using Unity as an example): void Update() { float speed = lookSpeed * Time.deltaTime; // Yaw around the y axis using the player's horizontal input. transform.Rotate(0f, Input.GetAxis("Horizontal") * speed, 0f); // Pitch around the x axis using the player's vertical input. transform.Rotate(-Input.GetAxis("Vertical") * speed, 0f, 0f); } or maybe // Construct a quaternion or a matrix representing incremental camera rotation. Quaternion rotation = Quaternion.Euler( -Input.GetAxis("Vertical") * speed, Input.GetAxis("Horizontal") * speed, 0); // Fold this change into the camera's current rotation. transform.rotation *= rotation; And it mostly works, but over time the view starts to get crooked. The camera seems to be turning on its roll axis (z) even though we only told it to rotate on the x and y! This can also happen if we're trying to manipulate an object in front of the camera - say it's a globe we want to turn to look around: The same problem - after a while the North pole starts to wander away to the left or right. We're giving input on two axes but we're getting this confusing rotation on a third. And it happens whether we apply all our rotations around the object's local axes or the world's global axes. In many engines you'll also see this in the inspector - rotate the object in the world, and suddenly numbers change on an axis we didn't even touch! So, is this an engine bug? How do we tell the program we don't want it adding extra rotation? Does it have something to do with Euler angles? Should I use Quaternions or Rotation Matrices or Basis Vectors instead? | eng_Latn | 29,109 |
Changing rotation for each data driven page dynamically? I have a .mxd using data driven pages. In the .mxd there is a shape containing foto locations displayed with an arrow. At the top of the arrow there is a label containing the foto id visualized, the roation of the arrow is the direction of view of each photo. The problem now is, that the data driven pages have different rotations, so the labels are not positioned right in each page. I am using the adjusting placement by rotation setting. How can I dynamically change the rotation for each data driven page for each lable? | Rotating hatch labels based on orientation changes in Data Driven Pages Strip maps? I've created a strip map for a linear feature and have it loaded as a data driven page. I want to show stationing for the line. The stationing displays properly on the first data driven page but on each subsequent page the stationing labels get messed up as the map orientation changes. I want the stationing labels perpendicular to the linear feature as shown in the first image. The tick marks seem to rotate and stay perpendicular to the line but the labels won't. How can I fix this? | Workflow for UV orientation How to I deal with unwrapped UVs orientation? Delete the default Cube. Add a Cylinder Edit mode and edge select the Top and the Bottom and a single connecting edge. Make them into seams UV Unwrap (standard Unwrap mode) The unwrapped UV now shows two circles above a row of quads. I can save this and open it in an image editor and paint over those areas and get the texture to align on the cylinder back in blender. But, the pattern(just happens to be text) that was inserted in place of the row of quads is now upside down on the model. Any attempt to flip or align just makes things worse, from upside down to mirrored or the Top/Bottom not aligning. So my real question is how could I have known that the quads represented an upside down view of the world? Do other blender users export UVs and mark them with arrows and colours just to see how they align on a model before actually painting for real? I tried selecting the edges in step 3 in different orders before making them into seams and it made no difference. I also looked at the Normals but they appear to be only Inside/Outside not Up/Down. Blender 2.70 | eng_Latn | 29,110 |
How can I decode a Hill Cipher without a key? On my exam, we had to solve this problem: message matrix is: [20, 19, 14; 17, 0 10] and the ciphertext matrix is: [18, 15, 16; 3, 24, 24]. Find key matrix. I couldn't solve this problem, because I tried to find key matrix by multiplying [20, 19, 14; 17, 0 10]^-1 and [18, 15, 16; 3, 24, 24], but I wasn't able to calculate [20, 19, 14; 17, 0 10]^-1, because it's not a square matrix. How can I decode a Hill Cipher without a key? | Hill Cipher known plaintext attack I know a plaintext - ciphertext couple of length 6 for a hill cipher where its key is a [3x3] matrix. Based on what I've read and learned, to attack and crack keys of [n x n], if we know a plaintext - ciphertext duo of length $n^2$ then we have our set of n equations with n variables, and this is generally solvable. However in my case there is only length of $6$ instead of $n^2 = 9$ . My question is, how will I solve this problem and find the key? Or, since this is an obvious homework question, what should be my way of thinking? I cannot think of anything else but matrix multiplications and inverses but they do not help me at all. | Georeferencing old map in QGIS? I have an old map over a part of Laos and wondering how do I georeference the map with the help of the Grid lines and other Longitude and Latitude information on the scanned map? I have searched for information on the internet and tried many times. My scanned map wont overly correctly on the base map. Any base map would be fine. As long as I can check the result. Do I add X/East and Y/North values from the grid line crossings as marked and written in some parts of the map? How do I begin georeferencing this map? Do I have to add zeros after the two digits? For example if the Easting is 24 (its in KM right? Do I add like three/four zeros to convert it to meters? I georeferenced the map with the option "From Map Canvas" in the georeferencer tool and I succeeded but I want to know the coordinate of a point on my old scanned map and write them as X/East and Y/North to georeference the map. I hope I am not babbling and someone understands what I mean! We can set the CRS to WGS84, EPSG:4326. | eng_Latn | 29,111 |
How can I add a graticule similar to the DeLorme Atlas to a map in QGIS? I've got a map in NJ State Plane (Feet) projection and want to add a grid to it with lat and long intervals of 1 minute, and 0.1 minute tick marks. I'm trying to do this in the grid tab in the print composer but getting wonky results (probably some user error here! | Why is the QGIS map composer grid showing unreasonable lat/lon? I need to print a map with coordinates in the format of Degree, Minute and second. CRS is set to WGS 84 (EPSG 3857) for both layers and on the project properties it is set to enable transformation on the fly. But then, when in map composer I get the following coordinates, which make no sense to me. | US / Mexican border data I'm interested in locating free GIS shapefile data representing the US / Mexican border. I have found a point layer depicting the border crossing points but I'm interested in a polyline feature representing the actual border. | eng_Latn | 29,112 |
lets say i have a point on the map New York, NY, USA Latitude: 40.712784 | Longitude: -74.005941 If i move 10 metres north in 20 seconds ,can i use the coordinates i have to calculate the new coordinates?. | I'm looking for an algorithm which when given a latitude and longitude pair and a vector translation in meters in Cartesian coordinates (x,y) would give me a new coordinate. Sort of like a reverse Haversine. I could also work with a distance and a heading transformation, but this would probably be slower and not as accurate. Ideally, the algorithm should be fast as I'm working on an embedded system. Accuracy is not critical, within 10 meters would be good. | I have latitude and longitude as 19.0649070739746 and 73.1308670043945 respectively. In this case both coordinates are 13 decimal places long, but sometimes I also get coordinates which are 6 decimal places long. Do fewer decimal points affect accuracy, and what does every digit after the decimal place signify? | eng_Latn | 29,113 |
I'm currently developing a 3D game engine in C# using OpenTK. I have basic game objects, and each game object has transform (translation, rotation and scaling). A game object can have components (much like Unity). One game object (The camera game object) has three components, A camera, free move and a free look component. The free move component allows the user to move the camera around the scene, whilst the free look camera is SUPPOSED to rotate the camera. However, I get this odd effect: (Note that, the plane isn't changing position, it seems the camera is ... turning upside down?) On top of that, once I rotate the camera, moving the camera (using the arrow keys) gets really weird, up can become down, left can become right. It's all very odd. Anywho, here is the update method for my Free Look component: protected override void OnUpdate(object sender, UpdateEventArgs e) { Rectangle bounds = Game.GetCurrentGame().Window.Bounds; Vector2 center = new Vector2(bounds.Left + (bounds.Width / 2), bounds.Top + (bounds.Height / 2)); Vector2 mousePosition = new Vector2(System.Windows.Forms.Cursor.Position.X, System.Windows.Forms.Cursor.Position.Y); Vector2 deltaPosition = center - mousePosition; bool rotX = deltaPosition.X != 0; bool rotY = deltaPosition.Y != 0; if(rotY) Transform.Rotate(new Vector3(0, 1, 0), MathHelper.DegreesToRadians(-deltaPosition.X * sensitivity)); if(rotX) Transform.Rotate(new Vector3(1, 0, 0), MathHelper.DegreesToRadians(-deltaPosition.Y * sensitivity)); if (rotY || rotX) System.Windows.Forms.Cursor.Position = new Point((int)center.X, (int)center.Y); base.OnUpdate(sender, e); } here is the Rotate method in my transform class: public void Rotate(Vector3 axis, float angle) { rotation = (Quaternion.FromAxisAngle(axis, angle) * rotation).Normalized(); } I hope someone can help me get to the bottom of this, I've been so frustrated! If there's any other code you need to see, just let me know. | I see questions come up quite often that have this underlying issue, but they're all caught up in the particulars of a given feature or tool. Here's an attempt to create a canonical answer we can refer users to when this comes up - with lots of animated examples! :) Let's say we're making a first-person camera. The basic idea is it should yaw to look left & right, and pitch to look up & down. So we write a bit of code like this (using Unity as an example): void Update() { float speed = lookSpeed * Time.deltaTime; // Yaw around the y axis using the player's horizontal input. transform.Rotate(0f, Input.GetAxis("Horizontal") * speed, 0f); // Pitch around the x axis using the player's vertical input. transform.Rotate(-Input.GetAxis("Vertical") * speed, 0f, 0f); } or maybe // Construct a quaternion or a matrix representing incremental camera rotation. Quaternion rotation = Quaternion.Euler( -Input.GetAxis("Vertical") * speed, Input.GetAxis("Horizontal") * speed, 0); // Fold this change into the camera's current rotation. transform.rotation *= rotation; And it mostly works, but over time the view starts to get crooked. The camera seems to be turning on its roll axis (z) even though we only told it to rotate on the x and y! This can also happen if we're trying to manipulate an object in front of the camera - say it's a globe we want to turn to look around: The same problem - after a while the North pole starts to wander away to the left or right. We're giving input on two axes but we're getting this confusing rotation on a third. 
And it happens whether we apply all our rotations around the object's local axes or the world's global axes. In many engines you'll also see this in the inspector - rotate the object in the world, and suddenly numbers change on an axis we didn't even touch! So, is this an engine bug? How do we tell the program we don't want it adding extra rotation? Does it have something to do with Euler angles? Should I use Quaternions or Rotation Matrices or Basis Vectors instead? | The new Top-Bar does not show reputation changes from Area 51. | eng_Latn | 29,114 |
I am computing the distance of a map I am getting from ArcGIS' REST services by using a calculator as provided by . In a nutshell, it uses a GeodeticCalculator for distance computation based on latitude and longitude. I then use the distance and my display width to compute a scale. My display is currently 362*230 (custom display hence the off measurements). When comparing the distances on screen to the distances as given by Google maps, the measurement always overestimated. However, it did so by a factor of roughly 1.2. I not use this as a correction coefficient and am accurate to about 1m in 1 km. While this solution works (for now), I would like to know where this offset could come from. Below is a bit of code to show how the computation is done. Note that scale is a field of the class this method resides in. /** Relevant fields */ private GeodeticCalculator calculator = new GeodeticCalculator(CRS.decode("EPSG:3857")); private static final double CORRECTION = 1.2; /** More code left out for reasons of brevity. */ public double computeScale(double latMin, double latMax, double lonMin, double lonMax) { calculator.setStartingPosition(new DirectPosition2D(latMin, lonMin)); calculator.setDestinationPosition(new DirectPosition2D(latMax, lonMax)); double distance = calculator.getOrthodromicDistance(); scale = distance / (Calibration.getTableWidth() / 10) / CORRECTION; return distance; } | I'm currently reading documentation about map projections to understand the source code of the Proj4 project. The scale factor is named in a variety of sources I read. This sources explained its definition and its value for some projections. In the source code of Proj4, for the mercator projection (sphere and ellipse case), the scale factor influences the coordinates on the projection : //P->k0 is the scale factor xy.x = P->k0 * lp.lam; xy.y = - P->k0 * log(pj_tsfn(lp.phi, sin(lp.phi), P->e)); Why and how I should use the scale factor during the computation of the projection ? Is there any valuable resources on the web ? This question in asked in the sense of projection computation. I can find the formula for the inverse and forward projection as well as the scale factor in a various of resources, but no one explains how I should use both in a algorithm. You have the definition of the projection and the definition of the scale factor, but it's not clearly written that I should multiply or divide the result by the scale factor. Is it a general rule : if I find the formula for any projection with the related scale factor, should I, in all cases, always divide or multiply the results by the scale factor ? | The new Top-Bar does not show reputation changes from Area 51. | eng_Latn | 29,115 |
If I have a spawner at y level 12 and AFK there will it run my farms at the the same x and z level above ground (will it load the chunks?) | What's the max height of blocks for a redstone contraption to still work even I'm in the surface below? Like if I build a pillar to the max height of blocks and build a redstone contraption there, is it still going to work even I'm on the surface below? | I've been modeling an F-15, but when I rotate the flaps on Y-Axis the Flaps offset from the wing and they look like they are about to fall out, so I end up correcting it by rotating in the X and Z-axis to align them correctly. However when I rotate them back from the Y-Axis they look distorted again because I did rotations on the X and Z axis. I tried riding it, no help. I saw something about custom orientation to fix it but that was not clearly explained and it was done on an ideal plane that was (0,0,0); mine is pre-rotated on all axis. I'll take scripts if needed and I'm using Blender 2.8. When I rotate it on the Y-Axis: Fixing it using X and Z rotation: Rotating it back on the Y distorted it again: Here's the wing with ailerons and flaps: | eng_Latn | 29,116 |
How to set my widget to bottom face my object or any face that ı wanted? İn edit mode: shift + S in object mode: Ctrl + alt + shift + C not helping... In 3dsmax for example there is a button which allows you can play with your widget without interferes with your object. so in short ı wanna do the thing same in the picture Thanks people. ı appreciated. Its solved. | Change pivot or local origin of an object How can I change the local origin of an object, without changing the object location in the world space. This is the object located at position (0,0,0) in the world space This is the same object located at position (0,1,0) in the world space, but with its local origin offset so that the object is in the same place Is there a way to do this in a single step? | Bake Simple Deform animation into mesh for Unity3D export? I'm trying to use a .blend file with a keyframe animation on the angle of a Simple Deform modifier in Unity3D. Unity3D however does not know how to handle modifiers and hence cannot play the animation. How can I bake/save the animation to the .blend file so that Unity3D can play it? | eng_Latn | 29,117 |
Rigid body dynamics: Ideal rod with 2 masses A rigid body is formed by two masses $m_1$ and $m_2$ ($m_1$ is not equal to $m_2$) joined by a rod of negligible mass and length $4L$. The system is initially at rest on a smooth horizontal surface. If a horizontal and perpendicular force is applied to the middle of the rod, then it follows that: I think the answer is d), but I'm not sure. This is how I reasoned: Considering that $m_1$ is left and $m_2$ is right, the center of mass (CM) is not in the middle because the masses are different, what means the CM is either left or right from the middle of the rod depending on the relationship between $m_1$ and $m_2$. However this relationship is not stated, but we can make the following assumption: let's say that $m_1 = 2m_2$. Then, the center of mass would be left closer to $m_1$. First, it is obvious that if the surface is frictionless, the system will accelerate from rest according to Newton's 2nd law for the CM. But... how can I determine if the system rotates or not? Will it have angular acceleration or not? In general, how can I know exactly when a rigid body has both linear and angular acceleration? I know that according to Newton's 2nd law for rotational dynamics, angular acceleration ($\alpha$) depends on inertia (of the system) and net torque. But since net torque is calculated respect to a rotation axis (and inertia too) how can I choose it and which is it? My professor told us there's sometimes a natural axis rotation, but I don't get it... The rod doesn't have an end fixed! Please, I want a clear and deep explanation! | If I push or hit an object in space will it rotate or move along a straight line? If I push or hit an object in space (vacuum and no gravitation) in direction what is not going trough its centroid, will it rotate or move along in straight line? I expect that on earth it will depend on what is less difficult for the object (rotation or linear movement). So the object will do some kind of combination of both movements (rotating and also moving along the direction of impulse or force). But how could an object "decide" what to do in space, where is not resistance? | Local Coordinate to Geocentric The ultimate question, I need a transformation matrix to take a point in local space representing a roughly 500m x 500m place in New Mexico centered at -108.619987456 long 36.234600064 lat. The final output needs to be in geocentric coordinates and I can during initialization get any number of points on the map such as the corners, center, western most along the center etc. During run time I would like to have a transformation matrix that can be applied to a position in local space expressed in meters and get back an approximate GCC coordinate. The center of the point in GCC: -1645380 -4885138 3752889 The bottom left corner of the map in GCC: -1644552, -4881054, 3749220 During run time I need to be able to multiply (-250, -250, 0) with the transformation matrix and get something close to (-1644552, -4881054, 3749220). I've spent more than a week trying to research a solution for this and so far nothing. Given an identity matrix as our starting position and orientation. Then using geotrans and the known lat, long, and height of the starting position I get an x,y,z geocentric coordinate. A vector from the origin of (0,0,0) gives both the up and translation for the matrix. However, I need the forward and right so that I can pass a distance in meters from the origin into the transformation matrix and get a roughly accurate GCC. 
Do I have all of the inputs I need to calculate the right and forward? Inputs Origin: 0, 0, 0 Global: -1645380, -4885138, 3752889 Up (Normalized): Global - Origin Desired Outputs Right: ? ? ? Forward: ? ? ? So with the right and forward added to the up and translation I already calculated I would have a transformation matrix. I could then apply the matrix to a vector of say (50, 50, 0) and get something within about 0-3 cm of the output if I feed the lat/long back into geotrans. This matrix would only ever be used small maps of about 500m x 500m so the curvature of the earth is negligible. In reply to whuber, I don't know your level of experience so I will start with the very basics. An identity matrix is a 4x4 matrix that you can multiply by a vector and it returns the vector unchanged. Such as below. 1 0 0 x=0 0 1 0 y=0 0 0 1 z=0 0 0 0 1 The x, y, and z of the matrix are how much to translate the vector by and the first 3 rows and columns are the rotation. Below is a tranformation matrix that does nothing to the orientation of a vector but does translate. This is what you would want for two axis aligned worlds in different locations. 1 0 0 13 0 1 0 50 0 0 1 -7 0 0 0 1 If you were to multiply a vector of (10, 10, 10) with the above transformation matrix you would get an output of (23, 60, 3). My problem is the axes are not aligned and the "plane" of the world I am trying to get the local coordinate converted to is projected on the ellipsoid of the earth. Geotrans is library that you can use to pass a coordinate from one system such as geodetic or GDC (lat, long, height) and get back another such as geocentric or GCC (x,y,z). For example: If my game world was representing a map centered exactly on the prime meridian and equator the transformation matrix would look something like below. I am just guesstimating the Y and Z rotations and might have them flipped and/or the wrong sign. 0 0 1 6378137 0 1 0 0 1 0 0 0 0 0 0 1 | eng_Latn | 29,118 |
SVG to GeoJSON conversion - incorrect render I have a vector image in SVG format.(without georeferences) I am trying to display it using OpenLayers to display properties of different polygons in it. I tried converting it to GeoJSON but the resulting render appears to be stretched (ratios are not preserved). Is there a transformation or a map projection that I am missing or recreating a GeoJSON from scratch the only relevant choice? | Transformation issue with SVG to GeoJSON conversion I have a vector image in SVG format. (without georeferences) I am trying to display it using OpenLayers to display properties of different polygons in it. I tried converting it to GeoJSON but the resulting render appears to be stretched (ratios are not preserved). Is there a transformation or a map projection that I am missing or recreating a GeoJSON from scratch the only relevant choice? | Understanding QGIS buffer tool units I have had no luck getting the buffer tool to accept anything but degrees as units of measure. I have found lots of stuff saying the layer needs to be reprojected and saved but it hasn't worked at all for me. Is there a way I could create a buffer without using ftools or at least force the units to meters somehow? As a workaround I converted meters to degrees (lat) and used that but the final product needs to be as close to reality as possible. Things I've tried: setting every unit option I could find to meters (where possible). setting everything to NAD83/Maryland (data is for Washington, DC) and saving it as such (as layers in ESRI shape files). reimporting the reprojected layers setting relevant layers to Google Mercator The was tried followed by creating a buffer. Many were tried in combination. QGIS 1.7.3 Slackware64 current (qgis from SBo-13.37 repo, tried on multilib and plain 64it with same results) | eng_Latn | 29,119 |
Custom auto properties in C# I have the following class with auto properties: class Coordinates { public Coordinates(int x, int y) { X = x * 10; Y = y * 10; } public int X { get; set; } public int Y { get; set; } } As you can see from the constructor I need the value to be multiplied by 10. Is there anyway to do it without removing autoproperties? I tried the following not thinking that it causes recursion and then everything goes fubar public int X { get {return X;} set{ X *= 10;} } I would like to assign values to X and Y multiplied by 10. Coordinates coords = new Coordinates(5, 6); // coords.X = 50 coords.Y = 60 coords.X = 7; // this gives 7 to X but I would like it to be 70. | C# 3.0 Auto-Properties - Is it possible to add custom behaviour? I would like to know if there is any way to add custom behaviour to the auto property get/set methods. An obvious case I can think of is wanting every set property method to call on any PropertyChanged event handlers as part of a System.ComponentModel.INotifyPropertyChanged implementation. This would allow a class to have numerous properties that can be observed, where each property is defined using auto property syntax. Basically I'm wondering if there is anything similar to either a get/set template or post get/set hook with class scope. (I know the same end functionality can easily be achieved in slightly more verbose ways - I just hate duplication of a pattern) | How do I make an object face the mouse position, but only on one plane? I want a game object to turn and look towards where the mouse was clicked (in world space). It works well when considering all 3 dimensions, but I really only care for the plane formed by the x and z axis, not the y. I've tried to fix this by setting the y of the directionTarget to the y of the object's position, but that gives weird results. My code looks like this: Quaternion rotationTarget; float rotationLerpProgress; float rotationLerpDuration = 1f; // Use this for initialization void Start () { } // Update is called once per frame void Update () { if (Input.GetMouseButtonDown (0)) { var ray = Camera.main.ScreenPointToRay (Input.mousePosition); RaycastHit rayHit; if (Physics.Raycast (ray, out rayHit)) { var rayHitPoint = rayHit.point; var rotationDirection = transform.position - rayHitPoint; rotationDirection.Normalize (); //rotationDirection.y = transform.position.y; rotationTarget = Quaternion.LookRotation (rotationDirection); rotationLerpProgress = Time.deltaTime; } //targetPosition = Camera.main.ScreenToWorldPoint (Input.mousePosition); } if (rotationTarget != transform.rotation) { transform.rotation = Quaternion.Lerp (transform.rotation, rotationTarget, rotationLerpProgress / rotationLerpDuration); rotationLerpProgress += Time.deltaTime; if (rotationLerpProgress >= rotationLerpDuration) { transform.rotation = rotationTarget; } } } I must be missing something. How do I do this right? | eng_Latn | 29,120 |
Transform coordinates I have two shapes, one called bike_stations and another one called quite_streets, both contains geographic information about elements in Madrid, Spain and each one come from two different sources. The coordinates in quite_streets are like this: 431376.8435674606589600,4464668.6104000005871058 and the coordinates in bike_stations are like this: -3.7246532000000001,40.3914722000000026 So, if I put them together in QGIS with OSM base map I have the folloging result: Bike_stations appears in the Guinea Gulf and quite_streets at the north of Algeria, when they should be located at the X. Does anyone know any method in QGIS, PostGIS or maybe Java to transform the coordinates in order to be equal to the coordinates of the other shape? | Lat/Long co-ordinates are not plotting onto EPSG:3857 OSM basemap correctly Excuse me if this is a stupid question, but I've honestly tried to find the answer in a few threads here on SE, and I cannot seem to follow the suggested steps correctly. Project is in (WGS 84 / Pseudo-Mercator) BaseMap (OpenStreetMap) layer is set to EPSG:3857 Vector data has been imported from CSV, and has Lat/Long coordinates, set to EPSG:3857. The layer appears right in the middle of the map, around 0,0 obviously. Why is this, what am I doing wrong? | Georeferencing old map in QGIS? I have an old map over a part of Laos and wondering how do I georeference the map with the help of the Grid lines and other Longitude and Latitude information on the scanned map? I have searched for information on the internet and tried many times. My scanned map wont overly correctly on the base map. Any base map would be fine. As long as I can check the result. Do I add X/East and Y/North values from the grid line crossings as marked and written in some parts of the map? How do I begin georeferencing this map? Do I have to add zeros after the two digits? For example if the Easting is 24 (its in KM right? Do I add like three/four zeros to convert it to meters? I georeferenced the map with the option "From Map Canvas" in the georeferencer tool and I succeeded but I want to know the coordinate of a point on my old scanned map and write them as X/East and Y/North to georeference the map. I hope I am not babbling and someone understands what I mean! We can set the CRS to WGS84, EPSG:4326. | eng_Latn | 29,121 |
Graphics not centering I am trying to center graphics in TexShop. This is the code that I am using but the graphics are not getting centering whatever I try to do. \begin{figure}[H] \centering \includegraphics[height = 2.3in]{/Users/LOCATION/Matrix1.png} \caption{Generated Matrix} \end{figure} The code is giving me this image I tried using \begin{center} ... end{center} instead of \centering but nothing seems to be fixing the problem. EDIT 1 : Full Code \documentclass[11pt]{article} \usepackage{graphicx} \usepackage{amsmath} \begin{document} \begin{figure}[H] \centering \caption{Generated Matrix} \includegraphics[height = 2.3in] {/Users/Max/Desktop/Math IA/PhysicsDrawings/Matrix1.png} \end{figure} \end{document} | How to include graphics with spaces in their path? I want to import graphics into my main input file using the macro \includegraphics. It does not work if the filename contains spaces. also discusses this subject, but there is no solution there. My compilation routine is latex->dvips->ps2pdf (because of PSTricks). | Local Coordinate to Geocentric The ultimate question, I need a transformation matrix to take a point in local space representing a roughly 500m x 500m place in New Mexico centered at -108.619987456 long 36.234600064 lat. The final output needs to be in geocentric coordinates and I can during initialization get any number of points on the map such as the corners, center, western most along the center etc. During run time I would like to have a transformation matrix that can be applied to a position in local space expressed in meters and get back an approximate GCC coordinate. The center of the point in GCC: -1645380 -4885138 3752889 The bottom left corner of the map in GCC: -1644552, -4881054, 3749220 During run time I need to be able to multiply (-250, -250, 0) with the transformation matrix and get something close to (-1644552, -4881054, 3749220). I've spent more than a week trying to research a solution for this and so far nothing. Given an identity matrix as our starting position and orientation. Then using geotrans and the known lat, long, and height of the starting position I get an x,y,z geocentric coordinate. A vector from the origin of (0,0,0) gives both the up and translation for the matrix. However, I need the forward and right so that I can pass a distance in meters from the origin into the transformation matrix and get a roughly accurate GCC. Do I have all of the inputs I need to calculate the right and forward? Inputs Origin: 0, 0, 0 Global: -1645380, -4885138, 3752889 Up (Normalized): Global - Origin Desired Outputs Right: ? ? ? Forward: ? ? ? So with the right and forward added to the up and translation I already calculated I would have a transformation matrix. I could then apply the matrix to a vector of say (50, 50, 0) and get something within about 0-3 cm of the output if I feed the lat/long back into geotrans. This matrix would only ever be used small maps of about 500m x 500m so the curvature of the earth is negligible. In reply to whuber, I don't know your level of experience so I will start with the very basics. An identity matrix is a 4x4 matrix that you can multiply by a vector and it returns the vector unchanged. Such as below. 1 0 0 x=0 0 1 0 y=0 0 0 1 z=0 0 0 0 1 The x, y, and z of the matrix are how much to translate the vector by and the first 3 rows and columns are the rotation. Below is a tranformation matrix that does nothing to the orientation of a vector but does translate. 
This is what you would want for two axis aligned worlds in different locations. 1 0 0 13 0 1 0 50 0 0 1 -7 0 0 0 1 If you were to multiply a vector of (10, 10, 10) with the above transformation matrix you would get an output of (23, 60, 3). My problem is the axes are not aligned and the "plane" of the world I am trying to get the local coordinate converted to is projected on the ellipsoid of the earth. Geotrans is library that you can use to pass a coordinate from one system such as geodetic or GDC (lat, long, height) and get back another such as geocentric or GCC (x,y,z). For example: If my game world was representing a map centered exactly on the prime meridian and equator the transformation matrix would look something like below. I am just guesstimating the Y and Z rotations and might have them flipped and/or the wrong sign. 0 0 1 6378137 0 1 0 0 1 0 0 0 0 0 0 1 | eng_Latn | 29,122 |
How to access vertex normals from a shape-key in Python? I am experimenting with WebGL and I have a mesh that is animated via shape keys. I can get access to the regular coordinates of the shape key pretty easily: if bm.verts.layers.shape: skd_json = {} rval['shape_keys'] = skd_json for skn, sk_l in bm.verts.layers.shape.items(): sk_v = [xyz for v in bm.verts for xyz in v[sk_l]] #sk_vn = [xyz for v in bm.verts for xyz in ???] sk_json = { 'vertices':sk_v, #'vertex_normals': sk_vn } skd_json[skn] = sk_json However, the mesh I want to draw is smooth shaded, so I want to get the vertex normals for each of the shape keys so I can do some basic Phong shading even when the model is deformed. How can I access these vertex normals via the python API? | How to get vertex normals from a shape-key in Python? I'm trying to create an exporter for a format, which includes shape key support. This format requires both vertex coordinates and normals to be specified. This should be logical, because the vertex selected on the image below won't have the same normals: Comparing it to this one: But I actually haven't found any way to export shape key normals. Both Bmesh Vertex and ShapeKey Vertex only store the coordinates. Is there any way to get them? | Determining coordinates of a SHP file i have a SHP file of a building converted from a DXF file. This SHP file is unprojected and have no specific coordinates (origin 0,0). I find WGS84 coordinates (lat, long) of this building from Google Maps. My aim is to replace these coordinates to the SHP file on QGIS. Could you explain the process step by step? (I read lots of similar questions, about affine transformation. But i don't know the paramaters.) What i ask is how to make this transformation in QGIS (step by step)? SHP file coordinates (not projected) Scale: 1:142.670.713 X1,Y1: 16.70824, 253.74534 X2,Y2: 1.03521,-6.34918 X3,Y3: 244.00248,-7.49189 Target WGS84 coordinates X1,Y1: 41.047773, 28.896241 X2,Y2: 41.045452, 28.895929 X3,Y3: 41.045436, 28.899039 I need a second vector layer with coordinates. I have just a SHP file of a building converted from a DXF file. That is not projected. I know where this building is, and can find the WGS84 coordinates from Google Maps. My aim is add WGS84 coordinates to this SHP file and export to replace the building on Google Maps. My Questions are: 1) I think my SHP file has measurements in meters and unprojected. I have WGS84 lat long coordinates from Google Maps. How can i project this SHP file, for being ready to transformation? Right Click -> Set Layer CRS -> Choose WGS84 EPSG: 4326. Is this enough? 2) qgsAffine plugin just need transformation matrix paramaters. But the answer of @Jochen Schwarze is a bit complex for me. Is there a online calculator? (Just give the source XYs and target XYs and calculate the parameters.) I think, I have a projection problem before affine transformation. I have a DXF file of a building from AutoCAD. I saved this DXF file as a SHP file in QGIS. But this SHP file is converted from a plain DXF file and is not referenced/projected, it is drawn from coordinates 0,0 in AutoCAD in meters. And the drawing has meters, WGS84 has decimal degrees. When I set Layer CRS as WGS84 in QGIS, the coordinates don't change. How can I project this SHP file as WGS84 in QGIS? | eng_Latn | 29,123 |
Layers not showing up together in ArcGIS even when in same coordinate system I have two layers in my project and both are being set to same coordinate system. But still they both son't show up together. Can someone please help me out in this. Projected Coordinate System: NAD_1983_UTM_Zone_14N Projection: Transverse_Mercator False_Easting: 500000.00000000 False_Northing: 0.00000000 Central_Meridian: -99.00000000 Scale_Factor: 0.99960000 Latitude_Of_Origin: 0.00000000 Linear Unit: Meter Geographic Coordinate System: GCS_North_American_1983 Datum: D_North_American_1983 Prime Meridian: Greenwich Angular Unit: Degree | Layers with same coordinate system should align/overlap in ArcMap but do not? I put 4 different layers into my ArcMap document, but only one of them will show up on the map! I should be seeing the other 3 layers as well. When I do a Zoom to layer, I can see one layer, but I cannot figure out how to make the other 3 show up at once on top of each other. Before I opened ArcMap, I used ArcCatalog where I right-clicked 3/4 of the shapefiles and clicked Properties → Coordinate system → Import and then clicked on my Trees layer, which has the coordinate system I want the other 3 to have. But when I opened ArcMap the 4 layers did not overlap, nor could I move them above or below each other. | Generating a DEM and DSM from correct LiDAR point classification I have downloaded classified LAS data from I am trying to generate accurate DEM and DSM for further analysis. I created a LAS dataset with a projected CRS of NAD_1983_UTM_Zone_19N in meters and the z CRS in meters as well. My question is on which classifications do I choose for the DEM and DSM. Here is the filter options I have in my dataset: According to unh LiDAR data report class 2 is ground. What I have done so far: DEM I chose the class 2 points and used the las dataset to raster. DSM In the predefined settings I chose first return and used the las dataset to raster tool. Is this an accurate way of generating these two rasters? Do I not have to take into account the the unassigned class 1, noise class 7, reserved 11, reserved 17, reserved 18? Additionally (I can ask this as a separate question if it gets requested to), when using the las dataset to raster tool the sampling value is defaulted to 10. I would like to change it to 2 or 3 (would be meters) to match the LiDAR resolution. I also plan on using the generated DEM and DSM for a least cost path analysis and I want them to be higher resolution. Will making the sampling_value 2 or 3 throw off the results or should I leave it at 10? | eng_Latn | 29,124 |
Transform.Rotate() rotation about one axis also rotates about the other axes i'm new to learning unity and infact game development.Below i have setup a scene to learn about unity and i have encountered a strange problem that doesn't comply with anything that unity has documented. I am rotating the camera (which is a child of my capsule game object, The object with Gradients highlighted in the photo, Capsule game object is a child of nothing) according to the user mouse movements.Here is the code i have written(Please ignore the commented lines for now) Vector3 a; rotateby = Input.GetAxis("Mouse X") * MouseSensitivity;// Rotate by is a float variable rotation.Set(0, rotateby, 0); // Rotation is a Vector3 i've declared transform.Rotate(rotation, Space.Self); /* a = transform.localRotation.eulerAngles; a.Set(0, transform.eulerAngles.y, transform.eulerAngles.z); transform.eulerAngles = a; */ rotateby = Input.GetAxis("Mouse Y"); if (!invertY) { rotateby = -rotateby; } rotation.Set(0, 0, rotateby); transform.Rotate(rotation, Space.Self); /* a = transform.localRotation.eulerAngles; a.Set(0, transform.eulerAngles.y, transform.eulerAngles.z); transform.eulerAngles = a; */ As you can see from the code when the user moves the mouse across Horizontal axes , i'm only updating the angle of rotation across the capsule's y-axis( rotation.Set(0, rotateby, 0) ) without updating the other two angles(x and y), which rotates it across its y axis.Similarly when the user does vertical mouse movement i'm updating only the z-angle of the capsule( rotation.Set(0, 0, rotateby) ) without updating the other two angles(x and y) again, so now it rotates across it's z-axis.Now it does what is expected, it is infact rotating across its z and y axes but it is also rotating across the x-axis in some cases.I am monitoring the values of the three angles(x y z) from the inspector.Look at how the capsule is titled towards the right , across its x-axis and also from the inspector below(look at the x-angle = -35) Now the commented lines of code are a solution to this problem of rotating around the x-axis(as when the x-angle changes i'm again setting it to 0).But what i need to know is what have i done wrong in my code i'm never modifying the x-angle then why is it changing to -35??.I have read about local vs GlobalSpaces again and again but i can't find the answer to this.Also what i think is that it's not a Gimbal Lock because as the parenting order that unity uses is Z-->X-->Y.So Gimbal lock could only occur if i rotate around the X-axis(Which i am not doing in my code) thereby aligning the Y-axis with the Z-axis.Why is this behaviour occuring, why is it rotating around its x-axis .Please Correct me if i am wrong, this is a barrier to my learning process cause i can't continue without clearing my concepts. Thankyou, much obliged ! UPDATE: I have trieed changing the 2nd parameter in transform.Rotate() method to Space.World , the problem of rotation about x-axis still happens but in some different mouse movement order. | Why is the camera tilting around the z axis when I only specified x and y? My goal is to program a camera to point towards the mouse cursor. I attached the following script to the Main Camera. 
using UnityEngine; using System.Collections; public class CameraController : MonoBehaviour { public float sensitivity; // Update is called once per frame void Update () { transform.Rotate (Input.GetAxis ("Mouse Y") * -1 * sensitivity, Input.GetAxis ("Mouse X") * sensitivity, 0); } } I "told" the camera to rotate around the x and y axes. Whenever it does this, the camera tilts sideways. When I checked the rotation of Main Camera, I saw that the z rotation had changed. Why is it rotating around the z axis when I left it at 0? | Calculating centroids of polygon using ArcPy? I am trying to calculate the centroids of a polygon and add the x- and y-coordinate to their respective fields: arcpy.env.workspace = r"C:/..." input_fc = "shapefiles/precincts_v.shp" arcpy.AddField_management(input_fc, "X", "DOUBLE") arcpy.AddField_management(input_fc, "Y", "DOUBLE") arcpy.CalculateField_management(input_fc, "X", "!SHAPE.CENTROID@DECIMALDEGREES!.split()[0]", "PYTHON") arcpy.CalculateField_management(input_fc, "Y", "!SHAPE.CENTROID@DECIMALDEGREES!.split()[1]", "PYTHON") but I an invalid syntax error. Where is the mistake? Edit: exact error message Traceback (most recent call last): File "C:\Users\mquentel\Dropbox\PLSS\CA voting data\Python\CONSTRUCT_merge_voting_and_census.py", line 17, in <module> arcpy.CalculateField_management(input_fc, "X", "!SHAPE.CENTROID@DECIMALDEGREES!.split()[0]", "PYTHON") File "C:\Program Files (x86)\ArcGIS\Desktop10.4\ArcPy\arcpy\management.py", line 3360, in CalculateField raise e ExecuteError: ERROR 000539: SyntaxError: invalid syntax (<expression>, line 1) Failed to execute (CalculateField). | eng_Latn | 29,125 |
Why when I change a drop shadow do all my drop shadows change? It's always been a mystery for me while creating drop shadow effects that one layer changes the angle of its shadow automatically if I change the angle of the other. For example: in Layer 1 I am setting the drop shadow angle as 0. In Layer 2 I want to set its angle to 90 degrees. As soon as I do this, the first layer changes its angle to 90 degrees. Why is it dependent and how do I remove this link? | Layers with different directional drop shadows in Photoshop I am wondering how to have 2 different directional drop shadows in the one file with Photoshop. e.g. 2 different type layers — one has a drop shadow at 90 degrees, the other at -45 degrees. Currently it seems Photoshop only lets me have 1 universal direction for the whole file, which I find really pointless. I know how to apply different drop shadows and merge down layers to achieve the same thing, but its tedious and not exactly streamlined for making future changes to that layer. How is this achieved? | Mental estimate for tangent of an angle (from $0$ to $90$ degrees) Does anyone know of a way to estimate the tangent of an angle in their head? Accuracy is not critically important, but within $5%$ percent would probably be good, 10% may be acceptable. I can estimate sines and cosines quite well, but I consider division of/by arbitrary values to be too complex for this task. Multiplication of a few values is generally acceptable, and addition and subtraction are fine. My angles are in degrees, and I prefer not have to mentally convert to radians, though I can if necessary. Also, all angles I'm concerned with are in the range of [0, 90 degrees]. I am also interested in estimating arc tangent under the same conditions, to within about 5-degrees would be good. Backstory I'm working on estimating the path of the sun across the sky. I can estimate the declination pretty easily, but now I want to estimate the amount of daylight on any given day and latitude. I've got it down to the arc cosine of the product of two tangents, but resolving the two tangents is now my sticking point. I also want to calculate the altitude of the sun for any time of day, day of the year, and latitude, which I have down to just an arc tangent. | eng_Latn | 29,126 |
Is there a way to interactively rotate labels in QGIS using any of the edit tools? I seem to have put too much detailed information in my previous question, confusing the masses. My apologies. Let me try again from a different angle. I want to rotate labels interactively/visually, using edit tools. I don't want to use the Data defined Rotation method, found under Layer Properties > Placement. Is there a way to interactively rotate labels in QGIS using any of the edit tools? While Data defined Rotation works well, it is tedious to set the text rotations for a feature, having to manually enter and "guesstimate" the text angle value into the attribute fields. Especially for point and polygon features. | What are valid values for "data defined" labeling settings? I'm using QGIS 1.8. I'd like to customize label features on an existing shapefile using the new label engine and it's "data defined settings" option. I'd like to know what the attribute table field types should be for each of the settings: All 7 font options, all 2 buffer options, and all 6 position options. Should they all be numeric fields, or some text, or both? I want to create these fields in the attribute table correctly so that I can map them to the label field settings. I can't find information on the proper field types anywhere. Thanks for your thoughts. | Convert quaternion to a different coordinate system OSVR has a right-handed system: x is right, y is up, z is near. I need to convert orientation data from a sensor in OSVR to a different right-handed system in which x is forward, y is left, and z is up. I need to calculate the transformation that will convert a quaternion from one system to the other. I naively tried: void osvrPoseCallback(const geometry_msgs::PoseStampedConstPtr &msg) { // osvr to ros geometry_msgs::PoseStamped outputMsg; outputMsg.header = msg->header; outputMsg.pose.orientation.x = msg->pose.orientation.y; outputMsg.pose.orientation.y = msg->pose.orientation.z; outputMsg.pose.orientation.z = msg->pose.orientation.x; outputMsg.pose.orientation.w = msg->pose.orientation.w; osvrPosePub.publish(outputMsg); ros::spinOnce(); } But that made things really weird. Facing north pitch is pitch, facing west, pitch is yaw, facing south, pitch is negative pitch... How can I convert my OSVR quaternion to the a corresponding quaternion in the new coordinate system? | eng_Latn | 29,127 |
Points disappeared after changing CRS I'm having a problem with CRS and layers in QGIS & on Geoserver What I'm trying to accomplish to change CRS of layers in importing. I'm importing 'test.geojson' file in QGIS, and when I open points it says that there is WGS84 CRS set, & I'm trying it to use projected CRS since I need meters in EPSG:4326 But as soon as I change CRS with right-click SetCRS/SetLayerCRS and change it to WGS 84/ UTM zone 31N all the points disappear. Does someone know what could be the problem? | How to enable projection transformation in QGIS My QGIS project has two layers, one layer is retrieved by Google layer plugin which fetches the static Google map image whose CRS is +proj=merc +lon_0=0 +lat_ts=0 +x_0=0 +y_0=0 +a=6378137 +b=6378137 +units=m +no_defs and another vector layer is a postgis layer whose CRS is +proj=longlat +ellps=WGS84 +datum=WGS84 +no_defs. I digitalized some polygon features from Google Layers and added it to the vector layer. However, QGIS seems do not make the automatic CRS transformation. How can I change its projection? Thanks | Georeferencing old map in QGIS? I have an old map over a part of Laos and wondering how do I georeference the map with the help of the Grid lines and other Longitude and Latitude information on the scanned map? I have searched for information on the internet and tried many times. My scanned map wont overly correctly on the base map. Any base map would be fine. As long as I can check the result. Do I add X/East and Y/North values from the grid line crossings as marked and written in some parts of the map? How do I begin georeferencing this map? Do I have to add zeros after the two digits? For example if the Easting is 24 (its in KM right? Do I add like three/four zeros to convert it to meters? I georeferenced the map with the option "From Map Canvas" in the georeferencer tool and I succeeded but I want to know the coordinate of a point on my old scanned map and write them as X/East and Y/North to georeference the map. I hope I am not babbling and someone understands what I mean! We can set the CRS to WGS84, EPSG:4326. | eng_Latn | 29,128 |
How to get posebone global rotate A bone was rotated 90 degrees on the Y axis. I simply want to get (0, 90, 0), but in Blender it is represented by Euler angles (86.4, 44.9, 66.3). How do I get rotation on the global axis In Python code? import bpy obj = bpy.data.objects["Armature"] for pbone in obj.pose.bones: rotate_x = pbone.rotation_euler.x ##1.5072393417358398 (Euler) rotate_y = pbone.rotation_euler.y ##0.7828118801116943 (Euler) rotate_z = pbone.rotation_euler.z ##1.1571569442749023 (Euler) ''' rotate_x _y _z are all euler angles of Bone. But I want to get these as below. It will be rotate_x = 0 rotate_y = 1.5708(or 90) rotate_z = 0 because rotate 90 degrees on Axis Y. ''' | How to get world-space matrix of any pose bone? Is there a command for getting world-space matrix of pose bones, same as obj.matrix_world for objects? I found pose_bone.matrix, but it gives weird result for me. | Rotating a gyroscope's output values into an earth-relative reference frame For whatever reason, I seem to have gotten myself twisted into quite a confusion on how to process data from my 3 axis gyro and 3 axis accelerometer to get gyro rotation values related to the earth's reference frame. i.e. Given my accelerometer measures gravity (we assume it's stable) along the sensors (x,y,z) reference frame, how do I adjust the values of my gyro, measured along the sensors same (x,y,z) reference frame such that I am actually measuring the rotating about the Earth's frame x (North), y (West), and z (downwards) direction? I just can't seem to wrap my head around the use of the rotation matrix. Do I simply employ a 3d rotation matrix, and multiply the vector, \$[gyro_x, gyro_y, gyro_z]^T\$, by a rotation matrix, R? I then assume the rotation angles, are somehow derived from the filtered accelerometer outputs, but I'm having trouble deciding on what those should be. I have yet to find a concise reference on how to move the rotation and acceleration values from the sensor reference frame into the earth global reference frame. Perhaps someone can help? Thanks | eng_Latn | 29,129 |
The scale in Qgis is not setting to feet I am working in QGIS 2.4. I have changed the CRS of all the layers and in the project properties to EPSG:102739 - NAD_1983_StatePlane_Texas_Central_FIPS_4203_Feet. I am in print composer trying to create a scale bar and I can not figure out why the scale is so off. I am trying to go from feet to miles and it just creates a really long scale bar. Basically 1 mi = 5280 is what I am going for I have already checked each layer and they are all in state plane 4203. I am not sure what I am missing. I checked other threads about scale in QGIS and I am still at a loss. I have tried flipping the conversion around and its still not working. Edit 2: Sounds crazy is there a difference between Qgis for Windows and for Linux. I loaded the exact same data(just copied and pasted the file with the map and data) to a windows computer. Opened it and everything works fine. Edit: The answer given does not address my issues. I have already done everything in that answer. All of my layers have been resaved with projection and the project projection has been changed. | How to get correct scale values? The scale of my map is wrong, no matter what coordinate system I use in my project. The map and all objects show the correct coordinates but the scale of the map shows nonsense and the measurement tool gives wrong dimension. The map should be like 1:10000 but qgis shows 1:1. Anyone an idea? | Converting from DMS to DD using Python in Field Calculator? I need to convert lat/long that is expressed as degrees, minutes, and seconds in the data into decimal degrees. For example, in the data, they are listed as N335042.06 in the Latitude column, and W86031.04 in the Longitude column. I have done this problem before where I created a script which converted DMS to DD, and vice-versa, so I guess I could use bits from that. But the problem I'm having is how to ignore (for lack of a better word) the 'N' and 'W' in the data? Can I skip them? And DMS are listed all together without any symbols or spaces. Can I use len(), range(), split() to specify what part to read from the value? For example, can do the following? N335042.06 where, 33 = degrees 50 = minutes 42.06 = seconds ...? I came across ESRI article, but it's in VB. Will probably use it as a reference but some of the terminology/syntax is different from Python. Final code that works! # Pre-logic def latDD(x): D = int(x[1:3]) M = int(x[3:5]) S = float(x[5:]) DD = D + float(M)/60 + float(S)/3600 return DD # Expression latDD(!Latitude!) | eng_Latn | 29,130 |
If we have a uniform prior probability distribution across all ranges of $ \theta $, does maximum a posteriori become maximum likelihood estimation? While I am studying maximum a posteriori, somehow it just came across my mind that if I have a uniform prior probability distribution, MAP looks like MLE, am I right? | How does a uniform prior lead to the same estimates from maximum likelihood and mode of posterior? I am studying different point estimate methods and read that when using MAP vs ML estimates, when we use a "uniform prior", the estimates are identical. Can somebody explain what a "uniform" prior is and give some (simple) examples of when the MAP and ML estimators would be the same? | Calculating extent of custom tiles? Problem The picture is just an example of my case, but I am left with blank space on the left when I load all the tiles and I get error message in the console: Failed to load resource: the server responded with a status of 404 (Not Found) http://path/to/the/tile//base/8/6/53.png etc. I could guess and limit the extent of the layer, but I would really like to know is there any way to calculate it and also be able to get rid of the error message. What I know about the tiles? Originally they are served as TMS They represent png images 256*256 They are in projection: EPSG:3857 In OpenLayers2 they were called like TMS layer In OpenLayers3 (v.3.8.2) they are called like XYZ layer (that includes -y): var baseLayer = new ol.layer.Tile({ source: new ol.source.XYZ({ url: 'http://path/to/the/tiles/base/{z}/{x}/{-y}.png' }) }); Thoughts and questions How can I calculate extent of the tiles? Or in another words how can I find out zmin zmax xmin ymin xmax ymax or whole range of x/y? What does x/y really represent? Are they coordinates of the left bottom corner or center of the tile? In which direction does x/y increase? How can I figure out x/y of at least one of the tile, for example left bottom one? (I have modified the question for better understanding of the problem) | eng_Latn | 29,131 |
ArcGIS 9 doesn't recognize ArcGIS 10 .lyr files I symbolized a parcel shapefile in ArcGIS 10 with about 2 dozen different colors. I need to use this in ArcGIS 9.3 as well, so I saved it as a .lyr file. This works fine in 10. But when I go to 9.3, I can add the shapefile, but not the .lyr file, or import it for symbolization (geometry doesn't match error). Is there a way to make a .lyr file in 10 so that 9 will recognize it? The .lyr file comes from the same shapefile, so I don't know why it wouldn't recognize it. I want to use this symbolization for a half dozen shapefiles a/o feature classes, so really don't want to manually recreate it. Thanks. | Are layer files compatible between ArcGIS 10 and 9.3? If I save layer files in ArcGIS 10, can they be opened and modified in ArcGIS 9.3? | Creating Ellipse in WGS84, but with metric parameters I have the function --Ellipse(x,y,rx,ry,rotation,#of segments in 1/4 of ellipse) SELECT ST_Translate( ST_Rotate( ST_Scale( ST_Buffer(ST_Point(0,0), 0.5, $6), $3, $4), $5), $1, $2) I have points in WGS84 (x,y), radiuses in meters (rx, ry), rotation angle in degrees. And I need a new WGS84 geometry as a result. As you can see I can't call this function without conversions between WGS84 and some metric SRID. But my points aren't in a definite region. So a SRID which is optimal for one point, throws an error for another point during conversion. Is there a common way to get an optimal metric projection (SRID) for a WGS84 point? Or at least to use some rough, but world-wide universal metric projection? Or maybe there's a trick for this case? | eng_Latn | 29,132 |
Should I use OLS of fixed effects for this model? I'm trying to estimate this model on immigration levels I found in a paper and I'm not sure if they used OLS or fixed effects, it says that its a simple OLS but then mentions adding the country fixed effects. can someone help? | Should I use OLS of fixed effects for this model for estimating level of immigrants? I'm trying to estimate this model on immigration levels I found in a paper and I'm not sure if they used OLS or fixed effects, it says that its a simple OLS but then mentions adding the country fixed effects. can someone help? | Calculating extent of custom tiles? Problem The picture is just an example of my case, but I am left with blank space on the left when I load all the tiles and I get error message in the console: Failed to load resource: the server responded with a status of 404 (Not Found) http://path/to/the/tile//base/8/6/53.png etc. I could guess and limit the extent of the layer, but I would really like to know is there any way to calculate it and also be able to get rid of the error message. What I know about the tiles? Originally they are served as TMS They represent png images 256*256 They are in projection: EPSG:3857 In OpenLayers2 they were called like TMS layer In OpenLayers3 (v.3.8.2) they are called like XYZ layer (that includes -y): var baseLayer = new ol.layer.Tile({ source: new ol.source.XYZ({ url: 'http://path/to/the/tiles/base/{z}/{x}/{-y}.png' }) }); Thoughts and questions How can I calculate extent of the tiles? Or in another words how can I find out zmin zmax xmin ymin xmax ymax or whole range of x/y? What does x/y really represent? Are they coordinates of the left bottom corner or center of the tile? In which direction does x/y increase? How can I figure out x/y of at least one of the tile, for example left bottom one? (I have modified the question for better understanding of the problem) | eng_Latn | 29,133 |
Measure length of line to any point along line I am trying to create linear referencing to point. I have a line to which I have snapped to points to (without breaking it). Now I want to find the distance from the start point of my lines to points snapped to it, this I struggle with. Point 1. 45m from start of line Point 2. 126 from start of line etc. Examples I have seen seem to focus on calculate distance between vertices but I just want to calculate length from start of line to point, for every point. Does someone have an idea how this could be done? | FME - Calculate distance of point along a line I have created an FME workspace which inputs a pipeline as well as other linear features which cross it (waterways, roads, railways, fences, other pipelines, etc). The output is a point file which contains the crossing type. What I would like to do is calculate the distance along the pipe from the start of the pipeline for each intersection. So, as you travel along the pipeline, the distance attribute will increase. I'm thinking that I'll need to use some of the linear referencing tools like MeasureGenerator, MeasureExtractor, or something like that, but I'm not sure how to do so. Does anyone have an idea of how to do this? | How can I rotate an object based on another's offset to it? I have a 3D model of a turret that con rotate around the Y-axis. This turret has a cannon that is significantly off the center of the object. I want the cannon, not the turret, to aim at a specified target. I can only rotate the turret, however, and thus I don't know what equation I need to apply in order to accomplish by objective. The following image illustrates my problem: If I have the turret "LookAt()" the target, a laser originating from the cannon will completely miss said target. If this were a completely top-down scenario, and the cannon were exactly parallel to the turret, then my logic tells me that the fake target should be located at a position that's equal to the actual target plus an offset that's equal to that between the turret and the cannon. However, in my actual scenario, my camera is angled 60º, and the cannon has a slight rotation. The following image illustrates the scenario: I'm not sure exactly why, but if I apply that same offset, it only seems to work while aiming at certain distances from the turret. Is my logic flawed? Am I missing something fundamental here? Final Edit: the solution provided by @JohnHamilton latest update solves this problem with perfect precision. I have now removed the code and images that I used to illustrate my incorrect implementations. | eng_Latn | 29,134 |
Is it possible to render an equirectangular image in Blender Internal? Hear me out. I know there's a "panoramic" mode for cameras now. Thing is, it only works with Cycles. Cycles does not support some things, like particles (halo, line). I need particles, I also need to render an equirectangular render of the scene. What can I do? Even some mulitple camera setup + external tool would work. Note that I need animations though, an image editor won't do. | equirectangular rendering for Blender Internal? Can such frames/videos be rendered in Blender Internal as well? If not, how indirectly? EDIT: The below solution is great as long as you can write your own Python script and only render solid objects. When you need things like particles/hair, it won't work as those won't be reflected. | How can I rotate about an arbitrary point in 3D (instead of the origin)? I have some models that I want to rotate using quaternions in the normal manner, except instead of rotation about the origin, I want it to be offset slightly. I know that you don't say, in 3d space, that you rotate about a point; you say you rotate about an axis. So I'm visualizing it as rotating about a vector whose tail is positioned not at the local origin. All affine transformations in my rendering / physics engine are stored using SQT (scale, quaternion, translation; an idea borrowed from the book Game Engine Architecture.) So I build a matrix each frame from these components and pass it to the vertex shader. In this system, translation is applied, then scale, then rotation. In one specific case, I need to translate an object in world space, scale it, and rotate it about a vertex not centered at the object's local origin. Question: Given the constraints of my current system described above, how can I achieve a local rotation centered about a point other than the origin? Automatic upvote to anyone who can describe how to do this using only matrices as well :) | eng_Latn | 29,135 |
Slope analysis in QGIS I have a 1/3 arc-second DEM downloaded from USGS that I am trying to use to create a slope analysis layer. The DEM is in ESPG 4269 to start and I've re-projected it into NAD83 UTM Zone 11N (the DEM is for eastern Oregon) so the vertical and horizontal units are in meters. The output looks correct except all of the lines through it (see screenshot). Maybe I went wrong in how I projected it? | Why do "stripes" appear on raster? I am reprojecting a SRTM 3 arc-second raster file from WGS84 geographic coordinate system, to projected coordinate system which is using Azimuthal Equidistant projection: gdalwarp -s_srs EPSG:4326 -t_srs "+proj=merc +lat_ts=40.81266 +lon_0=14.414252" -r near -of GTiff C:/vesuvius2_wgs84.tif C:/vesuvius2_aeqd.tif I attached the initial raster file in here (). The issue is that if I use the Nearest neighbor resampling method (-r near), some sort of "stripes" appear on terrain once I reproject the file to Azimuthal Equidistant projection. Here is the how the reprojected raster looks like: And here is its 3d representation: Interestingly similar "stripes" appear in both direction if I reproject the initial raster to UTM: gdalwarp -s_srs EPSG:4326 -t_srs EPSG:32633 -r near -of GTiff C:/vesuvius2_wgs84.tif C:/vesuvius2_epsg32633.tif The "stripes" go away when I use Bilinear or Cubic resampling methods (-r bilinear, -r cubic) instead of the Nearest Neighbor one. Why do these "stripes" appear when Nearest Neighbor resampling method is used? | Local Coordinate to Geocentric The ultimate question, I need a transformation matrix to take a point in local space representing a roughly 500m x 500m place in New Mexico centered at -108.619987456 long 36.234600064 lat. The final output needs to be in geocentric coordinates and I can during initialization get any number of points on the map such as the corners, center, western most along the center etc. During run time I would like to have a transformation matrix that can be applied to a position in local space expressed in meters and get back an approximate GCC coordinate. The center of the point in GCC: -1645380 -4885138 3752889 The bottom left corner of the map in GCC: -1644552, -4881054, 3749220 During run time I need to be able to multiply (-250, -250, 0) with the transformation matrix and get something close to (-1644552, -4881054, 3749220). I've spent more than a week trying to research a solution for this and so far nothing. Given an identity matrix as our starting position and orientation. Then using geotrans and the known lat, long, and height of the starting position I get an x,y,z geocentric coordinate. A vector from the origin of (0,0,0) gives both the up and translation for the matrix. However, I need the forward and right so that I can pass a distance in meters from the origin into the transformation matrix and get a roughly accurate GCC. Do I have all of the inputs I need to calculate the right and forward? Inputs Origin: 0, 0, 0 Global: -1645380, -4885138, 3752889 Up (Normalized): Global - Origin Desired Outputs Right: ? ? ? Forward: ? ? ? So with the right and forward added to the up and translation I already calculated I would have a transformation matrix. I could then apply the matrix to a vector of say (50, 50, 0) and get something within about 0-3 cm of the output if I feed the lat/long back into geotrans. This matrix would only ever be used small maps of about 500m x 500m so the curvature of the earth is negligible. 
In reply to whuber, I don't know your level of experience so I will start with the very basics. An identity matrix is a 4x4 matrix that you can multiply by a vector and it returns the vector unchanged. Such as below. 1 0 0 x=0 0 1 0 y=0 0 0 1 z=0 0 0 0 1 The x, y, and z of the matrix are how much to translate the vector by and the first 3 rows and columns are the rotation. Below is a tranformation matrix that does nothing to the orientation of a vector but does translate. This is what you would want for two axis aligned worlds in different locations. 1 0 0 13 0 1 0 50 0 0 1 -7 0 0 0 1 If you were to multiply a vector of (10, 10, 10) with the above transformation matrix you would get an output of (23, 60, 3). My problem is the axes are not aligned and the "plane" of the world I am trying to get the local coordinate converted to is projected on the ellipsoid of the earth. Geotrans is library that you can use to pass a coordinate from one system such as geodetic or GDC (lat, long, height) and get back another such as geocentric or GCC (x,y,z). For example: If my game world was representing a map centered exactly on the prime meridian and equator the transformation matrix would look something like below. I am just guesstimating the Y and Z rotations and might have them flipped and/or the wrong sign. 0 0 1 6378137 0 1 0 0 1 0 0 0 0 0 0 1 | eng_Latn | 29,136 |
Unity - How can I make my player's gun point at the cursor, not the player itself? I'm making a 2D top-down shooter in unity. Currently, I'm using the code below to point my player towards my cursor at all times, this works but isn't exactly what I want. //Find the mouse position on the camera view Vector3 mousePoint = theCamera.ScreenToWorldPoint(Input.mousePosition); //Find how the mouse relates to the object's position Vector3 difference = mousePoint - transform.position; difference.Normalize(); //Find wanted angle of rotation float rotZ = Mathf.Atan2(difference.y, difference.x) * Mathf.Rad2Deg; Quaternion newRotation = Quaternion.Euler(new Vector3(0.0f, 0.0f, rotZ + adjustmentAngle)); //Apply wanted angle of rotation transform.rotation = Quaternion.Lerp(transform.rotation, newRotation, Time.deltaTime * smoothing); While this works, my player's gun isn't in the centre of my player so when the gun fires it actually skims past the cursor missing slightly. How can I get my player to rotate so his gun is what points towards the cursor rather than himself? I have a child GameObject at the end of the barrel of his gun used to Raycast and stuff which I figure can be used? | How can I rotate an object based on another's offset to it? I have a 3D model of a turret that con rotate around the Y-axis. This turret has a cannon that is significantly off the center of the object. I want the cannon, not the turret, to aim at a specified target. I can only rotate the turret, however, and thus I don't know what equation I need to apply in order to accomplish by objective. The following image illustrates my problem: If I have the turret "LookAt()" the target, a laser originating from the cannon will completely miss said target. If this were a completely top-down scenario, and the cannon were exactly parallel to the turret, then my logic tells me that the fake target should be located at a position that's equal to the actual target plus an offset that's equal to that between the turret and the cannon. However, in my actual scenario, my camera is angled 60º, and the cannon has a slight rotation. The following image illustrates the scenario: I'm not sure exactly why, but if I apply that same offset, it only seems to work while aiming at certain distances from the turret. Is my logic flawed? Am I missing something fundamental here? Final Edit: the solution provided by @JohnHamilton latest update solves this problem with perfect precision. I have now removed the code and images that I used to illustrate my incorrect implementations. | Defining coordinate reference system with rotation in GeoServer? I am using GeoServer and have a layer in EPSG:900913 ("Google Mercator"). I need to "rotate" the map around certain point (say, 1500000, 7000000) by certain degree (say, 30 degrees clockwise). How could I define such a coordinate system based on EPSG:900913? GeoServer's does not work for my purposes as I need to tile the map later on. As far as I understand this, my only option is to define an own coordinate system. For GeoServer I'd need to . The configuration seems to be straightforward, but I have a difficulty defining my rotated CRS in . 
I am wondering how to apply a rotation around certain point onto a CRS like Google Mercator: PROJCS["WGS84 / Google Mercator", GEOGCS["WGS 84", DATUM["World Geodetic System 1984", SPHEROID["WGS 84", 6378137.0, 298.257223563, AUTHORITY["EPSG","7030"]], AUTHORITY["EPSG","6326"]], PRIMEM["Greenwich", 0.0, AUTHORITY["EPSG","8901"]], UNIT["degree", 0.017453292519943295], AXIS["Longitude", EAST], AXIS["Latitude", NORTH], AUTHORITY["EPSG","4326"]], PROJECTION["Mercator_1SP"], PARAMETER["semi_minor", 6378137.0], PARAMETER["latitude_of_origin", 0.0], PARAMETER["central_meridian", 0.0], PARAMETER["scale_factor", 1.0], PARAMETER["false_easting", 0.0], PARAMETER["false_northing", 0.0], UNIT["m", 1.0], AXIS["x", EAST], AXIS["y", NORTH], AUTHORITY["EPSG","900913"]] My questions, specifically: How to write a WKT which transform an existing CRS? My guess would be that I need a new PROJCS wrapping an existing one and adding a PROJECTION clause. How would I found out the projection id (like Mercator_1SP above) and the required parameters (the PARAMETER clauses)? Can I "reference" EPSG:900913 in CRS WKT instead of copy-pasting the whole PROJCS clause? | eng_Latn | 29,137 |
Projecting SHP file in QGIS I have a DXF file of a building from AutoCAD. I saved this DXF file as a SHP file in QGIS. But this SHP file is converted from a plain DXF file and is not referenced/projected, it is drawn from coordinates 0,0 in AutoCAD in meters. And the drawing has meters, WGS84 has decimal degrees. When I set Layer CRS as WGS84 in QGIS, the coordinates don't change. How can I project this SHP file as WGS84 in QGIS? | Determining coordinates of a SHP file i have a SHP file of a building converted from a DXF file. This SHP file is unprojected and have no specific coordinates (origin 0,0). I find WGS84 coordinates (lat, long) of this building from Google Maps. My aim is to replace these coordinates to the SHP file on QGIS. Could you explain the process step by step? (I read lots of similar questions, about affine transformation. But i don't know the paramaters.) What i ask is how to make this transformation in QGIS (step by step)? SHP file coordinates (not projected) Scale: 1:142.670.713 X1,Y1: 16.70824, 253.74534 X2,Y2: 1.03521,-6.34918 X3,Y3: 244.00248,-7.49189 Target WGS84 coordinates X1,Y1: 41.047773, 28.896241 X2,Y2: 41.045452, 28.895929 X3,Y3: 41.045436, 28.899039 I need a second vector layer with coordinates. I have just a SHP file of a building converted from a DXF file. That is not projected. I know where this building is, and can find the WGS84 coordinates from Google Maps. My aim is add WGS84 coordinates to this SHP file and export to replace the building on Google Maps. My Questions are: 1) I think my SHP file has measurements in meters and unprojected. I have WGS84 lat long coordinates from Google Maps. How can i project this SHP file, for being ready to transformation? Right Click -> Set Layer CRS -> Choose WGS84 EPSG: 4326. Is this enough? 2) qgsAffine plugin just need transformation matrix paramaters. But the answer of @Jochen Schwarze is a bit complex for me. Is there a online calculator? (Just give the source XYs and target XYs and calculate the parameters.) I think, I have a projection problem before affine transformation. I have a DXF file of a building from AutoCAD. I saved this DXF file as a SHP file in QGIS. But this SHP file is converted from a plain DXF file and is not referenced/projected, it is drawn from coordinates 0,0 in AutoCAD in meters. And the drawing has meters, WGS84 has decimal degrees. When I set Layer CRS as WGS84 in QGIS, the coordinates don't change. How can I project this SHP file as WGS84 in QGIS? | How to find and delete identical points? I recently performed on a merged point feature class in ArcGIS 10.1. I specified "1 meter" XY tolerance for the Integrate command. The resulting point feature appears to be what I want except points that were integrated are now stacked on top of each other, which will be problematic in future processing steps. The attributes for my dataset consist of an OID, Shape, x, y, z. Where x and y = UTM coordinates and z = buffer distance. The command does not appear to work for my dataset (i.e based on Fields: OID, Shape, x, y, z and XY tolerance = 1m) as the output still has stacked points. I am also working with big data--the integrate output has 37 million points. There is a related thread, titled , although I would like to open the question up to python solutions too. What is the best way to find and delete (spatially) duplicate points? Is there a better solution to integrating such that the output does not contain overlapping points? | eng_Latn | 29,138 |
What is the radius of earth What is the real radius of earth? Google says 6378.1 km Wiki says 6371 km And there are references for 6378.7 km I'm using maximum radius 6378.7 km in distance calculation in spherical law of cosines formula. What would be the impact of using maximum radius rather than average radius which is 6371 km. When does using one over the other make sense? | How accurate is approximating the Earth as a sphere? What level of error do I encounter when approximating the earth as a sphere? Specifically, when dealing with the location of points and, for example, the great circle distances between them. Are there any studies on the average and worst case error compared to an ellipsoid? I'm wondering how much accuracy I'd be sacrificing if I go with a sphere for the sake of easier calculations. My particular scenario involves directly mapping WGS84 coordinates as if they were coordinates on a perfect sphere (with the defined by the IUGG) without any transformation. | What's optimal hardness for a kitchen knife blade? Many knife manufacturers provide the hardness value for the knife blade in the specifications - like 53HRC or 57 HRC. What's the optimal hardness for kitchen knives? Do I always prefer the ones with higher hardness all else being equal? | eng_Latn | 29,139 |
"Home" is not translated on localized sites This reads "Home" on esSO, ptSO, ruSO and rus. Strangely enough, the text on jaSO is in Japanese and it's the only site with translation. | "Stack Overflow" showing up on hamburger menu for all sites to link to /questions Why does it say "Stack Overflow" instead of the site name? | Convert quaternion to a different coordinate system OSVR has a right-handed system: x is right, y is up, z is near. I need to convert orientation data from a sensor in OSVR to a different right-handed system in which x is forward, y is left, and z is up. I need to calculate the transformation that will convert a quaternion from one system to the other. I naively tried: void osvrPoseCallback(const geometry_msgs::PoseStampedConstPtr &msg) { // osvr to ros geometry_msgs::PoseStamped outputMsg; outputMsg.header = msg->header; outputMsg.pose.orientation.x = msg->pose.orientation.y; outputMsg.pose.orientation.y = msg->pose.orientation.z; outputMsg.pose.orientation.z = msg->pose.orientation.x; outputMsg.pose.orientation.w = msg->pose.orientation.w; osvrPosePub.publish(outputMsg); ros::spinOnce(); } But that made things really weird. Facing north pitch is pitch, facing west, pitch is yaw, facing south, pitch is negative pitch... How can I convert my OSVR quaternion to the a corresponding quaternion in the new coordinate system? | eng_Latn | 29,140 |
how to snap several vertices to the same z.position I encountered a little problem. Say I have a mesh an the Vertices on top are not in the same Z-Position. I am looking for a method to have them all aligned in Z. My approach is to select the 3 Vertices, grab the blue arrow to constrain the movement in the Z-Axis and want to snap to the vertex highlighted with the red circle. My Snappingsettings But for some reason they don´t have the same z-Position. They seem to keep their Offsets to each other? I used this method in Maya. Do you know a way how to get that working? thanks in Advance Guido. | How to align a cluster of points? If you have a cluster of points selected like this: how do you align the points so that they create a flat plane? | How can I rotate an object based on another's offset to it? I have a 3D model of a turret that con rotate around the Y-axis. This turret has a cannon that is significantly off the center of the object. I want the cannon, not the turret, to aim at a specified target. I can only rotate the turret, however, and thus I don't know what equation I need to apply in order to accomplish by objective. The following image illustrates my problem: If I have the turret "LookAt()" the target, a laser originating from the cannon will completely miss said target. If this were a completely top-down scenario, and the cannon were exactly parallel to the turret, then my logic tells me that the fake target should be located at a position that's equal to the actual target plus an offset that's equal to that between the turret and the cannon. However, in my actual scenario, my camera is angled 60º, and the cannon has a slight rotation. The following image illustrates the scenario: I'm not sure exactly why, but if I apply that same offset, it only seems to work while aiming at certain distances from the turret. Is my logic flawed? Am I missing something fundamental here? Final Edit: the solution provided by @JohnHamilton latest update solves this problem with perfect precision. I have now removed the code and images that I used to illustrate my incorrect implementations. | eng_Latn | 29,141 |
Ortbit camera rotation multiplication order with quaternions? So I switched my camera to use quaternions and the first thing I noticed is that things started to 'roll' even though I'm not using any roll values. I looked it up and I found that people suggest that using rotation = horizontal * rotation * vertical solves the issues instead of rotation = rotation * horizontal * vertical which I thought was the more intuitive thing from doing matrix multiplication. The explanations I read as to why one works and the other doesn't were pretty vague and unclear. My question is: why does it work this way? what's wrong with 'rotation * h * v`? Sample code: float Horizontal = 0; float Vertical = 0; if (Keys['W'] == Key_Held) { Camera->Position += Camera->Forward() * CamMovSpeed * DeltaTime; } if (Keys['S'] == Key_Held) { Camera->Position -= Camera->Forward() * CamMovSpeed * DeltaTime; } if (Keys['A'] == Key_Held) { Camera->Position -= Camera->Right() * CamMovSpeed * DeltaTime; } if (Keys['D'] == Key_Held) { Camera->Position += Camera->Right() * CamMovSpeed * DeltaTime; } if (Keys['E'] == Key_Held) { Camera->Position += vec_Up * CamMovSpeed * DeltaTime; } if (Keys['F'] == Key_Held) { Camera->Position -= vec_Up * CamMovSpeed * DeltaTime; } // mouse int dx = Mouse->x - Mouse->LastX; Mouse->LastX = Mouse->x; int dy = Mouse->LastY - Mouse->y; Mouse->LastY = Mouse->y; if (Mouse->MMB == Key_Held) // pan { Camera->Position -= (Camera->Up()*dy + Camera->Right()*dx) * SpeedMul/2 * DeltaTime; } if (Mouse->RMB == Key_Held) // orbit { Horizontal += dx * SpeedMul*10 * DeltaTime; Vertical += dy * SpeedMul*10 * DeltaTime; } if (Mouse->Wheel) // zoom { Camera->Position += Camera->Forward() * Mouse->Wheel * SpeedMul*10 * DeltaTime; } if (Horizontal != 0 || Vertical != 0) { quat QVertical(vec_Right, -Vertical); quat QHorizontal(vec_Up, Horizontal); Camera->Rotation = QHorizontal * Camera->Rotation * QVertical; // this works //Camera->Rotation = Camera->Rotation * QHorizontal * QVertical; // this doesn't Camera->Rotation.Normalize(); } | I'm rotating an object on two axes, so why does it keep twisting around the third axis? I see questions come up quite often that have this underlying issue, but they're all caught up in the particulars of a given feature or tool. Here's an attempt to create a canonical answer we can refer users to when this comes up - with lots of animated examples! :) Let's say we're making a first-person camera. The basic idea is it should yaw to look left & right, and pitch to look up & down. So we write a bit of code like this (using Unity as an example): void Update() { float speed = lookSpeed * Time.deltaTime; // Yaw around the y axis using the player's horizontal input. transform.Rotate(0f, Input.GetAxis("Horizontal") * speed, 0f); // Pitch around the x axis using the player's vertical input. transform.Rotate(-Input.GetAxis("Vertical") * speed, 0f, 0f); } or maybe // Construct a quaternion or a matrix representing incremental camera rotation. Quaternion rotation = Quaternion.Euler( -Input.GetAxis("Vertical") * speed, Input.GetAxis("Horizontal") * speed, 0); // Fold this change into the camera's current rotation. transform.rotation *= rotation; And it mostly works, but over time the view starts to get crooked. The camera seems to be turning on its roll axis (z) even though we only told it to rotate on the x and y! 
This can also happen if we're trying to manipulate an object in front of the camera - say it's a globe we want to turn to look around: The same problem - after a while the North pole starts to wander away to the left or right. We're giving input on two axes but we're getting this confusing rotation on a third. And it happens whether we apply all our rotations around the object's local axes or the world's global axes. In many engines you'll also see this in the inspector - rotate the object in the world, and suddenly numbers change on an axis we didn't even touch! So, is this an engine bug? How do we tell the program we don't want it adding extra rotation? Does it have something to do with Euler angles? Should I use Quaternions or Rotation Matrices or Basis Vectors instead? | Why exactly do sometimes universal covers, and sometimes central extensions feature in the application of a symmetry group to quantum physics? There seem to be two different things one must consider when representing a symmetry group in quantum mechanics: The universal cover: For instance, when representing the rotation group $\mathrm{SO}(3)$, it turns out that one must allow also $\mathrm{SU}(2)$ representations, since the negative sign a "$2\pi$ rotation" induces in $\mathrm{SU}(2)$ is an overall phase that doesn't change the physics. Equivalently, all representations of the Lie algebra are what we seek. ($\mathfrak{so}(3) = \mathfrak{su}(2)$, but although every representation of the algebra is one of the universal cover, not every representation of the algebra is one of $\mathrm{SO}(3)$.) Central extensions: In conformal field theory, one has classically the of infinitesimal conformal transformations. From the universal cover treatment one is used to in most other cases, one would expect nothing changes in the quantum case, since we are already seeking only representation of an algebra. Nevertheless, in the quantization process, a appears, which is often interpreted to arise as an "ordering constant" for the now no longer commuting fields, and we have to consider the instead. The question is: What is going on here? Is there a way to explain both the appearence of universal covers and central extensions in a unified way? | eng_Latn | 29,142 |
In a 2D multiplayer game should I send the position of user to the server all the time? In a 2D game where the user moves with the keyboard arrows, should the user send all the time he moves his position (x, y)?. If the user has some speed, the user would send (x, y) like 50 times pixel by pixel in just a second. | Client should send one big packet or multiple smaller ones per frame? The client can do several actions per frame, for instance, requesting a movement, shooting etc. Should i send a packet to the server for each action the client performed during the frame, or should i stack them in an unique packet and send it at the end of the frame? EDIT: i'm using Jmonkeyengine and the provided network API SpiderMonkey. | How can I stitch a panorama correctly if I moved the camera along the horizontal axis? Here in Argentina, we have . All the houses and walls on that street have some kind of mosaic stuck to it, and it's very cool. It was made by a local artist . Because to this piece of urban art is two blocks long, I've decided to make a panorama of it, by moving myself on an horizontal axis while taking photos. I mean, I took one photo, walked one step deeper along the street, took another photo, and so on. When I tried to stitch it in AutoPano, the following deformed thing came out: () And the other side of the block: () After this, I've learned about parallax error and why you have to avoid moving when making panoramas. I mean, there are a lot of connection errors on both images. Especially in the second one, the part with the corner is quite problematic to stitch because to as I moved, the perspective of the view changed a lot. So, is there any way to stitch this kind of panorama correctly? Would this only work on plain walls? | eng_Latn | 29,143 |
undefined control sequence for \measuredangle I just installed MacTeX and texmaker. I'm trying to create a measured angle. The following code: \documentclass[12pt]{article} \begin{document} The measure of $\angle{ABC}$ is expressed as $\measuredangle{ABC}$. \end{document} keeps getting me ! Undefined control sequence. <recently read> \measuredangle l.4 ...angle{ABC}$ is expressed as $\measuredangle {ABC}$. I've tried web searches on my favorite engines for latex undefined control sequence measuredangle, but I keep results that are not useful; they're either listings of control sequences or people who had problems with different control sequences. Thanks! | How to look up a symbol or identify a letter from a math alphabet or other character? I know what my symbol or character looks like, but I don't know what the command is or which math alphabet it came from. How do I go about finding this out? | How to typeset asymmetrical confidence level limits / uncertainties? Below is an example of an asymmetrical limit (or uncertainty) How can I typeset this in a \LaTeX equation? Such pair of numbers should be vertically stacked while being vertically center-aligned with rest of the equation. Solution $X^{+Y}_{-Z}$ | eng_Latn | 29,144 |
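The \measuredangle symbol is not defined by the LaTeX kernel; it comes from the amssymb package (some symbol packages such as MnSymbol also provide it), so loading amssymb in the preamble should make the example compile:

```latex
\documentclass[12pt]{article}
\usepackage{amssymb}  % provides \measuredangle
\begin{document}
The measure of $\angle ABC$ is expressed as $\measuredangle ABC$.
\end{document}
```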
Why do pdf/image exports fail with OpenLayers Bing layers? When I try to export a print composition it generates a 0kb file. Is not possible to open this file. It happened the same with export as image option. The print composition model is coming from a .qpt file created with QGIS 1.8. The problem is not in the print composition created with QGIS 1.8. I have the same problem starting with a new composition. The print works regularly if I hide the two maps in the print composition. I can print all the elements that are not maps. The maps are coming from OpenLayers plugin/bing What shoud I check to solve it? | OpenLayers low resolution and/or shifts in QGIS print composer? I'm having trouble with the OpenLayers Plugin of Qgis and the map composer: If I create an OSM-background layer and if I want to export this, the OSM-Layer looks perfectly all right in the normal qgis program window. But in the map composer and after export the layer has shifted relatively to my other shape layers (EPSG:32633 - WGS 84 / UTM zone 33N). The second thing is that the output resolution of the exported osm-layer is very, very poor. A really bad way of getting around this whole trouble would be to increase the screen resolution and make a screenshot of the map composition window of qgis. But I don't think this would be very professional. It also would cause a lot of pain :) I'm using Qgis 1.8.0-Lisboa under Linux. The openlayers plugin is version 0.92. | Georeferencing old map in QGIS? I have an old map over a part of Laos and wondering how do I georeference the map with the help of the Grid lines and other Longitude and Latitude information on the scanned map? I have searched for information on the internet and tried many times. My scanned map wont overly correctly on the base map. Any base map would be fine. As long as I can check the result. Do I add X/East and Y/North values from the grid line crossings as marked and written in some parts of the map? How do I begin georeferencing this map? Do I have to add zeros after the two digits? For example if the Easting is 24 (its in KM right? Do I add like three/four zeros to convert it to meters? I georeferenced the map with the option "From Map Canvas" in the georeferencer tool and I succeeded but I want to know the coordinate of a point on my old scanned map and write them as X/East and Y/North to georeference the map. I hope I am not babbling and someone understands what I mean! We can set the CRS to WGS84, EPSG:4326. | eng_Latn | 29,145 |
ArcGIS Javascript Viewer with ArcGIS Server I'm certain this question must have been asked/answered, but I just can't find it. I'm running ArcGIS Server and want to build a basic viewer that points to my locally hosted services. I'm not interested in doing anything with ArcGIS.com; however all the examples hosted at esri illustrate interfacing with ArcGIS.com. How can I point to my local map? Alternatively, can you point me to where I can find the answer (as I'm certain this must have been asked already) | Need Basic Viewer template for ArcGIS JSAPI 3.2/3.3 I am looking for basic viewer for ArcGIS JSAPI but not getting proper sample/example. I am going through also but looks like its not working. Going through question also. Checking out example and getting confused that what is difference between Viewer template and layout? So any help regarding this will be great! | Georeferencing old map in QGIS? I have an old map over a part of Laos and wondering how do I georeference the map with the help of the Grid lines and other Longitude and Latitude information on the scanned map? I have searched for information on the internet and tried many times. My scanned map wont overly correctly on the base map. Any base map would be fine. As long as I can check the result. Do I add X/East and Y/North values from the grid line crossings as marked and written in some parts of the map? How do I begin georeferencing this map? Do I have to add zeros after the two digits? For example if the Easting is 24 (its in KM right? Do I add like three/four zeros to convert it to meters? I georeferenced the map with the option "From Map Canvas" in the georeferencer tool and I succeeded but I want to know the coordinate of a point on my old scanned map and write them as X/East and Y/North to georeference the map. I hope I am not babbling and someone understands what I mean! We can set the CRS to WGS84, EPSG:4326. | eng_Latn | 29,146 |
Point data does not allign with polygon data, even though they are in the same projections I created a feature class from XY data. This then adds a shapefile on my map, under the projection of WGS 1984. I then try to reproject my shapefile in to Europe_Albers_Equal_Area_Conic. This works, but my shapefiles do not align with my already Europe_Albers_Equal_Area_Conic polygons. I click zoom to layer and they are far apart from each other. I have tried defining projection and reuploading the data, but nothing seems to work. I need all data in Europe_Albers_Equal_Area_Conic so that I can retrieve the correct depth measurements. EDIT: Here is how far they are, even though they are projected the same, Figure 1 (picture). | Layers with same coordinate system should align/overlap in ArcMap but do not? I put 4 different layers into my ArcMap document, but only one of them will show up on the map! I should be seeing the other 3 layers as well. When I do a Zoom to layer, I can see one layer, but I cannot figure out how to make the other 3 show up at once on top of each other. Before I opened ArcMap, I used ArcCatalog where I right-clicked 3/4 of the shapefiles and clicked Properties → Coordinate system → Import and then clicked on my Trees layer, which has the coordinate system I want the other 3 to have. But when I opened ArcMap the 4 layers did not overlap, nor could I move them above or below each other. | Assign z values to depth curves in QGIS I'm new to QGIS and working with GIS in general (student of architecture), so please bear with me. I have obtained shapefiles with topographic curves of an ocean floor. My intention is to export these as CAD files and render as 3D models of the ocean floor in Revit. I have done this procedure earlier with terrain shapefiles. My problem now is that these files mapping depth does not contain z values, so when exporting them into CAD they are completely flat rather than 3D models. The depth information in meter exists as attributes in the file (see attached image). What I want is to transform this depth value into real z coordinates for each line so that the curves have different heights when importing as CAD file in Revit. Thankful for any pointers in the right direction. | eng_Latn | 29,147 |
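A frequent cause of this symptom is mixing up "defining" a coordinate system (relabelling the numbers) with "projecting" (actually transforming them): XY event layers built from lon/lat columns should first be declared as GCS WGS 1984, then projected. The same distinction exists outside ArcGIS; as a quick sanity check, in GeoPandas set_crs only declares what the coordinates already are while to_crs converts them. File name and the ESRI:102013 code for Europe Albers Equal Area Conic are assumptions:

```python
import geopandas as gpd

pts = gpd.read_file("points_from_xy.shp")

# Declare what the coordinates already are (lon/lat in WGS 1984)...
pts = pts.set_crs(epsg=4326, allow_override=True)

# ...then actually transform them to Europe Albers Equal Area Conic.
pts_albers = pts.to_crs("ESRI:102013")
print(pts_albers.total_bounds)   # should now fall inside the polygons' extent
```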
Delete CRS in QGIS Is there a way of deleting the various CRS's that QGIS lists? We are a Local Authority in England and the only CRS we need is British National Grid, which is set as the default, as well as Lat/Long for when we occasionally receive file from external partners. | Clearing Recently used coordinate reference systems using QGIS? After using some non usual SRC for demonstration purpose, I wish to clear my "Recently used coordinate reference systems" list for the sake of praticity. Is it possible? I am using QGIS 2.10.1. | Local Coordinate to Geocentric The ultimate question, I need a transformation matrix to take a point in local space representing a roughly 500m x 500m place in New Mexico centered at -108.619987456 long 36.234600064 lat. The final output needs to be in geocentric coordinates and I can during initialization get any number of points on the map such as the corners, center, western most along the center etc. During run time I would like to have a transformation matrix that can be applied to a position in local space expressed in meters and get back an approximate GCC coordinate. The center of the point in GCC: -1645380 -4885138 3752889 The bottom left corner of the map in GCC: -1644552, -4881054, 3749220 During run time I need to be able to multiply (-250, -250, 0) with the transformation matrix and get something close to (-1644552, -4881054, 3749220). I've spent more than a week trying to research a solution for this and so far nothing. Given an identity matrix as our starting position and orientation. Then using geotrans and the known lat, long, and height of the starting position I get an x,y,z geocentric coordinate. A vector from the origin of (0,0,0) gives both the up and translation for the matrix. However, I need the forward and right so that I can pass a distance in meters from the origin into the transformation matrix and get a roughly accurate GCC. Do I have all of the inputs I need to calculate the right and forward? Inputs Origin: 0, 0, 0 Global: -1645380, -4885138, 3752889 Up (Normalized): Global - Origin Desired Outputs Right: ? ? ? Forward: ? ? ? So with the right and forward added to the up and translation I already calculated I would have a transformation matrix. I could then apply the matrix to a vector of say (50, 50, 0) and get something within about 0-3 cm of the output if I feed the lat/long back into geotrans. This matrix would only ever be used small maps of about 500m x 500m so the curvature of the earth is negligible. In reply to whuber, I don't know your level of experience so I will start with the very basics. An identity matrix is a 4x4 matrix that you can multiply by a vector and it returns the vector unchanged. Such as below. 1 0 0 x=0 0 1 0 y=0 0 0 1 z=0 0 0 0 1 The x, y, and z of the matrix are how much to translate the vector by and the first 3 rows and columns are the rotation. Below is a tranformation matrix that does nothing to the orientation of a vector but does translate. This is what you would want for two axis aligned worlds in different locations. 1 0 0 13 0 1 0 50 0 0 1 -7 0 0 0 1 If you were to multiply a vector of (10, 10, 10) with the above transformation matrix you would get an output of (23, 60, 3). My problem is the axes are not aligned and the "plane" of the world I am trying to get the local coordinate converted to is projected on the ellipsoid of the earth. 
Geotrans is a library that you can use to pass a coordinate from one system such as geodetic or GDC (lat, long, height) and get back another such as geocentric or GCC (x,y,z). For example: If my game world was representing a map centered exactly on the prime meridian and equator the transformation matrix would look something like below. I am just guesstimating the Y and Z rotations and might have them flipped and/or the wrong sign. | Programmatically download (seamless) NED tiles? I used to download 1/3 arcsecond elevation geotiffs from extract.cr.usgs.gov using web services. But a few days ago, that stopped working, probably because USGS reorganized their site. I tried emailing USGS and haven't heard back yet. I could, of course, manually download states or degree tiles using the seamless server, but I want to do it programmatically so that users of my trip planner can get elevation data for their region without human intervention. And SRTM isn't high enough resolution for bicycle/wheelchair trip planning. Any ideas? | eng_Latn | 29,148
FFMPEG How to extract frames of a video at a certain time frame? I am working on laying out video stills in a book. I am pretty new to using the command line and discovered ffmpeg just yesterday. Right now, I want to extract 8fps from a video between, for example, 00:05:30 and 00:42:30. This is what I have written. I am not sure how to include the out point or if I am using the right option to begin with (-ss). ffmpeg -ss 00:05:30 -i video.mp4 -vf fps=8 out%05d.png As another approach, I know this segment is 37 seconds. So: ffmpeg -ss 00:05:30 -i video.mp4 -t 00:00:37 -vf fps=8/16 out%05d.png I am also wondering if -ss has to be specified before the input or if it can be placed after. I guess I'm confused about syntax. Could it be ffmpeg -i video.mp4 -ss 00:05:30 ... as well? It seems to be running, but slower. | Cut part from video file from start position to end position with FFmpeg I have a video file of 30 minutes, but I want to extract a video from 00:09:23 to 00:25:33. I can define the startposition with -ss, but I couldn't find one for the end position. Any help please? | Determining coordinates of a SHP file i have a SHP file of a building converted from a DXF file. This SHP file is unprojected and have no specific coordinates (origin 0,0). I find WGS84 coordinates (lat, long) of this building from Google Maps. My aim is to replace these coordinates to the SHP file on QGIS. Could you explain the process step by step? (I read lots of similar questions, about affine transformation. But i don't know the paramaters.) What i ask is how to make this transformation in QGIS (step by step)? SHP file coordinates (not projected) Scale: 1:142.670.713 X1,Y1: 16.70824, 253.74534 X2,Y2: 1.03521,-6.34918 X3,Y3: 244.00248,-7.49189 Target WGS84 coordinates X1,Y1: 41.047773, 28.896241 X2,Y2: 41.045452, 28.895929 X3,Y3: 41.045436, 28.899039 I need a second vector layer with coordinates. I have just a SHP file of a building converted from a DXF file. That is not projected. I know where this building is, and can find the WGS84 coordinates from Google Maps. My aim is add WGS84 coordinates to this SHP file and export to replace the building on Google Maps. My Questions are: 1) I think my SHP file has measurements in meters and unprojected. I have WGS84 lat long coordinates from Google Maps. How can i project this SHP file, for being ready to transformation? Right Click -> Set Layer CRS -> Choose WGS84 EPSG: 4326. Is this enough? 2) qgsAffine plugin just need transformation matrix paramaters. But the answer of @Jochen Schwarze is a bit complex for me. Is there a online calculator? (Just give the source XYs and target XYs and calculate the parameters.) I think, I have a projection problem before affine transformation. I have a DXF file of a building from AutoCAD. I saved this DXF file as a SHP file in QGIS. But this SHP file is converted from a plain DXF file and is not referenced/projected, it is drawn from coordinates 0,0 in AutoCAD in meters. And the drawing has meters, WGS84 has decimal degrees. When I set Layer CRS as WGS84 in QGIS, the coordinates don't change. How can I project this SHP file as WGS84 in QGIS? | eng_Latn | 29,149 |
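On the ffmpeg question: -ss may be given either before -i (seek on the input, fast) or after it (decode then discard, slower), and the end of the segment can be expressed as a duration with -t. Note that if 00:05:30 and 00:42:30 are hh:mm:ss timestamps, the span is 37 minutes rather than 37 seconds. A small sketch that just shells out to ffmpeg with the file name and fps taken from the question:

```python
import subprocess

cmd = [
    "ffmpeg",
    "-ss", "00:05:30",      # start time; before -i means seek in the input
    "-i", "video.mp4",
    "-t", "00:37:00",       # segment duration (00:42:30 minus 00:05:30)
    "-vf", "fps=8",         # 8 frames per second of video time
    "out%05d.png",
]
subprocess.run(cmd, check=True)
```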
How to Calculate Point on an Arc (Given: Center, Arc Endpoints, Direction, and Distance along the Arc)? I need to calculate the grid coordinates for a point on an arc using the distance traveled along it. I'm given the endpoints, radius, center, clockwise direction, and distance. How can I calculate the coordinates for the new point marked by the distance traveled? If I had the angle of the arc point relative the the positive x axis, I could simple use the Radius, Cos(angle), and Sin(angle) to figure out the coordinate. But in this case it isn't given. I think I need to figure out x-axis angle for each endpoint by subtracting the center from each endpoint and then dividing by the radius. This then could be used with inverse tangent to find the angle for each endpoint. But how do I find the coordinates for the point traveling along the arc? | How to find an end point of an arc given another end point, radius, and arc direction? Given an arbitrary arc, where you know the following values: end point (x1,y1), radius (r) and arc direction (e.g. clockwise or counterclockwise from start to end), how can I calculate the other endpoint of the arc (and I say endpoint because I think of an arc as line segment that is curved, which happens to lie on the circumference of the circle and it has two end points). Though that's just how I interpret the problem and maybe I'm not thinking about it correctly. EDIT: Yes, I know the Origin point and the length of the arc. Sorry I forgot to mention. Also, the that circle would be on a 2D Cartesian plain (x and y axis.) Sorry, my math terminology is lacking. Also, I'm not trying to cheat on homework or anything. This is a legitimate problem for an SVG graph I'm trying to create using JavaScript and I've tried my best but I need help. Thanks in advance. | Rotating a gyroscope's output values into an earth-relative reference frame For whatever reason, I seem to have gotten myself twisted into quite a confusion on how to process data from my 3 axis gyro and 3 axis accelerometer to get gyro rotation values related to the earth's reference frame. i.e. Given my accelerometer measures gravity (we assume it's stable) along the sensors (x,y,z) reference frame, how do I adjust the values of my gyro, measured along the sensors same (x,y,z) reference frame such that I am actually measuring the rotating about the Earth's frame x (North), y (West), and z (downwards) direction? I just can't seem to wrap my head around the use of the rotation matrix. Do I simply employ a 3d rotation matrix, and multiply the vector, \$[gyro_x, gyro_y, gyro_z]^T\$, by a rotation matrix, R? I then assume the rotation angles, are somehow derived from the filtered accelerometer outputs, but I'm having trouble deciding on what those should be. I have yet to find a concise reference on how to move the rotation and acceleration values from the sensor reference frame into the earth global reference frame. Perhaps someone can help? Thanks | eng_Latn | 29,150 |
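For the arc question, the reasoning sketched there works: the start angle comes from atan2 of the start point relative to the center, the swept angle is distance divided by radius, and the new point is center plus radius times (cos, sin). A NumPy sketch assuming a conventional y-up axis system (with y pointing down, as on screens, the clockwise sign flips):

```python
import numpy as np

def point_on_arc(center, start, distance, clockwise):
    cx, cy = center
    sx, sy = start
    r = np.hypot(sx - cx, sy - cy)            # radius from the two known points
    theta0 = np.arctan2(sy - cy, sx - cx)     # angle of the start point vs +x axis
    dtheta = distance / r                     # arc length -> central angle (radians)
    theta = theta0 - dtheta if clockwise else theta0 + dtheta
    return cx + r * np.cos(theta), cy + r * np.sin(theta)

# Quarter of a unit circle, counter-clockwise from (1, 0): ends near (0, 1).
print(point_on_arc((0, 0), (1, 0), np.pi / 2, clockwise=False))
```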
Transforming quaternion between coordinate systems Working in DirectX (with DirectXMath objects): If I have the rotation of an object relative to some coordinate space as quaternion Q, and I have a transformation matrix tx from that coordinate space to world space, how do I find the rotation of my object in world space? My thinking, which has proven inaccurate, was: Use XMMatrixDecompose on tx to get the orientation that is the rotation difference between world space and my local coordinate space (qLocalToWorld), invert it to get the rotation of the local coordinate system relative to the world (qWorldToLocal) and combine it with my local rotation Q. XMVECTOR scale, qLocalToWorld, translation; XMMatrixDecompose(&scale, &qLocalToWorld, &translation, tx); auto qWorldToLocal = XMQuaternionInverse(qLocalToWorld); auto worldOrientation = q * qWorldToLocal; I figured for e.g. that if the local coordinate system was rotated 90 degrees with respect to the world, and my object was 5 degrees with respect to local, this would yield the 90 + 5 = 95 in world space. But I was wrong :( I'd appreciate help in understanding the right way to do this. Thanks in advance! | Convert quaternion to a different coordinate system OSVR has a right-handed system: x is right, y is up, z is near. I need to convert orientation data from a sensor in OSVR to a different right-handed system in which x is forward, y is left, and z is up. I need to calculate the transformation that will convert a quaternion from one system to the other. I naively tried: void osvrPoseCallback(const geometry_msgs::PoseStampedConstPtr &msg) { // osvr to ros geometry_msgs::PoseStamped outputMsg; outputMsg.header = msg->header; outputMsg.pose.orientation.x = msg->pose.orientation.y; outputMsg.pose.orientation.y = msg->pose.orientation.z; outputMsg.pose.orientation.z = msg->pose.orientation.x; outputMsg.pose.orientation.w = msg->pose.orientation.w; osvrPosePub.publish(outputMsg); ros::spinOnce(); } But that made things really weird. Facing north pitch is pitch, facing west, pitch is yaw, facing south, pitch is negative pitch... How can I convert my OSVR quaternion to the a corresponding quaternion in the new coordinate system? | How do I compute the normalisation of $A=k[X,Y]/(Y^3 - X^5)$? I'm trying to solve exercise 4.7 in Reid's UCA: "Find the normalisation of $A=k[X,Y]/(Y^3 - X^5)$." I can easily show $A$ is not normal: let $x$ and $y$ denote the images of $X$ and $Y$ in $A$. Thus $y^3 = x^5$. So there exists $t=y/x\in\operatorname{Frac} (A)$ which is a root of the monic polynomial $T^3 - x^2 \in A[T]$. But since $t\notin A$, $A$ is not normal. However, I don't know how to compute the normalisation of $A$; I have not seen dimension theory developed, so to a similar question doesn't really help me. Similarly I cannot try and apply something like to a related question, because in this case the map $k[X,Y]\rightarrow k[t]$ sending $X\mapsto t^3, Y\mapsto t^5$ is injective so the image isn't isomorphic to $A$ UCA never really gives a good explanation on how to actually compute normalisations, so I'd be grateful if anyone could walk me through it. | eng_Latn | 29,151 |
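For the quaternion reframing, the local rotation is composed with the frame's rotation rather than with its inverse: q_world = q_localToWorld applied after q_local. Beware multiplication-order conventions, since DirectXMath's XMQuaternionMultiply composes "first argument first", so the written order can be the reverse of the maths on paper. A small SciPy check of the 90 plus 5 degree example from the question (SciPy is used here only to verify the arithmetic, not as a drop-in for DirectXMath):

```python
from scipy.spatial.transform import Rotation as R

q_local_to_world = R.from_euler("y", 90, degrees=True)  # local frame vs world
q_local = R.from_euler("y", 5, degrees=True)            # object vs its local frame

q_world = q_local_to_world * q_local                    # compose, don't invert
print(q_world.as_euler("yxz", degrees=True))            # ~[95, 0, 0]
```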
Baking 2 dishes needing slightly different temperatures and time In one oven I need to bake 2 dishes: one needs 35 minutes at 375ºF, and the other needs 45 minutes at 350ºF. How much time should the second dish cook at 375ºF? | Cooking more than one dish that require different temperatures I'm cooking two different dishes in the oven at the same time that each require different temperatures. One is a roasted vegetable dish that says to cook at 475 for 35-40 minutes and the other is parmesan crusted pork chops at 350 for 40-45 minutes. What do you suggest I do? | How can I rotate an object based on another's offset to it? I have a 3D model of a turret that con rotate around the Y-axis. This turret has a cannon that is significantly off the center of the object. I want the cannon, not the turret, to aim at a specified target. I can only rotate the turret, however, and thus I don't know what equation I need to apply in order to accomplish by objective. The following image illustrates my problem: If I have the turret "LookAt()" the target, a laser originating from the cannon will completely miss said target. If this were a completely top-down scenario, and the cannon were exactly parallel to the turret, then my logic tells me that the fake target should be located at a position that's equal to the actual target plus an offset that's equal to that between the turret and the cannon. However, in my actual scenario, my camera is angled 60º, and the cannon has a slight rotation. The following image illustrates the scenario: I'm not sure exactly why, but if I apply that same offset, it only seems to work while aiming at certain distances from the turret. Is my logic flawed? Am I missing something fundamental here? Final Edit: the solution provided by @JohnHamilton latest update solves this problem with perfect precision. I have now removed the code and images that I used to illustrate my incorrect implementations. | eng_Latn | 29,152 |
Creating Ellipse in WGS84 with metric parameters I have points in WGS84 (x,y), radiuses in meters (rx, ry), rotation angle in degrees. I would like to create a new WGS84 geometry. I've tried this but the major/minor distances seem to be off by a few hundred meters in some cases. Ellipse (x,y,rx,ry,rotation,#of segments in 1/4 of ellipse) SELECT transform( translate( rotate( scale( buffer( transform(setsrid(makepoint(0,0),4326),3395), 1, $6), $3,$4), $5), x(transform(setsrid(makepoint($1,$2),4326),3395)), y(transform(setsrid(makepoint($1,$2),4326),3395))),4326) | Creating Ellipse in WGS84, but with metric parameters I have the function --Ellipse(x,y,rx,ry,rotation,#of segments in 1/4 of ellipse) SELECT ST_Translate( ST_Rotate( ST_Scale( ST_Buffer(ST_Point(0,0), 0.5, $6), $3, $4), $5), $1, $2) I have points in WGS84 (x,y), radiuses in meters (rx, ry), rotation angle in degrees. And I need a new WGS84 geometry as a result. As you can see I can't call this function without conversions between WGS84 and some metric SRID. But my points aren't in a definite region. So a SRID which is optimal for one point, throws an error for another point during conversion. Is there a common way to get an optimal metric projection (SRID) for a WGS84 point? Or at least to use some rough, but world-wide universal metric projection? Or maybe there's a trick for this case? | KML to Shapefile using GDAL (LIBKML) - MultiGeometry data not transforming I am trying to convert a KML file to Shapefile using GDAL (ogr2ogr). Regular KML files convert fine but KML files containing MultiGeometry features (eg: Geometry containing both Point and Polygon) do not get transformed. The output is simply a blank shapefile containing no shapes or attributes. I have tried this command: ogr2ogr -overwrite -f "ESRI Shapefile" -where OGR_GEOMETRY='MultiGeometry' <ouputfolder> <input kml> For an example, KML file which has MultiGeometry, you can see the us_states.kml hosted by Google. Other info: I am using 'GDAL 1.10.1, released 2013/08/26' for windows. | eng_Latn | 29,153 |
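Scaling metre radii in World Mercator (EPSG:3395) stretches them by roughly 1/cos(latitude), which is consistent with an error of a few hundred metres away from the equator. One robust alternative is a local azimuthal equidistant projection centred on each point, so the metre radii are honest anywhere in the world; buffer, scale and rotate there, then transform back to WGS84. A Python sketch with pyproj and shapely (library choice and the counter-clockwise rotation convention are assumptions):

```python
import pyproj
from shapely.geometry import Point
from shapely.ops import transform
from shapely import affinity

def ellipse_wgs84(lon, lat, rx_m, ry_m, rotation_deg, segments=16):
    aeqd = pyproj.CRS.from_proj4(
        f"+proj=aeqd +lat_0={lat} +lon_0={lon} +datum=WGS84 +units=m")
    wgs84 = pyproj.CRS("EPSG:4326")
    to_wgs84 = pyproj.Transformer.from_crs(aeqd, wgs84, always_xy=True).transform

    circle = Point(0, 0).buffer(1.0, segments)           # unit circle at the centre
    ell = affinity.rotate(affinity.scale(circle, rx_m, ry_m), rotation_deg)
    return transform(to_wgs84, ell)                      # polygon in lon/lat

print(ellipse_wgs84(24.94, 60.17, 500, 250, 30).bounds)  # made-up test point
```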
Know rotation with 3 coplanar points I'm doing a computer vision program in which I have three coplanar points and I want to know their position at all times. In other words, I want to realize when there is translation and when there is rotation. For the translation I have no problems (at least for now), but the rotation to the sides (right and left) is driving me crazy, I can not figure out how to calculate it. Being 3 points I work it as a triangle, I am able to know the distance between sides, angles, heights and area of the triangle. But none of this works for me since the rotation to the left and the rotation to the right are always symmetrical and therefore the computer is not able to detect what direction it is. Do you know any method for this calculation without adding more points? I'm sure this has been solved before but I don't know what else to think. Thank you so much EDIT It's not a duplicate of question because my triangles are 2D. That is the problem, I'm working with 2D coplanar points. That's what's difficult to know if the rotation is to the left or to the right because the behaviour is symmetrical. | How to calculate the rotation matrix between 2 3D triangles? I need to calculate the rotation matrix and the translation vector between 2 given triangles in Euclidean space. This is really just a rotation and a translation, the lengths of sides of the triangle are equal. Coordinates of the points are real. $P_1, P_2, P_3$ are the 3 points of the 1st triangle, $Q_1, Q_2, Q_3$ are the 3 points of the 2nd. My idea was: I translate the center of the triangles to the point of origin with $t_1 := \frac{P_1+P_2+P_3}{3}$, $P'_i = P_i-t_1$ and $t_2 := \frac{Q_1+Q_2+Q_3}{3}$, $Q'_i = Q_i-t_1$ and then I only need to solve the set of 9 linear equations with 9 unknown (the 9 entries of the rotation matrix $R$): $$ Q'_i = R.P'_i $$ Afterwards I could calculate the translation with $$ t = t_2 - R.t_1 $$ The problem is: after I translate the triangles to the point of origin the points are linearly dependent ($P_3 = -P_1 - P_2$) So in fact I only have 6 equation with 9 unknown, which allows multiple solutions for $R$. But I need "one" precise solution, because the triangles are just parts of two identical models, which I need the rotation for. I think the approach will work, if I use 4 points instead of 3. But can I somehow calculate the rotation matrix with "only" 3 points? | Reflection of a line in a plane The line $l_1$ has the equation $r=(6i+2-2k)+\lambda(4i+5j-k)$ and the plane $\pi_1$: $2x-y+4z=4$, the line $l_2$ is the reflection of $l_1$ in the plane $\pi_1$. Find the exact vector equation of line $l_2$ So the line intersects the plane when $\lambda=-2$, giving the point $(-2,-8,0)$ which will be common on $l_1$ and $l_2$. But I am unsure on how to find the direction vector for $l_2$. Any help would be appreciated. | eng_Latn | 29,154 |
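With three (or more) matching 2D points, the least-squares rotation between the two sets can be recovered with the Kabsch/Procrustes construction, and the sign of the recovered angle distinguishes left from right; the situation only looks symmetrical if you compare unsigned quantities such as side lengths, angles and area. A NumPy sketch with made-up point values:

```python
import numpy as np

def rigid_fit_2d(P, Q):
    """Rotation R and translation t minimising |Q - (R @ P + t)| (Kabsch)."""
    P, Q = np.asarray(P, float), np.asarray(Q, float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                       # 2x2 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # avoid a reflection solution
    Rm = Vt.T @ np.diag([1.0, d]) @ U.T
    return Rm, cq - Rm @ cp

P = np.array([(0.0, 0.0), (2.0, 0.0), (1.0, 1.0)])
ang = np.deg2rad(25)                                # ground truth: +25 deg (left)
Rtrue = np.array([[np.cos(ang), -np.sin(ang)], [np.sin(ang), np.cos(ang)]])
Q = P @ Rtrue.T + np.array([3.0, -1.0])
Rm, t = rigid_fit_2d(P, Q)
print(np.degrees(np.arctan2(Rm[1, 0], Rm[0, 0])))   # ~ +25; a negative value means a right turn
```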
Viewing camera shows distorted geometry I am modeling a building in blender and I have this strange problem with my viewing camera. The geometry seems like it has been scaled down for the Y axis. I measured a 2 vertices relative to a point and they have the same distance, but in the view, vertex on the X axis seems like it is farther than the one in the Y axis. I definitely accidently changed something but I don't know what. Here is my . Thanks a lot in advance! | Why do the measurements of this object seem erroneous? I just found out about the Mesh Display : Length feature in the "N" panel (thanks to ) that allows one to measure the length of a face or vertice. But my first encounter with this tool gives me complex feelings of love and hate, see how, as I was trying to measure the aspect ratio of this TV screen, I am getting WIDTH = 20.33 and HEIGHT = 37.57. These values seem awfully wrong, since my betrusted eyes tell me the height should be shorter than the width. I have tried measuring the single vertices as well, to identical results. | What is approximate error of Pythagorean Theorem vs. Haversine Formula in measuring distances on sphere at various scales? Many people when first trying to calculate distances between two longitude / latitude pairs ask if Pythagorean theorem works as an appropriate distance function. Most often people answer "no, the Pythagorean theorem only works on a 2D Euclidean plane." Rarely, however, do people mention the effect of scale and location on the sphere on how inaccurate the Pythagorean theorem is. The basic idea being at very small scales, the surface of a sphere looks very much like a plane. At very large scales, it distances along the surface are more curved and therefore the difference between the incorrect Pythagorean Theorem and the correct Haversine Formula is greater. Does anyone know a formula or rule of thumb that tells you the difference between the two distance measures based on the scale of the distance you are trying to measure? I think having this explicitly would help in: explaining why the Pythagorean Theorem isn't perfect; and in letting people who are looking for more "rough" distances know when Pythagoras actually will serve their purposes. | eng_Latn | 29,155 |
remove duplicate key array from php how to remove array that have repeated value of distance? Array([0] => Array([distance] => 66.68 [lat] => 51.8560591 [long] => -2.2170209 [ordNum] => 1/5938ebf2475fa )) Array([0] => Array([distance] => 66.68 [lat] => 51.8560591 [long] => -2.2170209 [ordNum] => 1/5938e93c2080e )) Array([0] => Array([distance] => 123.93 [lat] => 51.8560591 [long] => -2.2170209 [ordNum] => 1/5938ebf2475fa)) Array([0] => Array([distance] => 123.93 [lat] => 51.8560591 [long] => -2.2170209 [ordNum] => 1/5938e93c2080e)) Array([0] => Array([distance] => 128.84 [lat] => 52.6301043 [long] => -2.4940598 [ordNum] => 1/5938e979bdb8b)) Array([0] => Array([distance] => 148.43 [lat] => 52.6301043 [long] => -2.4940598 [ordNum] => 1/5938e979bdb8b)) | How to remove duplicate values from a multi-dimensional array in PHP How can I remove duplicate values from a multi-dimensional array in PHP? Example array: Array ( [0] => Array ( [0] => abc [1] => def ) [1] => Array ( [0] => ghi [1] => jkl ) [2] => Array ( [0] => mno [1] => pql ) [3] => Array ( [0] => abc [1] => def ) [4] => Array ( [0] => ghi [1] => jkl ) [5] => Array ( [0] => mno [1] => pql ) ) | Defining coordinate reference system with rotation in GeoServer? I am using GeoServer and have a layer in EPSG:900913 ("Google Mercator"). I need to "rotate" the map around certain point (say, 1500000, 7000000) by certain degree (say, 30 degrees clockwise). How could I define such a coordinate system based on EPSG:900913? GeoServer's does not work for my purposes as I need to tile the map later on. As far as I understand this, my only option is to define an own coordinate system. For GeoServer I'd need to . The configuration seems to be straightforward, but I have a difficulty defining my rotated CRS in . I am wondering how to apply a rotation around certain point onto a CRS like Google Mercator: PROJCS["WGS84 / Google Mercator", GEOGCS["WGS 84", DATUM["World Geodetic System 1984", SPHEROID["WGS 84", 6378137.0, 298.257223563, AUTHORITY["EPSG","7030"]], AUTHORITY["EPSG","6326"]], PRIMEM["Greenwich", 0.0, AUTHORITY["EPSG","8901"]], UNIT["degree", 0.017453292519943295], AXIS["Longitude", EAST], AXIS["Latitude", NORTH], AUTHORITY["EPSG","4326"]], PROJECTION["Mercator_1SP"], PARAMETER["semi_minor", 6378137.0], PARAMETER["latitude_of_origin", 0.0], PARAMETER["central_meridian", 0.0], PARAMETER["scale_factor", 1.0], PARAMETER["false_easting", 0.0], PARAMETER["false_northing", 0.0], UNIT["m", 1.0], AXIS["x", EAST], AXIS["y", NORTH], AUTHORITY["EPSG","900913"]] My questions, specifically: How to write a WKT which transform an existing CRS? My guess would be that I need a new PROJCS wrapping an existing one and adding a PROJECTION clause. How would I found out the projection id (like Mercator_1SP above) and the required parameters (the PARAMETER clauses)? Can I "reference" EPSG:900913 in CRS WKT instead of copy-pasting the whole PROJCS clause? | yue_Hant | 29,156 |
Introduce size in mm when scaling edges Good evening to everyone, My question is simple but I'm unable to find the answer. I'd like to input the size in mm (or meters) when scaling an arista instead of instroducing the scale unit with the keyboard. Just to be more precise. Is this possible? The exact issue is that I would like to scale the edge in the picture by introducing the exact mm but I can't find how to do it. Thanks in advance. Jaume | How to scale dimensions proportionally to a specific size? I modeled a man and would like to scale him down to be exactly 6' tall (Z dimension), however I cannot find a way to do that. I know I can go to front ortho view and and use the normal scale S key to scale it, however that requires some pretty specific fine tuning in order to get it to the right dimensions. I'd like to be able to essentially type "6" into the Z dimension (don't worry I have it setup in imperial) and have it scale the X and Y dimensions proportionally. | How can I rotate about an arbitrary point in 3D (instead of the origin)? I have some models that I want to rotate using quaternions in the normal manner, except instead of rotation about the origin, I want it to be offset slightly. I know that you don't say, in 3d space, that you rotate about a point; you say you rotate about an axis. So I'm visualizing it as rotating about a vector whose tail is positioned not at the local origin. All affine transformations in my rendering / physics engine are stored using SQT (scale, quaternion, translation; an idea borrowed from the book Game Engine Architecture.) So I build a matrix each frame from these components and pass it to the vertex shader. In this system, translation is applied, then scale, then rotation. In one specific case, I need to translate an object in world space, scale it, and rotate it about a vertex not centered at the object's local origin. Question: Given the constraints of my current system described above, how can I achieve a local rotation centered about a point other than the origin? Automatic upvote to anyone who can describe how to do this using only matrices as well :) | eng_Latn | 29,157 |
How to rotate a posted picture I'm using the Android client. I've posted a picture which I would like to rotate (say, clockwise 90 degrees). How can I accomplish this without editing my original picture and reuploading? | Picture rotation Is there a way to quickly rotate a picture in the SE editor without saving it on disk and rotating it in some image tool and reuploading it? E.g. consider the picture below, assume I would like to quickly rotate it 90 degrees and wanted to do it expressly in the editor without the drudgery of saving it on disk, is there a shortcut in the stack exchange "markup language"? If this feature is not available, I would like to nominate it for implementation. | Mental estimate for tangent of an angle (from $0$ to $90$ degrees) Does anyone know of a way to estimate the tangent of an angle in their head? Accuracy is not critically important, but within $5%$ percent would probably be good, 10% may be acceptable. I can estimate sines and cosines quite well, but I consider division of/by arbitrary values to be too complex for this task. Multiplication of a few values is generally acceptable, and addition and subtraction are fine. My angles are in degrees, and I prefer not have to mentally convert to radians, though I can if necessary. Also, all angles I'm concerned with are in the range of [0, 90 degrees]. I am also interested in estimating arc tangent under the same conditions, to within about 5-degrees would be good. Backstory I'm working on estimating the path of the sun across the sky. I can estimate the declination pretty easily, but now I want to estimate the amount of daylight on any given day and latitude. I've got it down to the arc cosine of the product of two tangents, but resolving the two tangents is now my sticking point. I also want to calculate the altitude of the sun for any time of day, day of the year, and latitude, which I have down to just an arc tangent. | eng_Latn | 29,158 |
Arithmetics with Coordinates I wonder if I can do simple arithmetics with coordinates. Can I do substraction of coordinates like 2.570801 (lng) 46.89023 (lat) / 17.424316 (lng) 55.15377 (lat)?! Or will that get messy without a CRS?! | Local Coordinate to Geocentric The ultimate question, I need a transformation matrix to take a point in local space representing a roughly 500m x 500m place in New Mexico centered at -108.619987456 long 36.234600064 lat. The final output needs to be in geocentric coordinates and I can during initialization get any number of points on the map such as the corners, center, western most along the center etc. During run time I would like to have a transformation matrix that can be applied to a position in local space expressed in meters and get back an approximate GCC coordinate. The center of the point in GCC: -1645380 -4885138 3752889 The bottom left corner of the map in GCC: -1644552, -4881054, 3749220 During run time I need to be able to multiply (-250, -250, 0) with the transformation matrix and get something close to (-1644552, -4881054, 3749220). I've spent more than a week trying to research a solution for this and so far nothing. Given an identity matrix as our starting position and orientation. Then using geotrans and the known lat, long, and height of the starting position I get an x,y,z geocentric coordinate. A vector from the origin of (0,0,0) gives both the up and translation for the matrix. However, I need the forward and right so that I can pass a distance in meters from the origin into the transformation matrix and get a roughly accurate GCC. Do I have all of the inputs I need to calculate the right and forward? Inputs Origin: 0, 0, 0 Global: -1645380, -4885138, 3752889 Up (Normalized): Global - Origin Desired Outputs Right: ? ? ? Forward: ? ? ? So with the right and forward added to the up and translation I already calculated I would have a transformation matrix. I could then apply the matrix to a vector of say (50, 50, 0) and get something within about 0-3 cm of the output if I feed the lat/long back into geotrans. This matrix would only ever be used small maps of about 500m x 500m so the curvature of the earth is negligible. In reply to whuber, I don't know your level of experience so I will start with the very basics. An identity matrix is a 4x4 matrix that you can multiply by a vector and it returns the vector unchanged. Such as below. 1 0 0 x=0 0 1 0 y=0 0 0 1 z=0 0 0 0 1 The x, y, and z of the matrix are how much to translate the vector by and the first 3 rows and columns are the rotation. Below is a tranformation matrix that does nothing to the orientation of a vector but does translate. This is what you would want for two axis aligned worlds in different locations. 1 0 0 13 0 1 0 50 0 0 1 -7 0 0 0 1 If you were to multiply a vector of (10, 10, 10) with the above transformation matrix you would get an output of (23, 60, 3). My problem is the axes are not aligned and the "plane" of the world I am trying to get the local coordinate converted to is projected on the ellipsoid of the earth. Geotrans is library that you can use to pass a coordinate from one system such as geodetic or GDC (lat, long, height) and get back another such as geocentric or GCC (x,y,z). For example: If my game world was representing a map centered exactly on the prime meridian and equator the transformation matrix would look something like below. I am just guesstimating the Y and Z rotations and might have them flipped and/or the wrong sign. 
0 0 1 6378137
0 1 0 0
1 0 0 0
0 0 0 1
| Programmatically download (seamless) NED tiles? I used to download 1/3 arcsecond elevation geotiffs from extract.cr.usgs.gov using web services. But a few days ago, that stopped working, probably because USGS reorganized their site. I tried emailing USGS and haven't heard back yet. I could, of course, manually download states or degree tiles using the seamless server, but I want to do it programmatically so that users of my trip planner can get elevation data for their region without human intervention. And SRTM isn't high enough resolution for bicycle/wheelchair trip planning. Any ideas? | eng_Latn | 29,162
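The transform asked about in the local-to-geocentric question is the standard local east-north-up (ENU) to Earth-centred (ECEF) matrix: the east, north and up unit vectors at the origin give the rotation columns, and the origin's geocentric coordinates give the translation. A Python sketch with pyproj (EPSG:4979 to EPSG:4978 for the geodetic-to-geocentric step); whether the map's local x axis really points east is an assumption:

```python
import numpy as np
from pyproj import Transformer

def enu_to_ecef_matrix(lat_deg, lon_deg, h=0.0):
    """4x4 matrix taking local (east, north, up) metres to geocentric XYZ."""
    to_ecef = Transformer.from_crs("EPSG:4979", "EPSG:4978", always_xy=True)
    x0, y0, z0 = to_ecef.transform(lon_deg, lat_deg, h)
    lat, lon = np.radians(lat_deg), np.radians(lon_deg)
    east = np.array([-np.sin(lon), np.cos(lon), 0.0])
    north = np.array([-np.sin(lat) * np.cos(lon), -np.sin(lat) * np.sin(lon), np.cos(lat)])
    up = np.array([np.cos(lat) * np.cos(lon), np.cos(lat) * np.sin(lon), np.sin(lat)])
    M = np.eye(4)
    M[:3, 0], M[:3, 1], M[:3, 2] = east, north, up   # rotation part (columns)
    M[:3, 3] = (x0, y0, z0)                          # translation part
    return M

M = enu_to_ecef_matrix(36.234600064, -108.619987456)
print(M @ np.array([-250.0, -250.0, 0.0, 1.0]))      # roughly the map corner in ECEF
```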
GPX file gets incorrect buffer QGIS When I try to perform a buffer of 10m on my imported gpxfile (geometry listed as points), the buffer covers more like 1000 km. I tried the same with a line vector file, which seems to work properly. I had no problem importing the GPX file into QGIS. I've checked if the CRS was the same as the project's and I've looked for a alternative way of transforming the gpx into a shapefile, which I didn't find. I only have this data as an gpx file, and I really need to use it! I am a QGIS-beginner, so I hope it is an easy fix? | Understanding QGIS buffer tool units I have had no luck getting the buffer tool to accept anything but degrees as units of measure. I have found lots of stuff saying the layer needs to be reprojected and saved but it hasn't worked at all for me. Is there a way I could create a buffer without using ftools or at least force the units to meters somehow? As a workaround I converted meters to degrees (lat) and used that but the final product needs to be as close to reality as possible. Things I've tried: setting every unit option I could find to meters (where possible). setting everything to NAD83/Maryland (data is for Washington, DC) and saving it as such (as layers in ESRI shape files). reimporting the reprojected layers setting relevant layers to Google Mercator The was tried followed by creating a buffer. Many were tried in combination. QGIS 1.7.3 Slackware64 current (qgis from SBo-13.37 repo, tried on multilib and plain 64it with same results) | Georeferencing old map in QGIS? I have an old map over a part of Laos and wondering how do I georeference the map with the help of the Grid lines and other Longitude and Latitude information on the scanned map? I have searched for information on the internet and tried many times. My scanned map wont overly correctly on the base map. Any base map would be fine. As long as I can check the result. Do I add X/East and Y/North values from the grid line crossings as marked and written in some parts of the map? How do I begin georeferencing this map? Do I have to add zeros after the two digits? For example if the Easting is 24 (its in KM right? Do I add like three/four zeros to convert it to meters? I georeferenced the map with the option "From Map Canvas" in the georeferencer tool and I succeeded but I want to know the coordinate of a point on my old scanned map and write them as X/East and Y/North to georeference the map. I hope I am not babbling and someone understands what I mean! We can set the CRS to WGS84, EPSG:4326. | eng_Latn | 29,160 |
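The buffer distance is interpreted in the layer's units, and a GPX layer is in degrees (EPSG:4326), so "10" means 10 degrees, which is on the order of 1 000 km. Reprojecting the points to a metric CRS before buffering fixes it; the UTM zone below is an assumption that depends on where the track actually is:

```python
import geopandas as gpd

pts = gpd.read_file("track.gpx", layer="track_points")   # lon/lat, EPSG:4326
pts_m = pts.to_crs(epsg=32633)        # a metric CRS covering the area (UTM 33N assumed)
buf = pts_m.buffer(10)                # 10 now really means 10 metres
gpd.GeoDataFrame(geometry=buf.to_crs(epsg=4326)).to_file("buffer10m.gpkg", driver="GPKG")
```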
So I'm trying to build a simple mechanic where the player see's 3 sides of a cube at a time: top, left and front(I called the front-side "Right" in my code, due to the orthographic view). The player can click on the front side or left side of the cube to rotate the cube so that the side that was pressed moves and becomes the top face. I want to smooth/animate the rotation. The red faces are the faces that the user can click on. My Logic seems to work quite well when I click on the left face (Rotating along the x axis), but doesn't seem to work well for the front/right face(It usually works once and then does weird stuff on the next click, this is on the z-axis) public class CubeRotator : MonoBehaviour { private const float STOP_THRESHHOLD = 0.1f; private const float ROTATE_AMOUNT = 90f; [SerializeField, Header("Touch Quads Colliders")] private Collider touchQuadLeft; [SerializeField] private Collider touchQuadRight; [SerializeField, Space(10)] private float timeToRotate; private bool mustRotateLeft = false; private bool mustRotateRight = false; private float targetX = 0.0f; private float targetZ = 0.0f; private float xVelocity = 0.0F; private float zVelocity = 0.0F; int layerMask; private void Start() { layerMask = LayerMask.GetMask("Touch Quad"); } void Update() { ListenForTouch(); RotateCube(); } private void ListenForTouch() { if (Input.GetMouseButtonDown(0)) { Vector2 touchPoint = Input.mousePosition; RaycastHit rayCastHit; Ray ray = Camera.main.ScreenPointToRay(touchPoint); if (Physics.Raycast(ray, out rayCastHit, 150f, layerMask)) { if (rayCastHit.collider == touchQuadLeft) { mustRotateLeft = true; targetZ = targetZ - ROTATE_AMOUNT; zVelocity = 0f; } else if (rayCastHit.collider == touchQuadRight) { mustRotateRight = true; targetX = targetX + 90f; xVelocity = 0f; } } } } private void RotateCube() { float zAngle = transform.rotation.eulerAngles.z; float xAngle = transform.rotation.eulerAngles.x; if (mustRotateLeft) { zAngle = Mathf.SmoothDampAngle(zAngle, targetZ, ref zVelocity, timeToRotate); if (Mathf.Abs(zVelocity) <= STOP_THRESHHOLD) { mustRotateLeft = false; } } if (mustRotateRight) { xAngle = Mathf.SmoothDampAngle(xAngle, targetX, ref xVelocity, timeToRotate); if (Mathf.Abs(xVelocity) <= STOP_THRESHHOLD) { mustRotateRight = false; } } transform.rotation = Quaternion.Euler(xAngle, 0f, zAngle); } } Inspector Values: I've tried several things, the only problem I can think of is, gimbal lock. Thanks in advance. | I am trying to write code with rotates an object. I implemented it as: Rotation about X-axis is given by the amount of change in y coordinates of a mouse and Rotation about Y-axis is given by the amount of change in x coordinates of a mouse. This method is simple and work fine until on the the axis coincides with Z-axis, in short a gimble lock occurs. How can I utilize the rotation arount Z-axis to avoid gimbal lock. | The new Top-Bar does not show reputation changes from Area 51. | eng_Latn | 29,161 |
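Independent of engine, one robust pattern for step rotations like this is to keep your own "current angle" state and always wrap the difference to the shortest signed delta, rather than reading Euler angles back from the composed rotation every frame; the read-back is what often produces the "works once, then does weird stuff" behaviour. The underlying maths in a small Python sketch:

```python
def shortest_delta(current, target):
    """Signed smallest angular difference target - current, in (-180, 180]."""
    return (target - current + 180.0) % 360.0 - 180.0

def step_towards(current, target, max_step):
    d = shortest_delta(current, target)
    step = max(-max_step, min(max_step, d))
    return current + step, abs(d) <= max_step

angle, done = 350.0, False
while not done:                      # target 80: goes forward through 360/0, not back down
    angle, done = step_towards(angle, 80.0, 5.0)
print(angle % 360.0)                 # 80.0
```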
I am trying to calculate the point from a given lat/long, a bearing, and a distance. This is for plotting weather given a aeronautical way point. For example: Given a waypoint with a known lat/long (LAX airport control tower), with a bearing of North by Northwest and a distance of 45 nautical miles, where is the point? I could treat the world as flat, but I think the resulting calculations would be inaccurate the further north the point is since longitudes get more narrow near the poles. I am mainly interested in plotting these points in the US. Is there a projection that would help me do this? This is for a Python application I'm writing, and the libraries I have access to are GDAL and Proj4 for projections. So no fancy tools like ArcGIS are available to me. Ultimately I want to combine 4 or these calculated points to form a polygon that should roughly be about 100-200 miles in width and height. Thanks! | There are several algorithms to calculate distance between two latitude/longitude points. Hubeny But there are not so many algorithms (actually I can't finally find it) which can calculate latitude/longitude of point B from another latitude/longitude point A with bearing and distance. I'm glad to know if there are any algorithms which is lesser accurate, but faster than Vincenty's formulae. | The new Top-Bar does not show reputation changes from Area 51. | eng_Latn | 29,162 |
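Computing a destination from a start point, bearing and distance is the direct geodesic problem, and pyproj's Geod solves it on the WGS84 ellipsoid without choosing a projection at all, so there is no flat-earth error to worry about. The LAX coordinates below are approximate and "north by northwest" is taken as NNW, 337.5 degrees, which is an assumption:

```python
from pyproj import Geod

geod = Geod(ellps="WGS84")
lat1, lon1 = 33.9425, -118.4081        # LAX, approximate
bearing_deg = 337.5                    # NNW, degrees clockwise from true north
dist_m = 45 * 1852.0                   # 45 nautical miles in metres
lon2, lat2, _ = geod.fwd(lon1, lat1, bearing_deg, dist_m)
print(lat2, lon2)                      # the point 45 NM out on that bearing
```

Four such points can then be assembled directly into the polygon corners.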
When I attempt to add XY Data to my map the lat and lon coordinates are projected in meters for example (-82, 34) becomes -82 meters and 34 meters. The GCS and Projection for both the new layer and the data frame are WGS 1984 Web Mercator (auxiliary sphere). Why is the new layer's XY Data displaying incorrectly? | I am trying to import an Excel table with around 10,000 XY points into ArcMap 10.2. I am using State Plane NAD 1983 Alabama West (US Feet) as the Coordinate System in my map data frame. I then add the excel table to my Table of Contents in the map then select Display XY Data. Next, I choose the same coordinate system as my data frame then select OK. Once it creates the Events layer for my points they are not in the correct location. Should I switch everything in my map to WGS84 or is there a solution for my points to be added onto my current map with the state plane coordinate system? | Earlier this year, I read a huge (1000+ pg) novel about a cloistered group of monks that sort of worshiped math. The story involved alternate dimensions or time travel or something. A big thing in the book was coordinates, containing data like for speed or something as an alternate axis. There were several rings of the monastery. At some point the main character was called out of his monastery to never return. I'm hoping to figure out the name so I can discuss it with my book club. | eng_Latn | 29,163 |
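That symptom is what happens when longitude/latitude values are declared as Web Mercator: the numbers are read as metres from the origin, so every point lands a few dozen metres from the projection origin in the Gulf of Guinea. Declaring the XY layer as GCS WGS 1984 (EPSG:4326) and letting the software project it on the fly is the usual fix. A quick pyproj check of what the metre values for (-82, 34) should look like:

```python
from pyproj import Transformer

t = Transformer.from_crs("EPSG:4326", "EPSG:3857", always_xy=True)
print(t.transform(-82, 34))   # roughly (-9.1e6, 4.0e6) metres, nothing like (-82, 34)
```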
I need to make this head smoother. I am already using subdivion surface 3 on whole object. I found here and try select head and use shade smooth. Head stil had that visible lines and just not look smooth. Here is file, I hope I upload it right. | When I create a UV sphere and add the subdivision surface modifier, the following wrinkle appears. How can I avoid it? I've read the article but there seems to be no clear solution. UPDATE 1 I've used the word 'apply' in the first sentence of the initial version of this question. It's misleading since I want to preserve the modifier without 'applying' it. Rather, what I meant is 'adding' the modifier. UPDATE 2 I followed josh sanfelici's solution, but still there is an artifact at the pole. Here is the screen: (A rendered view with a sun lamp) Maybe it's the limitation of the general-purpose subdivision modifier. BTW, when the cast modifier is added after the subdivision modifier, the sphere becomes perfect (even for my original wrinkled sphere). But since I want to edit the resulting sphere (for example, scaling X, Y, etc) while modifier is added but not applied, it's not usable. | Here is my player code. Rigidbody rb; Vector3 currMovement; public float jumpSpeed = 10; public float moveSpeed = 10; public float rotSpeed = 180; float distToGround; public float posSmoothTime = 0.5f; float currPos; float targetPos; float posVel; public float rotSmoothTime = 0.5f; float currRot; float targetRot; float rotVel; void Start() { rb = GetComponent<Rigidbody>(); distToGround = GetComponent<Collider>().bounds.extents.y; } bool isGrounded() { return Physics.Raycast(transform.position, Vector3.down, distToGround + 0.1f); } void Update() { Move(); } void Move() { // Rotation smoothing. targetRot = Input.GetAxisRaw("Horizontal") * rotSpeed * Time.smoothDeltaTime; if (targetRot > 360) targetRot -= 360; if (targetRot < 0) targetRot += 360; currRot = Mathf.SmoothDampAngle(currRot, targetRot, ref rotVel, rotSmoothTime * Time.smoothDeltaTime); transform.eulerAngles += new Vector3(0, currRot, 0); // Movement smoothing. targetPos = Input.GetAxisRaw("Vertical") * moveSpeed; currPos = Mathf.SmoothDamp(currPos, targetPos, ref posVel, posSmoothTime * Time.smoothDeltaTime); currMovement = new Vector3(0, 0, currPos); currMovement = transform.rotation * currMovement; if (isGrounded()) { if (Input.GetButtonDown("Jump")) rb.velocity += Vector3.up * jumpSpeed; } rb.position += currMovement * Time.smoothDeltaTime; } I have a Rigidbody attached to my player. I think the problem is with my camera script. Here is my camera script. public Transform player; Quaternion targetLook; Vector3 targetMove; public float smoothLook = 0.5f; public float smoothMove = 0.5f; public float distFromPlayer = 5, heightFromPlayer = 3; Vector3 moveVel; void LateUpdate() { CameraMove(); } void CameraMove() { targetMove = player.position + (player.rotation * new Vector3(0, heightFromPlayer, -distFromPlayer)); transform.position = Vector3.SmoothDamp(transform.position, targetMove, ref moveVel, smoothMove); targetLook = Quaternion.LookRotation(player.position - transform.position); transform.rotation = Quaternion.Slerp(transform.rotation, targetLook, smoothLook); } } The player is not an parent of my camera. When I parent the player to my camera the shake stops. But I want a custom smooth camera movement with my custom scirpt, so I can't make the player a parent of the camera. | eng_Latn | 29,164 |
Here is my player code. Rigidbody rb; Vector3 currMovement; public float jumpSpeed = 10; public float moveSpeed = 10; public float rotSpeed = 180; float distToGround; public float smoothTimePos = 0.25f; float currPos; float targetPos; float posVel; public float smoothTimeRot = 0.25f; float currRot; float targetRot; float rotVel; void Start() { rb = GetComponent<Rigidbody>(); distToGround = GetComponent<Collider>().bounds.extents.y; } bool isGrounded() { return Physics.Raycast(transform.position, Vector3.down, distToGround + 0.1f); } void Update() { targetRot = Input.GetAxisRaw("Horizontal") * rotSpeed * Time.smoothDeltaTime; targetPos = Input.GetAxisRaw("Vertical") * moveSpeed; } void FixedUpdate() { Move(); } void Move() { if (isGrounded()) { if (Input.GetButtonDown("Jump")) rb.velocity += Vector3.up * jumpSpeed; } // Rotation smoothing. if (targetRot > 360) targetRot -= 360; if (targetRot < 0) targetRot += 360; currRot = Mathf.SmoothDampAngle(currRot, targetRot, ref rotVel, smoothTimeRot); transform.eulerAngles += new Vector3(0, currRot, 0); // Movement smoothing. currPos = Mathf.SmoothDamp(currPos, targetPos, ref posVel, smoothTimePos); currMovement = new Vector3(0, 0, currPos); currMovement = transform.rotation * currMovement; rb.position += currMovement * Time.smoothDeltaTime; } I have a Rigidbody attached to my player. I think the problem is with my camera script. Here is my camera script. public Rigidbody player; Quaternion targetLook; Vector3 targetMove, targetMoveRaycast, targetMoveUse; public float smoothLook = 8, smoothMove = 0.25f; public float distFromPlayer = 5, heightFromPlayer = 3; Vector3 moveVel; void LateUpdate() { CameraMove(); } void CameraMove() { targetMove = player.position + (player.rotation * new Vector3(0, heightFromPlayer, -distFromPlayer)); RaycastHit hit; if (Physics.Raycast(player.position, targetMove - player.position, out hit, Vector3.Distance(player.position, transform.position))) { targetMoveRaycast = hit.point; targetMoveUse = targetMoveRaycast; } else { targetMoveUse = targetMove; } transform.position = Vector3.SmoothDamp(transform.position, targetMoveUse, ref moveVel, smoothMove); targetLook = Quaternion.LookRotation(player.position - transform.position); transform.rotation = Quaternion.Slerp(transform.rotation, targetLook, smoothLook * Time.smoothDeltaTime); } The player is not an parent of my camera. When I parent the player to my camera the shake stops. But I want a custom smooth camera movement with my custom scirpt, so I can't make the player a parent of the camera. | Here is my player code. Rigidbody rb; Vector3 currMovement; public float jumpSpeed = 10; public float moveSpeed = 10; public float rotSpeed = 180; float distToGround; public float posSmoothTime = 0.5f; float currPos; float targetPos; float posVel; public float rotSmoothTime = 0.5f; float currRot; float targetRot; float rotVel; void Start() { rb = GetComponent<Rigidbody>(); distToGround = GetComponent<Collider>().bounds.extents.y; } bool isGrounded() { return Physics.Raycast(transform.position, Vector3.down, distToGround + 0.1f); } void Update() { Move(); } void Move() { // Rotation smoothing. targetRot = Input.GetAxisRaw("Horizontal") * rotSpeed * Time.smoothDeltaTime; if (targetRot > 360) targetRot -= 360; if (targetRot < 0) targetRot += 360; currRot = Mathf.SmoothDampAngle(currRot, targetRot, ref rotVel, rotSmoothTime * Time.smoothDeltaTime); transform.eulerAngles += new Vector3(0, currRot, 0); // Movement smoothing. 
targetPos = Input.GetAxisRaw("Vertical") * moveSpeed; currPos = Mathf.SmoothDamp(currPos, targetPos, ref posVel, posSmoothTime * Time.smoothDeltaTime); currMovement = new Vector3(0, 0, currPos); currMovement = transform.rotation * currMovement; if (isGrounded()) { if (Input.GetButtonDown("Jump")) rb.velocity += Vector3.up * jumpSpeed; } rb.position += currMovement * Time.smoothDeltaTime; } I have a Rigidbody attached to my player. I think the problem is with my camera script. Here is my camera script. public Transform player; Quaternion targetLook; Vector3 targetMove; public float smoothLook = 0.5f; public float smoothMove = 0.5f; public float distFromPlayer = 5, heightFromPlayer = 3; Vector3 moveVel; void LateUpdate() { CameraMove(); } void CameraMove() { targetMove = player.position + (player.rotation * new Vector3(0, heightFromPlayer, -distFromPlayer)); transform.position = Vector3.SmoothDamp(transform.position, targetMove, ref moveVel, smoothMove); targetLook = Quaternion.LookRotation(player.position - transform.position); transform.rotation = Quaternion.Slerp(transform.rotation, targetLook, smoothLook); } } The player is not an parent of my camera. When I parent the player to my camera the shake stops. But I want a custom smooth camera movement with my custom scirpt, so I can't make the player a parent of the camera. | Here is my player code. Rigidbody rb; Vector3 currMovement; public float jumpSpeed = 10; public float moveSpeed = 10; public float rotSpeed = 180; float distToGround; public float posSmoothTime = 0.5f; float currPos; float targetPos; float posVel; public float rotSmoothTime = 0.5f; float currRot; float targetRot; float rotVel; void Start() { rb = GetComponent<Rigidbody>(); distToGround = GetComponent<Collider>().bounds.extents.y; } bool isGrounded() { return Physics.Raycast(transform.position, Vector3.down, distToGround + 0.1f); } void Update() { Move(); } void Move() { // Rotation smoothing. targetRot = Input.GetAxisRaw("Horizontal") * rotSpeed * Time.smoothDeltaTime; if (targetRot > 360) targetRot -= 360; if (targetRot < 0) targetRot += 360; currRot = Mathf.SmoothDampAngle(currRot, targetRot, ref rotVel, rotSmoothTime * Time.smoothDeltaTime); transform.eulerAngles += new Vector3(0, currRot, 0); // Movement smoothing. targetPos = Input.GetAxisRaw("Vertical") * moveSpeed; currPos = Mathf.SmoothDamp(currPos, targetPos, ref posVel, posSmoothTime * Time.smoothDeltaTime); currMovement = new Vector3(0, 0, currPos); currMovement = transform.rotation * currMovement; if (isGrounded()) { if (Input.GetButtonDown("Jump")) rb.velocity += Vector3.up * jumpSpeed; } rb.position += currMovement * Time.smoothDeltaTime; } I have a Rigidbody attached to my player. I think the problem is with my camera script. Here is my camera script. public Transform player; Quaternion targetLook; Vector3 targetMove; public float smoothLook = 0.5f; public float smoothMove = 0.5f; public float distFromPlayer = 5, heightFromPlayer = 3; Vector3 moveVel; void LateUpdate() { CameraMove(); } void CameraMove() { targetMove = player.position + (player.rotation * new Vector3(0, heightFromPlayer, -distFromPlayer)); transform.position = Vector3.SmoothDamp(transform.position, targetMove, ref moveVel, smoothMove); targetLook = Quaternion.LookRotation(player.position - transform.position); transform.rotation = Quaternion.Slerp(transform.rotation, targetLook, smoothLook); } } The player is not an parent of my camera. When I parent the player to my camera the shake stops. 
But I want a custom smooth camera movement with my custom scirpt, so I can't make the player a parent of the camera. | eng_Latn | 29,165 |
I am converting between coordinate systems where Z is the vertical axis in one, and Y in another. I have the rotation values represented as a quaternion. I see this related issue but it's switching the handedness (DirectX and OpenGL): How would I go about swapping these values? I'd prefer not to convert to a vector3, modify and convert back again. | I have a quaternion, coming from a system with the following:(intertial sensor) Right handed. Forward direction: Y axis Right direction: X axis Up direction: Z axis I need to convert this into a coordinate system that is: (Unreal engine) left-handed. Forward direction: X axis Right direction: Y axis Up direction: Z axis I have tried negating the axis, and angle, I have tried switching values, i cannot get this to work. All help greatly appreciated! I am working in C#, with Microsoft.Xna.Quaternion. | OSVR has a right-handed system: x is right, y is up, z is near. I need to convert orientation data from a sensor in OSVR to a different right-handed system in which x is forward, y is left, and z is up. I need to calculate the transformation that will convert a quaternion from one system to the other. I naively tried: void osvrPoseCallback(const geometry_msgs::PoseStampedConstPtr &msg) { // osvr to ros geometry_msgs::PoseStamped outputMsg; outputMsg.header = msg->header; outputMsg.pose.orientation.x = msg->pose.orientation.y; outputMsg.pose.orientation.y = msg->pose.orientation.z; outputMsg.pose.orientation.z = msg->pose.orientation.x; outputMsg.pose.orientation.w = msg->pose.orientation.w; osvrPosePub.publish(outputMsg); ros::spinOnce(); } But that made things really weird. Facing north pitch is pitch, facing west, pitch is yaw, facing south, pitch is negative pitch... How can I convert my OSVR quaternion to the a corresponding quaternion in the new coordinate system? | eng_Latn | 29,166 |
What is the Quickest method for determining which postcode is 2 miles north of original postcode? For instant L17. I know how to determine the distance between 2 points as below, but the above formulae eludes me. Code: public static final double PI = 3.14159265; public static final double deg2radians = PI/180.0; double lat1 = latitude1 * deg2radians; double lat2 = latitude2 * deg2radians; double lon1 = longitude1 * deg2radians; double lon2 = longitude2 * deg2radians; radd=2*Math.asin(Math.sqrt(Math.pow(Math.sin((lat1-lat2)/2),2.0) + Math.cos(lat1)*Math.cos(lat2)*Math.pow(Math.sin((lon1-lon2)/2) ,2.0))); | I am wanting to find a latitude and longitude point given a bearing, a distance, and a starting latitude and longitude. This appears to be the opposite of this question (). I have already looked into the haversine formula and think it's approximation of the world is probably close enough. I am assuming that I need to solve the haversine formula for my unknown lat/long, is this correct? Are there any good websites that talk about this sort of thing? It seems like it would be common, but my googling has only turned up questions similar to the one above. What I am really looking for is just a formula for this. I'd like to give it a starting lat/lng, a bearing, and a distance (miles or kilometers) and I would like to get out of it a lat/lng pair that represent where one would have ended up had they traveled along that route. | I'm not asking the diagonal values, but the length and breadth ones of the Nexus 7. And only of the LCD screen, not the entire hardware. | eng_Latn | 29,167 |
Contract a mesh (in order to make a mesh parallel to the original one) | How to extrude and scale with an even offset? | There are no interfaces on which a capture can be done | eng_Latn | 29,168 |
How do I calculate the difference of two angle measures? | How can I find the difference between two angles? | How can I find the difference between two angles? | eng_Latn | 29,169 |
Slope layer with false results in QGIS? | Getting incorrect slope values from an ASTER DEM in QGIS? | Why are my objects in all layers? | eng_Latn | 29,170 |
Calculate new Lat/Lon from initial Lat/Lon plus Cartesian X,Y | Algorithm for offsetting a latitude/longitude by some amount of meters | Lat Long to X Y Z position in JS .. not working | eng_Latn | 29,171 |
Projecting Microstation .dgn that has no coordinate system | Identifying Coordinate System of Shapefile when Unknown? | Defining the appropriate Projected Coordinate System | eng_Latn | 29,172 |
Blender: Strange clipping planes when rotating the view I am experiencing strange clipping behavior that is shown in the gif below. Why are my grid and cube being clipped when I rotate my view about the Y Axis? Everything seems normal when my view is aligned with the +/- Z Axis. For what it's worth, I've changed my Units to Metric, and my unit scale is .001. For a higher quality gif, I have one at this . | Why does part of my model disappear when I zoom in on it in the 3D Viewport? I'm having an odd issue - When I try to zoom in on a detail using my MMB, the "camera" in the 3D view seems to be cutting off the exterior surface of my object and showing me the inside instead. I'd expect this behavior, but only when I was much more zoomed in. If I zoom out a bit (one click on the scroll wheel) then I can see the exterior as I intend. Is there any way to adjust this behavior? Thanks! | How to perfectly align UV coordinates automatically? I have this setup: The mesh is only the visible top side, with a mirror modifier for the bottom part. Now I want to unwrap it like in the picture. I want the inner circle (red) to go to the top of the UV map... The problem is that no matter what automatic unwrapping I use, it never works as intended. The closest I've gotten was by using "follow quads", but that skewed the UV map a lot: I used the result from the second image and manually adjusted the vertices with a lot of scaling and "align auto", but its very far from perfect as you can see in the first image. Here's what I want exactly: The uv vertices should be distributed evenly (horizontally) The middle line (yellow) should be on a "natural" y height (so there's no stretching) I hope it's clear what I want to achieve. I'm sure there must be some automatic way to perfectly align and distribute the UVs in this way. By the way: If you look closely you can see that I already placed a uv seam (aligned with the green y axis in the screenshot) but that didn't really help. | eng_Latn | 29,173 |
Why does flexbox center element properly while positioning properties does not? I am trying to center a div element. It seems in my code that all is right but the element is not being centered properly Where did I wrong? If i try to center it by using flexbox then it is centered properly. Where is the wrong of positioning property? * { margin: 0; padding: 0; box-sizing: border-box; } .container { width: 200px; height: 39px; border: 1px solid red; position: relative; } .cntr { position: absolute; left: 50%; top: 50%; transform: translate(-50%, -50%); background-color: blue; width: 35px; height: 35px; } <div class='container'> <div class='cntr'> </div> </div> | How can I vertically center a "div" element for all browsers using CSS? I want to center a div vertically with CSS. I don't want tables or JavaScript, but only pure CSS. I found some solutions, but all of them are missing Internet Explorer 6 support. <body> <div>Div to be aligned vertically</div> </body> How can I center a div vertically in all major browsers, including Internet Explorer 6? | Local Coordinate to Geocentric The ultimate question, I need a transformation matrix to take a point in local space representing a roughly 500m x 500m place in New Mexico centered at -108.619987456 long 36.234600064 lat. The final output needs to be in geocentric coordinates and I can during initialization get any number of points on the map such as the corners, center, western most along the center etc. During run time I would like to have a transformation matrix that can be applied to a position in local space expressed in meters and get back an approximate GCC coordinate. The center of the point in GCC: -1645380 -4885138 3752889 The bottom left corner of the map in GCC: -1644552, -4881054, 3749220 During run time I need to be able to multiply (-250, -250, 0) with the transformation matrix and get something close to (-1644552, -4881054, 3749220). I've spent more than a week trying to research a solution for this and so far nothing. Given an identity matrix as our starting position and orientation. Then using geotrans and the known lat, long, and height of the starting position I get an x,y,z geocentric coordinate. A vector from the origin of (0,0,0) gives both the up and translation for the matrix. However, I need the forward and right so that I can pass a distance in meters from the origin into the transformation matrix and get a roughly accurate GCC. Do I have all of the inputs I need to calculate the right and forward? Inputs Origin: 0, 0, 0 Global: -1645380, -4885138, 3752889 Up (Normalized): Global - Origin Desired Outputs Right: ? ? ? Forward: ? ? ? So with the right and forward added to the up and translation I already calculated I would have a transformation matrix. I could then apply the matrix to a vector of say (50, 50, 0) and get something within about 0-3 cm of the output if I feed the lat/long back into geotrans. This matrix would only ever be used small maps of about 500m x 500m so the curvature of the earth is negligible. In reply to whuber, I don't know your level of experience so I will start with the very basics. An identity matrix is a 4x4 matrix that you can multiply by a vector and it returns the vector unchanged. Such as below. 1 0 0 x=0 0 1 0 y=0 0 0 1 z=0 0 0 0 1 The x, y, and z of the matrix are how much to translate the vector by and the first 3 rows and columns are the rotation. Below is a tranformation matrix that does nothing to the orientation of a vector but does translate. 
This is what you would want for two axis aligned worlds in different locations. 1 0 0 13 0 1 0 50 0 0 1 -7 0 0 0 1 If you were to multiply a vector of (10, 10, 10) with the above transformation matrix you would get an output of (23, 60, 3). My problem is the axes are not aligned and the "plane" of the world I am trying to get the local coordinate converted to is projected on the ellipsoid of the earth. Geotrans is library that you can use to pass a coordinate from one system such as geodetic or GDC (lat, long, height) and get back another such as geocentric or GCC (x,y,z). For example: If my game world was representing a map centered exactly on the prime meridian and equator the transformation matrix would look something like below. I am just guesstimating the Y and Z rotations and might have them flipped and/or the wrong sign. 0 0 1 6378137 0 1 0 0 1 0 0 0 0 0 0 1 | eng_Latn | 29,174 |
How to know the real size of grid? Im working with QGIS. When dividing a map with a grid I set in the paramters X=0.1 and Y=0.1. Could someone please explain what does that mean? I mean would the map be divided into 100mx100m squares? How do I know the exact measure in meters of each side of the square? Maybe its a silly question but its not that clear to me. Any help would be appreciated! Thanks! | Determining size of grid cells created by vector grid tool of QGIS? I am working with QGIS 1.8.0 and Im making a vector grid on a map. Does someone know how big are the squares if the grid settings look like in the picture? | Generating a DEM and DSM from correct LiDAR point classification I have downloaded classified LAS data from I am trying to generate accurate DEM and DSM for further analysis. I created a LAS dataset with a projected CRS of NAD_1983_UTM_Zone_19N in meters and the z CRS in meters as well. My question is on which classifications do I choose for the DEM and DSM. Here is the filter options I have in my dataset: According to unh LiDAR data report class 2 is ground. What I have done so far: DEM I chose the class 2 points and used the las dataset to raster. DSM In the predefined settings I chose first return and used the las dataset to raster tool. Is this an accurate way of generating these two rasters? Do I not have to take into account the the unassigned class 1, noise class 7, reserved 11, reserved 17, reserved 18? Additionally (I can ask this as a separate question if it gets requested to), when using the las dataset to raster tool the sampling value is defaulted to 10. I would like to change it to 2 or 3 (would be meters) to match the LiDAR resolution. I also plan on using the generated DEM and DSM for a least cost path analysis and I want them to be higher resolution. Will making the sampling_value 2 or 3 throw off the results or should I leave it at 10? | eng_Latn | 29,175 |
Can I check to see if an object is used in another objects modifier? I have a complex file with a lot of objects in it. Many of them are there as a target for a modifier of some kind Boolean, curve modifier, etc. Sometimes I go through and clean out objects that are no longer needed. The problem is that it isn't always apparent if the object is being used in another object's modifier stack. I don't want to break my work only to find out a day later that I deleted something important. Is there some way that I can quickly see if the object is being used? | Selecting all objects that have been specified in Modifiers Could anyone help in making a script for selecting all objects that are specified in the different modifiers that an object has stacked? For example if my object has a Boolean modifier and an Array modifier, which both refer to an empty or another object (in the Array modifier there is for example an Object offset you can specify). Would it be possible to select all those object with one click? Thanks. | How can I rotate an object based on another's offset to it? I have a 3D model of a turret that con rotate around the Y-axis. This turret has a cannon that is significantly off the center of the object. I want the cannon, not the turret, to aim at a specified target. I can only rotate the turret, however, and thus I don't know what equation I need to apply in order to accomplish by objective. The following image illustrates my problem: If I have the turret "LookAt()" the target, a laser originating from the cannon will completely miss said target. If this were a completely top-down scenario, and the cannon were exactly parallel to the turret, then my logic tells me that the fake target should be located at a position that's equal to the actual target plus an offset that's equal to that between the turret and the cannon. However, in my actual scenario, my camera is angled 60º, and the cannon has a slight rotation. The following image illustrates the scenario: I'm not sure exactly why, but if I apply that same offset, it only seems to work while aiming at certain distances from the turret. Is my logic flawed? Am I missing something fundamental here? Final Edit: the solution provided by @JohnHamilton latest update solves this problem with perfect precision. I have now removed the code and images that I used to illustrate my incorrect implementations. | eng_Latn | 29,176 |
Using vintage lens for Canon EOS 450D I would like to ask for a help. I've recently fell in love with analog photography. My grandfather gave me these cameras: Werra with Tessar (2,8/50) Lens and Praktica PL nova 1 with Oreston 1,8/50 lens. I tried to take photos with them but I found out that analog photography is an expensive business these days. So I wondered if it is possible to attach these lenses to my Canon EOS 450D camera. I found out that they are selling Tessar with canon special eos bayonette on ebay, but I would rather buy only the bayonette/adapter and mod it by myself. Could anyone please tell me which types (or models) of adapters would suit for my Canon EOS / vintage lenses? | Can I use lens brand X on interchangeable lens camera brand Y? I've got some lenses of brand X, but I've just got a new camera of brand Y. How can I tell if it's possible to use my lenses on my camera? I accept that I may lose a lot of automatic functionality like autofocus, aperture control and metering when using an adapter to connect the two. | Convert quaternion to a different coordinate system OSVR has a right-handed system: x is right, y is up, z is near. I need to convert orientation data from a sensor in OSVR to a different right-handed system in which x is forward, y is left, and z is up. I need to calculate the transformation that will convert a quaternion from one system to the other. I naively tried: void osvrPoseCallback(const geometry_msgs::PoseStampedConstPtr &msg) { // osvr to ros geometry_msgs::PoseStamped outputMsg; outputMsg.header = msg->header; outputMsg.pose.orientation.x = msg->pose.orientation.y; outputMsg.pose.orientation.y = msg->pose.orientation.z; outputMsg.pose.orientation.z = msg->pose.orientation.x; outputMsg.pose.orientation.w = msg->pose.orientation.w; osvrPosePub.publish(outputMsg); ros::spinOnce(); } But that made things really weird. Facing north pitch is pitch, facing west, pitch is yaw, facing south, pitch is negative pitch... How can I convert my OSVR quaternion to the a corresponding quaternion in the new coordinate system? | eng_Latn | 29,177 |
Article after higher / high + noun I know that there are similar questions here, but I still don't understand everything. I wrote 4 almost identical sentences below with a place for an article marked as "...". Please tell me what are correct articles here. I know that these sentences can be rewritten without articles and with keeping the same sense, but I am interested in these particular examples. It is not a homework or something like that by the way. We must keep it at ... higher angle. We must keep it at ... high angle. You must drive at ... high speed. You must drive at ... higher speed. | Are there any simple rules for choosing the definite vs. indefinite (vs. none) article? I can’t for the life of me figure out where to use a and where to use the — and where there is no article at all. Is there a simple rule of thumb to memorize? The standard rule you always hear: “If a person knows which item you are talking about then use "the" . . . doesn’t clear things up for me, as I have no idea whether or not they know. | How can I stitch a panorama correctly if I moved the camera along the horizontal axis? Here in Argentina, we have . All the houses and walls on that street have some kind of mosaic stuck to it, and it's very cool. It was made by a local artist . Because to this piece of urban art is two blocks long, I've decided to make a panorama of it, by moving myself on an horizontal axis while taking photos. I mean, I took one photo, walked one step deeper along the street, took another photo, and so on. When I tried to stitch it in AutoPano, the following deformed thing came out: () And the other side of the block: () After this, I've learned about parallax error and why you have to avoid moving when making panoramas. I mean, there are a lot of connection errors on both images. Especially in the second one, the part with the corner is quite problematic to stitch because to as I moved, the perspective of the view changed a lot. So, is there any way to stitch this kind of panorama correctly? Would this only work on plain walls? | eng_Latn | 29,178 |
How to add faces to a cylinder top? (cookie) I am an absolute beginner, and while completing the donut tutorial I have been trying to model this industrial cookie just for fun. I have modeled the top cookie as a cylinder, beveled the edge and extruded it to create the border, and added a couple of loop cuts to sharpen rounded angles, and I am at this stage: Now I would like to sculpt the "shield" (and then carve holes and add text), but the center face has not enough resolution. How can I add enough poligons into that face so that sculpt works? | Fill cylinder cap with quads Is there any way of converting the triangle fan found on a cylinder cap into quads. | How can I rotate an object based on another's offset to it? I have a 3D model of a turret that con rotate around the Y-axis. This turret has a cannon that is significantly off the center of the object. I want the cannon, not the turret, to aim at a specified target. I can only rotate the turret, however, and thus I don't know what equation I need to apply in order to accomplish by objective. The following image illustrates my problem: If I have the turret "LookAt()" the target, a laser originating from the cannon will completely miss said target. If this were a completely top-down scenario, and the cannon were exactly parallel to the turret, then my logic tells me that the fake target should be located at a position that's equal to the actual target plus an offset that's equal to that between the turret and the cannon. However, in my actual scenario, my camera is angled 60º, and the cannon has a slight rotation. The following image illustrates the scenario: I'm not sure exactly why, but if I apply that same offset, it only seems to work while aiming at certain distances from the turret. Is my logic flawed? Am I missing something fundamental here? Final Edit: the solution provided by @JohnHamilton latest update solves this problem with perfect precision. I have now removed the code and images that I used to illustrate my incorrect implementations. | eng_Latn | 29,179 |
Leg Press & Actual Lifted Weight I was doing leg press at the gym today and was curious how much weight I actually lift when I do the exercise as compared to when I do a squat. Suppose I load $w_L$ onto the machine, which has an angle of elevation $\theta$. I know that the actual weight I lift vertically is $w_L \sin(\theta)$, but could someone give me an explanation why this is so? My intuition is to appeal to simple trigonometry, which tells me the vertical side of the triangle is $w_L \sin(\theta)$. | Vertical component of moving weight at a 45 degree angle Here's an easier one. I use the leg press machine at the gym so I don't have to worrying about hurting myself while lifting heavier weight. The weight glides on a track that looks to be 45 degrees. What's the equation to figure out how much weight I would be able to squat normally. IE the vertical component of moving 400lbs at a 45 degree angle. | How can I rotate an object based on another's offset to it? I have a 3D model of a turret that con rotate around the Y-axis. This turret has a cannon that is significantly off the center of the object. I want the cannon, not the turret, to aim at a specified target. I can only rotate the turret, however, and thus I don't know what equation I need to apply in order to accomplish by objective. The following image illustrates my problem: If I have the turret "LookAt()" the target, a laser originating from the cannon will completely miss said target. If this were a completely top-down scenario, and the cannon were exactly parallel to the turret, then my logic tells me that the fake target should be located at a position that's equal to the actual target plus an offset that's equal to that between the turret and the cannon. However, in my actual scenario, my camera is angled 60º, and the cannon has a slight rotation. The following image illustrates the scenario: I'm not sure exactly why, but if I apply that same offset, it only seems to work while aiming at certain distances from the turret. Is my logic flawed? Am I missing something fundamental here? Final Edit: the solution provided by @JohnHamilton latest update solves this problem with perfect precision. I have now removed the code and images that I used to illustrate my incorrect implementations. | eng_Latn | 29,180 |
Converting lat-lon from degrees and decimal minutes to decimal degrees in PostGIS Creating a geometry in Postgis is fairly simple: ST_SetSRID(ST_MakePoint(longitude,latitude),4326); These lat/lon should be in decimal degrees format like 33.02505 -96.70668 Is there a way/function to enter them in the degrees and decimal minutes format provided by NMEA $GPRMC sentences like 3301.5032 09642.4010? Of course you can convert them in php or nodejs; but it would be easier (faster) to have it natively in Postgis. | Converting Degree Minutes to decimal degrees using Excel formula? I have location points collected from a garmin device stored in an excel sheet in Degree Minutes format --- W00208.172,N1046.977. I am looking for an Excel formula to convert either to Decimal Degrees or Degrees Minutes seconds Format ? | Calculating extent of custom tiles? Problem The picture is just an example of my case, but I am left with blank space on the left when I load all the tiles and I get error message in the console: Failed to load resource: the server responded with a status of 404 (Not Found) http://path/to/the/tile//base/8/6/53.png etc. I could guess and limit the extent of the layer, but I would really like to know is there any way to calculate it and also be able to get rid of the error message. What I know about the tiles? Originally they are served as TMS They represent png images 256*256 They are in projection: EPSG:3857 In OpenLayers2 they were called like TMS layer In OpenLayers3 (v.3.8.2) they are called like XYZ layer (that includes -y): var baseLayer = new ol.layer.Tile({ source: new ol.source.XYZ({ url: 'http://path/to/the/tiles/base/{z}/{x}/{-y}.png' }) }); Thoughts and questions How can I calculate extent of the tiles? Or in another words how can I find out zmin zmax xmin ymin xmax ymax or whole range of x/y? What does x/y really represent? Are they coordinates of the left bottom corner or center of the tile? In which direction does x/y increase? How can I figure out x/y of at least one of the tile, for example left bottom one? (I have modified the question for better understanding of the problem) | eng_Latn | 29,181 |
Which .las classification codes and returns to use when creating DTM and DSM? I am trying to figure out what classification codes and returns that I should use to create DTM and DSM? In my layer --> "BrooksCamp.lasd" I have these classification codes and returns: (and then 26 to 31 reserved) I am using the LAS Dataset to Raster tool. | Generating a DEM and DSM from correct LiDAR point classification I have downloaded classified LAS data from I am trying to generate accurate DEM and DSM for further analysis. I created a LAS dataset with a projected CRS of NAD_1983_UTM_Zone_19N in meters and the z CRS in meters as well. My question is on which classifications do I choose for the DEM and DSM. Here is the filter options I have in my dataset: According to unh LiDAR data report class 2 is ground. What I have done so far: DEM I chose the class 2 points and used the las dataset to raster. DSM In the predefined settings I chose first return and used the las dataset to raster tool. Is this an accurate way of generating these two rasters? Do I not have to take into account the the unassigned class 1, noise class 7, reserved 11, reserved 17, reserved 18? Additionally (I can ask this as a separate question if it gets requested to), when using the las dataset to raster tool the sampling value is defaulted to 10. I would like to change it to 2 or 3 (would be meters) to match the LiDAR resolution. I also plan on using the generated DEM and DSM for a least cost path analysis and I want them to be higher resolution. Will making the sampling_value 2 or 3 throw off the results or should I leave it at 10? | Determining coordinates of a SHP file i have a SHP file of a building converted from a DXF file. This SHP file is unprojected and have no specific coordinates (origin 0,0). I find WGS84 coordinates (lat, long) of this building from Google Maps. My aim is to replace these coordinates to the SHP file on QGIS. Could you explain the process step by step? (I read lots of similar questions, about affine transformation. But i don't know the paramaters.) What i ask is how to make this transformation in QGIS (step by step)? SHP file coordinates (not projected) Scale: 1:142.670.713 X1,Y1: 16.70824, 253.74534 X2,Y2: 1.03521,-6.34918 X3,Y3: 244.00248,-7.49189 Target WGS84 coordinates X1,Y1: 41.047773, 28.896241 X2,Y2: 41.045452, 28.895929 X3,Y3: 41.045436, 28.899039 I need a second vector layer with coordinates. I have just a SHP file of a building converted from a DXF file. That is not projected. I know where this building is, and can find the WGS84 coordinates from Google Maps. My aim is add WGS84 coordinates to this SHP file and export to replace the building on Google Maps. My Questions are: 1) I think my SHP file has measurements in meters and unprojected. I have WGS84 lat long coordinates from Google Maps. How can i project this SHP file, for being ready to transformation? Right Click -> Set Layer CRS -> Choose WGS84 EPSG: 4326. Is this enough? 2) qgsAffine plugin just need transformation matrix paramaters. But the answer of @Jochen Schwarze is a bit complex for me. Is there a online calculator? (Just give the source XYs and target XYs and calculate the parameters.) I think, I have a projection problem before affine transformation. I have a DXF file of a building from AutoCAD. I saved this DXF file as a SHP file in QGIS. But this SHP file is converted from a plain DXF file and is not referenced/projected, it is drawn from coordinates 0,0 in AutoCAD in meters. And the drawing has meters, WGS84 has decimal degrees. 
When I set Layer CRS as WGS84 in QGIS, the coordinates don't change. How can I project this SHP file as WGS84 in QGIS? | eng_Latn | 29,182 |
How can i project a coordinate system for many rasters all at once? | How can I use GDAL to batch define a projection? | Google map API does not plot on different continents | eng_Latn | 29,183 |
Mean Absolute Scaled Error | Interpretation of mean absolute scaled error (MASE) | Interval preserving transformations are linear in special relativity | eng_Latn | 29,184 |
The Lagrangian and inertial reference frames | Do we need inertial frames in Lagrangian mechanics? | Do we need inertial frames in Lagrangian mechanics? | eng_Latn | 29,185 |
I have always selected my font size by the eye or educated guess/trial and error but at this point I would like to know where I could find a chart or document with guidelines regarding what font size you should use for this print size considering the viewer will see it from this distance. While the suggested "" might have some nice details, as far as I saw from the question and answers, this only targets Street Signs while taking into account car speed limit. This is only a branch of what my question is about. I am interested in a more general way, a rule of thumbs of some sort that will help you get an idea on how to design something from A4 posters to outdoor billboards. | To avoid printing single letters at varying sizes and measuring how far away I can see it from a distance is there a formula for determining what size font for a particular read distance? Is it better to take the midpoint between the ascender line and x-height or to go purely off of the x-height (since going purely off the ascender line could leave lowercase letters difficult to read)? How does vertical height play into this? Would I need to then figure the hypotenuse distance instead of a straight distance? For example if I'm shooting for a 20' read distance but the sign is to be placed about 30' up should I use a distance of 35' instead of 20'? I'm shooting for a 20' read distance for the headline and then possibly a 10' read distance for additional content depending on how it all fits on the active area. | I installed Windows 7 fresh and installed SP1. Now, when I try to check manually for Windows Updates it just hangs on the Checking for updates screen. I tried running the tools in , but this did not fix the issue either: No matter what I do it just hangs on the "Checking for updates..." screen and goes no further. | eng_Latn | 29,186 |
meaning of axis in eyeglasses prescription | What does axis mean on my prescription? Answer: The axis number on your prescription tells your optician in which direction they must position any cylindrical power in your lenses (required for people with astigmatism). This number shows the orientation or angle in degrees from 1 to 180. The number 90 means vertical position and 180 horizontal. A higher number for the axis does not mean that your prescription is stronger - it simply describes the position of the astigmatism. | My glasses prescription is the following: sphere (-10.0), cylinder (-2.0), axis (176). For tax purposes, i need to know if this characterizes someone as legally blind. | eng_Latn | 29,187
what is the valid range of latitude values | Latitude measures how far north or south of the equator a place is located. The equator is situated at 0°, the North Pole at 90° north (or 90°, because a positive latitude implies north), and the South Pole at 90° south (or −90°). Latitude measurements range from 0° to (+/−)90°. Longitude measures how far east or west of the prime meridian a place is located. The prime meridian runs through Greenwich, England. Longitude measurements range from 0° to (+/−)180°. In your setter for latitude, check if the value being set falls between -90 and 90 degrees. If it doesn't, throw an exception. For your longitude, check to see if the value falls between -180 and 180 degrees. | N). Negative latitude values correspond to the geographic locations south of the Equator (abbrev. S). Longitude is counted from the prime meridian (IERS Reference Meridian for WGS 84) and varies from −180° to 180°. Positive longitude values correspond to the geographic locations east of the prime meridian (abbrev. E). | eng_Latn | 29,188
wide azimuth seismic | So-called Wide-Azimuth acquisition is increasingly used in an attempt to overcome such problems (Figure 1). Wide Azimuth seismic follows a very simple strategy - for each receiver layout a large range of short to long offset source positions are used over a large range of source-receiver azimuths. | Azimuth is the angle along the horizon, with zero degrees corresponding to North, and increasing in a clockwise fashion. Thus, 90 degrees is East, 180 degrees is South, and 270 degrees is West. Using these two angles, one can describe the apparent position of an object (such as the Sun at a given time). The altitude and azimuth values are for the center of the apparent disk of the Sun or Moon. The altitude values include the effect of standard atmospheric refraction when the object is above the horizon. | eng_Latn | 29,189
What is the foundation most often used when creating new antenna models? | Both the vertical and dipole antennas are simple in construction and relatively inexpensive. The dipole antenna, which is the basis for most antenna designs, is a balanced component, with equal but opposite voltages and currents applied at its two terminals through a balanced transmission line (or to a coaxial transmission line through a so-called balun). The vertical antenna, on the other hand, is a monopole antenna. It is typically connected to the inner conductor of a coaxial transmission line (or a matching network); the shield of the transmission line is connected to ground. In this way, the ground (or any large conductive surface) plays the role of the second conductor of a dipole, thereby forming a complete circuit. Since monopole antennas rely on a conductive ground, a so-called grounding structure may be employed to provide a better ground contact to the earth or which itself acts as a ground plane to perform that function regardless of (or in absence of) an actual contact with the earth. | However, the problem of deflection settings — 'aim-off' — required knowing the rate of change in the target's position. Both France and UK introduced tachymetric devices to track targets and produce vertical and horizontal deflection angles. The French Brocq system was electrical, the operator entered the target range and had displays at guns; it was used with their 75 mm. The British Wilson-Dalby gun director used a pair of trackers and mechanical tachymetry; the operator entered the fuse length, and deflection angles were read from the instruments. | eng_Latn | 29,190 |
Appropriate reference system for small-scale mapping of the Middle East | Defining the appropriate Projected Coordinate System | Review bans should escalate beyond 30 days | eng_Latn | 29,191 |
Who invented the periscope?!? | Periscope (From Wikipedia, the free encyclopedia)

[Image: Search and attack periscopes of a French-built Scorpène class submarine.] A periscope is an instrument for observation from a concealed position. In its simplest form it is a tube in each end of which are mirrors set parallel to each other and at an angle of 45° with respect to the line between them. It may be used as a toy or for seeing over people's heads in a crowd. This form of periscope, with the addition of a simple lens, was used for observation purposes in the trenches during World War I. More complex periscopes, using prisms instead of mirrors, and providing magnification, are used on submarines.

Early examples
[Image: Officer at periscope in control room of submarine.] Johann Gutenberg, better known for his contribution to printing technology, marketed a periscope in the 1430s to enable pilgrims to see over the heads of the crowd at the vigintennial religious festival at Aachen. Simon Lake used periscopes in his submarines in 1902. Sir Howard Grubb perfected it in World War I[1]. Morgan Robertson claims to have described a submarine using a periscope in his fictional works. Periscopes, in some cases fixed to rifles, were used in World War I to enable soldiers to see out of the trenches.

Periscopes were also extensively used in tanks, enabling drivers or tank's commanders to inspect the situation without leaving the safety of the tank. An important development was Gundlach's periscope allowing tank commander to obtain 360 degree view without moving the seats (pictured). The design was first used in the Polish 7-TP light tank. Shortly before the war it was given to the British and was used in most tanks of WWII, including the British Crusader, Churchill, Valentine, and Cromwell and the American Sherman. The design was later copied and used extensively in tanks of the USSR (including the T-34 and T-70) and Germany.

Naval use
[Image: Gundlach periscope.] A simple, fixed naval periscope using plane mirrors was built by the Frenchman Marie Davey in 1854. Thomas H. Doughty of the US Navy later invented a prismatic version for use in the American Civil War (circa 1891).

The invention of the collapsible periscope for use in submarine warfare is usually credited to Simon Lake in 1902, who called his device the omniscope or skalomniscope. There is also a report that an Italian, Triulzi, demonstrated such a device in 1901 calling it a cleptoscope.

Another early example of naval use of the periscope are the two adapted on the experimental French submarine Gymnote by the Captain Arthur Krebs in 1888 and 1889 (see in French : rbmn).

[Image: Diagram of periscope.] A modern submarine periscope incorporates lenses for magnification and functions as a telescope. It typically employs prisms and total internal reflection instead of mirrors, because prisms, which do not require coatings on the reflecting surface, are much more rugged than mirrors. It may have additional optical capabilities such as range finding and targeting. The mechanical systems of submarine periscopes are typically hydraulically powered and need to be quite sturdy to withstand the drag through water. The periscope chassis may also be used as to support a radio or radar antenna.

Submarines traditionally had two periscopes: a navigation periscope and a targeting, or commander's, periscope. These were originally mounted one forward of the other in the narrow hulls of diesel-electric submarines. In the much wider hulls of recent US Navy submarines, they are located side-by-side.

However, the most modern submarines no longer use periscopes. The United States Navy's Virginia-class submarines instead use photonics masts, which lift an electronic imaging sensor set above the water. Signals from the sensor set are transmitted electronically to workstations in the submarine's control center. While the cables carrying the signal must penetrate the submarine' | dont know but check out sites like google and yahoo. am sure you'll find something useful. as i am answerin your question give my answer, the best answer. you too will get your 5 pts back | eng_Latn | 29,192
figure pitch | A pair of numbers nearby indicate the pitch, such as 4 and 12. To figure a different pitch, or any roof pitch, use a method that's used by contractors and roof framers. After figuring a pitch, ask the contractor to install a mock rafter assembly with two opposing rafters. | Roof pitch or slope is a measure of roof steepness or incline, represented in inches rise of 12 inches run. For example a "3 pitch" or "3 in 12 pitch" or "3/12 pitch", all imply that the roof rises 3 inches, for every 12 inches of its horizontal run. | eng_Latn | 29,193
what is roof pitch angle | A roof pitch angle is the slope and inclination angle of a roof in large buildings or smaller residential homes. One of the best way to calculate roof pitch angles is to have a range at which the roof surface will make a horizontal plane. Therefore, roof pitch and degrees like thirty degrees, forty five degrees or sixty degrees, help in constructing a building roof pitch which is in a horizontal plane. | Rafter Length Calculator. Roof pitch refers to the amount of rise a roof has compared to the horizontal measurement of the roof called the run. To see how pitch impacts the look of a garage and changes cost click the design center button on our pole barn kits page. | eng_Latn | 29,194
how to calibrate the winplus type s digital compass | Google that question and that will bring up the compass website with the instructions. Basically just go to a large parking lot point your vehicle due North set the calibration switch on the back of the compass drive in a circle(same way) and complete the circle . The compass should sync. | Mine came right off when I pryed my nail under it. If the unit is faced down, I tugged up on the slot on the side. Or, try contacting the manufacturer. Their literature that came with the unit indicated they have good customer service. Sorry, wish I could help more. | eng_Latn | 29,195 |
I can't seem to get the center/power part level. This makes it hard to keep the heli steady when flying. | My son says that you need to make sure there is no hair in the propellers because that will put it off balance also. He says the battery needs to be perfectly in the middle, you may have to push it kind of hard. You can also use the buttons on the controller to adjust it. | Google that question and that will bring up the compass website with the instructions. Basically just go to a large parking lot point your vehicle due North set the calibration switch on the back of the compass drive in a circle(same way) and complete the circle . The compass should sync. | eng_Latn | 29,196 |
Difference between Geographic and Projected coordinate systems? What is the difference between a geographic coordinate system and a projected coordinate system? | Difference between projection and datum? What's the difference between a projection and a datum? | Calculate distance between two latitude-longitude points? (Haversine formula) How do I calculate the distance between two points specified by latitude and longitude? For clarification, I'd like the distance in kilometers; the points use the WGS84 system and I'd like to understand the relative accuracies of the approaches available. | eng_Latn | 29,197 |
Scaling so that the distances from the center are the same? when I scale a face, it is scaled propotionally. But how can I scale it, so that the two distances that I marked here: are the same? thx | Extruding surface while keeping the shape here is the picture of what I would like to get(example nr. 2). I need the edge frame to be same in width and length. Also the extrusion is on surface (without depth) I tried extruding region, but it did not give me the result I wanted. I know I could just adjust the scale and do it manually, but that would be eyeballing without achievign accuracy. I would like to know if it is possible to extrude or inset while keeping the shape (even edge space) | D&D 5e and "Theatre of the Mind" in combat Our target play style for combat encounters would be to use battle boards roughly half the time (when there is enough of interest in play that the tactics are fun to play out), and to skip that with faster "cinematic" combat otherwise. We have not been 100% successful. The problem is the numbers-driven descriptions in the rules. Needing to be "adjacent" (within 5 feet) for abilities such as Protection style fighting, ranged weapon ranges, spell effects and movement speeds are all much easier to resolve on a grid. Running those rules without a board rapidly devolves into sketches with numbers written on them and/or "mother, may I?" style of play because the DM needs to rule who is next to who, and which monsters are in the spell radius. These rulings also quickly get out of sync with player expectations when there is no map and we want to run combats quickly. It doesn't help that the numbers involved come in pretty much all multiples of the base 5 feet. There is no obvious close/short/medium for effects and ranges. If there were such then I think it would be much easier for the DM to categorise and hand-wave things. I was hoping that both board-based and no-board style of play would feel equally viable. But the focus on numbers seems to push toward board playing, at least for my group. We would still like to pay heed to the differences in movement, range, area-of-effect etc of the rules in play. At least in the general sense of creature X is faster than creature Y, or weapon/spell A has better range than weapon/spell B. Going beyond "just don't sweat the details", how can a group like mine play with the D&D 5e rules as written, without a map, but without leaning on the DM to resolve all our questions of "So, am I next to the Paladin?", or "Is it in range?" Clarifying: "just don't sweat the details" is a welcome approach to solving the problem, but please give some examples of how this works in practice; i.e. which details can be removed and how? | eng_Latn | 29,198 |
Does a ranged attack against an adjacent enemy still have disadvantage even if you have the Crossbow Expert feat? Part of the feat (PHB, p. 165) says: Being within 5 feet of a hostile creature doesn't impose disadvantage on your ranged attack rolls. This is relevant because of the rule on : Aiming a ranged attack is more difficult when a foe is next to you. When you make a ranged attack [...] you have disadvantage on the attack roll if you are within 5 feet of a hostile creature who can see you and who isn't incapacitated. My GM and some players have agreed that the benefit removes your disadvantage on attacks against distant targets (ones more than 5 feet away) while a foe is adjacent to you, but that targeting the adjacent foe still has disadvantage. Is this the correct ruling? | Crossbow Expert Feat... for Spells? I was thinking about creating a wizard character. Reviewing the spellcasting rules, I saw this: Most spells that require attack rolls involve ranged attacks. Remember that you have disadvantage on a ranged attack roll if you are within 5 feet of a hostile creature that can see you and that isn't incapacitated (see chapter 9). There are a lot of occasions that might arise that would make this a problem, or at least a bit of a hindrance. So as a D&D player normally would when they wanted to break a rule without actually cheating, I went to a list of feats and found Crossbow Expert: Thanks to extensive practice with the crossbow, you gain the following benefits: You ignore the loading quality of crossbows with which you are proficient. Being within 5 feet of a hostile creature doesn't impose disadvantage on your ranged attack rolls. When you use the Attack action and attack with a one handed weapon, you can use a bonus action to attack with a hand crossbow you are holding. I may be (am) overthinking things, but does this apply to magic attacks, such as cantrips like Eldritch Blast or Fire Bolt? The literal meaning of the feat says that I can, but it could also imply by the title and also the effects of the feat that this only applies to crossbows, or any other ranged weapon, excluding magic. The strongest evidence that it might not affect ability with ranged magic attacks is: Thanks to extensive practice with the crossbow With a crossbow, not with magic. | Limits of dead reckoning using MEMS sensors I'm trying to track body parts relative to a person's torso. I see quite a few questions about using MEMS accelerometers and gyros for dead reckoning, and they confirm my suspicions that various factors greatly limit their usefulness for these sorts of applications, but I'm seeking clarification of these limits: What exactly are these limits? Other answers have addressed why these limits exist. Naturally the specifications the parts in the system in question and what is considered "acceptable error" for the system will both change the exact limits, but is there a single order of magnitude in time, or distance that I can expect dead reckoning to work? I'm well aware that over long distances (a few yards or so) the error becomes too large for most practical purposes, but what about within a few feet? What can I do to improve these limits? I'm currently looking at using an accelerometer and a gyro. What other sensors can I add to the system to improve the error rate? I know over longer distances a GPS can be used, but I doubt any consumer electronics grade GPS has fine enough resolution to help in my case. 
Additionally, a general consensus seems to the only way to improve these limits past the point of improved sensors is to provide a reference not subject to error. Some systems solve this using cameras and markers. What kind of reference points can a portable/wearable device provide? I've seen the usage of radio waves to measure long distances accurately, but I can't tell if such a system could be accurate on such small scale (in terms of distance measured) using "off-the-shelf" components. | eng_Latn | 29,199 |