Columns: query (string, 1–13.4k chars), pos (string, 1–61k chars), neg (string, 1–63.9k chars), query_lang (147 language classes), __index_level_0__ (int64, 0–3.11M).

query | pos | neg | query_lang | __index_level_0__ |
---|---|---|---|---|
A Decoupling Algorithm for Full Vehicle Active Suspension Rolling Control | Vehicle is a MIMO coupling system, and its vibration is affected by multiple factors. Particularly under the steering condition, the rolling motion of the vehicle gets aggravated. In order to reduce the rolling angle of the vehicle effectively, a full vehicle model with active suspension is established and decoupling algorithm is proposed. The simulations show that the rolling angle of the vehicle is attenuated greatly, and the ride comfort and stability under steering condition are improved remarkably, which indicates that the control algorithm is effective. | The paper presents an analysis of the tracking loop, its dynamic and sampling requirements when it is designed to be an ::: essential part of the deeply coupled GNSS receiver. All the tracking loops of the GNSS receiver significantly influence ::: final GNSS receiver performance. Robustness of the tracking process can be improved using deeply coupled ::: architecture, when tracking loops inside of the GNSS receiver are supported with accelerations and velocities measured ::: in INS. | eng_Latn | 29,000 |
Hydrodynamic performance of concentric arrays of point absorbers | The behaviour of arrays of 12 heaving point absorbers in concentric arrangements is numerically assessed in a frequency domain model. The floaters are attached to a central cylindrical bottom-mounted structure. Each point absorber is restricted to the heave mode and is assumed to have its own linear power take-off system consisting of an external damping coefficient enabling power extraction and a supplementary mass coefficient tuning the point absorber to the incoming waves. The external damping and supplementary mass coefficients are optimized to maximize the power absorption by each floater in the array, with a restriction on the total control force that can be applied on the floaters. Various concentric arrangements with different radii and number of concentric circles are analysed to determine the most efficient among them. Moreover, the influence of the presence of a central bottom-mounted pillar and the effect of change in its dimension and shape on the power absorption are also studied. | Abstract According to the method described in reference [1], we have designed a series of corrector systems for Cassegrain telescopes automatically with an electronic computer. The calculations were carried out under the following two conditions: the primary and secondary mirrors have strictly conical surfaces, and the telescope is strictly free from third-order spherical aberration when the corrector systems are taken off. Results in this papercan be converted proportionally to real telescopes of any aperture, while the aberrations expressed in seconds of arc remain unchanged. | eng_Latn | 29,001 |
How to project WGS84 coordinates to calculate the distance in meters I have geo coordinates from Google Maps which are WGS84 in decimal degrees for India. I want to calculate the distance between them in meters. Which projection should I use? I use ArcMap 10.4.1. | Choosing Geographical Coordinate System and Projected Coordinate System for India? I want to create spatial data for entire India. Usually I use WGS84 as GCS and UTM zones for PCS in the places in India, but when I have to create the spatial data for entire India, I get confused which geographical coordinate systems and projected coordinate system will be suitable. What are suitable ones? | Limits of dead reckoning using MEMS sensors I'm trying to track body parts relative to a person's torso. I see quite a few questions about using MEMS accelerometers and gyros for dead reckoning, and they confirm my suspicions that various factors greatly limit their usefulness for these sorts of applications, but I'm seeking clarification of these limits: What exactly are these limits? Other answers have addressed why these limits exist. Naturally the specifications the parts in the system in question and what is considered "acceptable error" for the system will both change the exact limits, but is there a single order of magnitude in time, or distance that I can expect dead reckoning to work? I'm well aware that over long distances (a few yards or so) the error becomes too large for most practical purposes, but what about within a few feet? What can I do to improve these limits? I'm currently looking at using an accelerometer and a gyro. What other sensors can I add to the system to improve the error rate? I know over longer distances a GPS can be used, but I doubt any consumer electronics grade GPS has fine enough resolution to help in my case. Additionally, a general consensus seems to the only way to improve these limits past the point of improved sensors is to provide a reference not subject to error. Some systems solve this using cameras and markers. What kind of reference points can a portable/wearable device provide? I've seen the usage of radio waves to measure long distances accurately, but I can't tell if such a system could be accurate on such small scale (in terms of distance measured) using "off-the-shelf" components. | eng_Latn | 29,002 |
0402 SMD resistance shortcut I am wiring a small plastic model with 11 yellow 3V 0402 LEDs. I used the resistance calculator to find I need as many 56ohm resisters attached to each LED in parallel to keep them running properly, but that takes up an awful lot of space in a very thin model. Is there any way I can keep them powered safely but not need to wire 11 resisters? The website won't show me any other way and I'm not the best at researching this stuff. Thanks | Why exactly can't a single resistor be used for many parallel LEDs? Why can't you use a single resistor for a number of LEDs in parallel instead of one each? | Limits of dead reckoning using MEMS sensors I'm trying to track body parts relative to a person's torso. I see quite a few questions about using MEMS accelerometers and gyros for dead reckoning, and they confirm my suspicions that various factors greatly limit their usefulness for these sorts of applications, but I'm seeking clarification of these limits: What exactly are these limits? Other answers have addressed why these limits exist. Naturally the specifications the parts in the system in question and what is considered "acceptable error" for the system will both change the exact limits, but is there a single order of magnitude in time, or distance that I can expect dead reckoning to work? I'm well aware that over long distances (a few yards or so) the error becomes too large for most practical purposes, but what about within a few feet? What can I do to improve these limits? I'm currently looking at using an accelerometer and a gyro. What other sensors can I add to the system to improve the error rate? I know over longer distances a GPS can be used, but I doubt any consumer electronics grade GPS has fine enough resolution to help in my case. Additionally, a general consensus seems to the only way to improve these limits past the point of improved sensors is to provide a reference not subject to error. Some systems solve this using cameras and markers. What kind of reference points can a portable/wearable device provide? I've seen the usage of radio waves to measure long distances accurately, but I can't tell if such a system could be accurate on such small scale (in terms of distance measured) using "off-the-shelf" components. | eng_Latn | 29,003 |
Correcting incorrect altitudes of points recorded by my Android or how to calculate Geoid Height in QGIS | Inaccurate altitude GPS data with my Android? | Creating point features with exact coordinates in QGIS | eng_Latn | 29,004 |
A Topological Descriptor of Acoustic Images for Navigation and Mapping | on feature matching and image registration for two - dimensional forward - scan sonar imaging . | On the Sensor Design of Torque Controlled Actuators: A Comparison Study of Strain Gauge and Encoder-Based Principles | eng_Latn | 29,005 |
A Fast Calibration Method for Triaxial Magnetometers | An effective Pedestrian Dead Reckoning algorithm using a unified heading error model | An 8.5-ps Two-Stage Vernier Delay-Line Loop Shrinking Time-to-Digital Converter in 130-nm Flash FPGA | eng_Latn | 29,006 |
Evaluation of Precise Point Positioning Using MADOCA-LEX via Quasi-Zenith Satellite System | stanley : the robot that won the darpa grand challenge . | Aging-related changes in swallowing, and in the coordination of swallowing and respiration determined by novel non-invasive measurement techniques | kor_Hang | 29,007 |
A generic camera calibration method for fish-eye lenses | Intrinsic parameter calibration procedure for a (high-distortion) fish-eye lens camera with distortion model and accuracy estimation | Camera Self-Calibration: Theory and Experiments | eng_Latn | 29,008 |
An Improved Automatic Algorithm for Global Eddy Tracking Using Satellite Altimeter Data | Comparing images using the hausdorff distance | Efficacy of the Get Ready to Learn yoga program among children with autism spectrum disorders: a pretest-posttest control group design | kor_Hang | 29,009 |
Reverse-Projection Method for Measuring Camera MTF | Evolution of slanted edge gradient SFR measurement | attention mechanism quasi - samples for “ one - example ” classes . | eng_Latn | 29,010 |
Closed-Form Inverse Kinematics for Interventional C-Arm X-Ray Imaging With Six Degrees of Freedom: Modeling and Application | A flexible new technique for camera calibration | Long-distance liquid transport in plants | eng_Latn | 29,011 |
Online Initialization and Automatic Camera-IMU Extrinsic Calibration for Monocular Visual-Inertial SLAM | Unified temporal and spatial calibration for multi-sensor systems | Treatment of Smith-Lemli-Opitz syndrome and other sterol disorders. | kor_Hang | 29,012 |
Application of Levant's differentiator for velocity estimation and increased Z-width in haptic interfaces | Estimation of Angular Velocity and Acceleration from Shaft-Encoder Measurements | User loyalty and online communities: why members of online communities are not faithful | eng_Latn | 29,013 |
Closed-Form Design of Gysel Power Divider With Only One Isolation Resistor | A Modified Gysel Power Divider of Arbitrary Power Ratio and Real Terminated Impedances | Simple effective image and video color correction using quaternion distance metric | eng_Latn | 29,014 |
Classification and compensation of amplitude imbalance and imperfect quadrature in resolver signals | Software-Based Resolver-to-Digital Converter for DSP-Based Drives Using an Improved Angle-Tracking Observer | Robust Evaluation of RoboCup Soccer Strategies by Using Match History | eng_Latn | 29,015 |
SpeDo: 6 DOF Ego-Motion Sensor Using Speckle Defocus Imaging | A Triaxial Accelerometer Calibration Method Using a Mathematical Model | Image Super-Resolution Using Dense Skip Connections | kor_Hang | 29,016 |
Line-based extrinsic calibration of range and image sensors | extrinsic calibration of a 3d laser scanner and an omnidirectional camera . | Parental social support, coping strategies, resilience factors, stress, anxiety and depression levels in parents of children with MPS III (Sanfilippo syndrome) or children with intellectual disabilities (ID) | eng_Latn | 29,017 |
Design and Verification of a Digital Controller for a 2-Piece Hemispherical Resonator Gyroscope | Probably the best simple PID tuning rules in the world | internal model control . 4 . pid controller design . | eng_Latn | 29,018 |
Projective Bundle Adjustment from Arbitrary Initialization Using the Variable Projection Method | A Factorization Based Algorithm for Multi-Image Projective Structure and Motion | Closing the Loop: Evaluating a Measurement Instrument for Maturity Model Design | eng_Latn | 29,019 |
An evaluation of methods for modeling contact in multibody simulation | complementarity problems in gams and the path solver 1 . | gps - free geolocation using lora in low - power wans . | eng_Latn | 29,020 |
Techniques and tools for estimating ionospheric effects in interferometric and polarimetric SAR data | PALSAR Radiometric and Geometric Calibration | Fetch & Freight : Standard Platforms for Service Robot Applications | eng_Latn | 29,021 |
A high-Q birdbath resonator gyroscope (BRG) | Quadrature FM gyroscope | An algorithm for the graph crossing number problem | eng_Latn | 29,022 |
An Optimal Calibration Method for a MEMS Inertial Measurement Unit | Analysis and Modeling of Inertial Sensors Using Allan Variance | Provable Subspace Clustering: When LRR Meets SSC | eng_Latn | 29,023 |
Shifted Rayleigh filter: a new algorithm for bearings-only tracking | comparison of ekf , pseudomeasurement and particle filters for a bearing - only target tracking problem . | Reviving the Past Conventions of realism in the virtual reconstruction of Rome Reborn | eng_Latn | 29,024 |
Automatic Calibration of Spinning Actuated Lidar Internal Parameters | 3D Mapping for high-fidelity unmanned ground vehicle lidar simulation | Conflict Resolution Paradigms and their Influences to the Shona Societies in Zimbabwe | eng_Latn | 29,025 |
Design and implementation of a low-cost dual-axes autonomous solar tracker | Design, implementation and performance analysis of a dual-axis autonomous solar tracker | etude de mod \ ` eles \ ` a base de r \ ' eseaux bay \ ' esiens pour l ' aide au diagnostic de tumeurs c \ ' er \ ' ebrales . | eng_Latn | 29,026 |
Time-of-Flight sensor calibration for accurate range sensing | CALIBRATION OF A PMD-CAMERA USING A PLANAR CALIBRATION PATTERN TOGETHER WITH A MULTI-CAMERA SETUP | A new cell search scheme in 3GPP long term evolution downlink, OFDMA systems | eng_Latn | 29,027 |
Actuator fault detection and isolation system for an hexacopter | Mathematical modeling and control of a hexacopter | Geodesic image and video editing | eng_Latn | 29,028 |
Combining Numerous Uncorrelated MEMS Gyroscopes for Accuracy Improvement Based on an Optimal Kalman Filter | A Mode-Matched Silicon-Yaw Tuning-Fork Gyroscope With Subdegree-Per-Hour Allan Deviation Bias Instability | Body shape and women's attractiveness : The critical role of waist-to-hip ratio. | eng_Latn | 29,029 |
Variable stiffness control and implementation of hydraulic SEA based on virtual spring leg | Tendon elasticity and muscle function | Automatic Extrinsic Calibration of a Camera and a 3D LiDAR Using Line and Plane Correspondences | eng_Latn | 29,030 |
Comparison of bundle adjustment software for camera calibration in close range photogrammetry | Bundle Adjustment - A Modern Synthesis | A review of RF and microwave techniques for dielectric measurements on polar liquids | eng_Latn | 29,031 |
Globally optimal toon tracking | Computer aided inbetweening | Reliability of two goniometric methods of measuring active inversion and eversion range of motion at the ankle | eng_Latn | 29,032 |
Determining the Epipolar Geometry and its Uncertainty: A Review | A theory of self-calibration of a moving camera | Application of digital signal processing in discrimination of neutrons and gamma rays | eng_Latn | 29,033 |
Neat SIMD: Elegant vectorization in C++ by using specialized templates | An Evaluation of Vectorizing Compilers | IMU Preintegration on Manifold for Efficient Visual-Inertial Maximum-a-Posteriori Estimation | eng_Latn | 29,034 |
An Image Correction Method of Fisheye Lens Based on Bilinear Interpolation | Camera calibration with distortion models and accuracy evaluation | BIOLOGY OF TENDON INJURY: HEALING, MODELING AND REMODELING | eng_Latn | 29,035 |
Method of electrostatic gyroscope monitor's maritime six-times calibration | To analyze the relationship between the deceleration-free servo system's tracking error and electrostatic gyroscope monitor's six-time calibration,the reason and the solution method of six-time calibration working abnormally are given.By analyzing the academic relationship among the deceleration-free servo system's tracking error,h angle transformation and azimuth angle,a new six-time calibration method is proposed.In this method,the equatorial gyroscope and polar gyroscope are calibrated at the different time according their azimuth angles.By several dynamic tests,the method is proved effectively with 100% success rate when the deceleration-free servo system's tracking error or the ship rocking is over the mark.By using the method,the electrostatic gyroscope monitor is more applicable in the ships. | The utility model provides a dismantled and assembled measuring platform for intertidal zone feng chang, include: the intertidal zone is squeezed into as the ground to the bottom support, the main support, through can dismantle the mounting with the bottom leg joint, the main support is multi -segment type structure, adopts between each section can dismantle the mounting connects, platform, detachable are installed in the main support top, wind meter is fixed in on the platform. By last, the intertidal zone is squeezed into to the measuring platform ground, and the frame is firm. Adopt multi -segment type structure, can adjust the height of main support according to the demand, make the whole height of measuring platform can reach 10 meters, set up wind meter at the measuring platform top, realize measuring to the amount of wind of intertidal zone position. Dismantled and assembled platform, the simple installation is swift, is applicable to short characteristics of intertidal zone ebb activity duration. | eng_Latn | 29,036 |
Design of new sun-tracking device | A new sun-tracker based on a PSD sensor is developed.Two stepper motors are used to achieve the track of sun.It can achieve the track under any kind of weather and light-electric track is used in sunny day,and calendar-check track in rainy day.It can realized the automatic tracking in any direction angle of 360°and any height angle of 90°.Analysis of the system error and some experiments are done,experiment results show that the accuracy of this sun-tracker can be limited in 0.1° in ±12° inspect angle.This device can run stably under all weather. | AbstractRadar scatterometer observations at 17.2 GHz and 9.6 GHz were made of the snow cover in mid-latitude agricultural fields, using the University of Waterloo scatterometer, to determine the se... | eng_Latn | 29,037 |
Feasibility Of An Instrument For 15 Micrometer Mesoscale Geosynchronous Inversion | The success of vertical inversion measurements and processes from low altitude satellites has led us to speculate on the possibilities and returns of performing inversion from synchronous altitudes. We postu3ated the question as to whether one could accomplish temperature and composition inversion with a lateral resolution of about one mile square over a total field of about one hundred miles, and repeat this measurement about every hour. This would provide a continual picture of the temperature, water vapor and ozone content for specific mesoscale analysis. It would give a representation of "thermal winds" on a meaningful meteorological time scale.© (1974) COPYRIGHT SPIE--The International Society for Optical Engineering. Downloading of the abstract is permitted for personal use only. | The multi-functional GPS clock synchronization device was researched in this paper.The system used in Altera's SOPC solution and the GPS signal receiver module,it can synchronized the GPS clock automatically.And the frequency and phase can be adjustable independently and judge the input signal frequency automatically.The device provides synchronous clock for the electronic transformer and electronic transformer calibrator.Several field experiments show that the device meets the design requirements and achieves good effect. | kor_Hang | 29,038 |
Stability Assessment of a System Comprising a Single Machine and a Virtual Oscillator Controlled Inverter with Scalable Ratings | We present a small-signal stability study of a coupled synchronous generator and inverter system, where the inverter is controlled by virtual oscillator control (VOC). VOC is a recently proposed grid-forming inverter control strategy, which acts on faster time scales compared to droop control. In our study, we leverage a scalable VOC controller (that is by design agnostic of power levels) to test the system's small-signal stability at different inverter penetration levels. The impact of rotational inertia, reactive power support, and filter parameters on stability is then investigated. Results highlight possible issues that might arise in these mixed machine-inverter systems further motivating the need to develop next generation stabilizing grid-forming controllers. | The utility model provides a dismantled and assembled measuring platform for intertidal zone feng chang, include: the intertidal zone is squeezed into as the ground to the bottom support, the main support, through can dismantle the mounting with the bottom leg joint, the main support is multi -segment type structure, adopts between each section can dismantle the mounting connects, platform, detachable are installed in the main support top, wind meter is fixed in on the platform. By last, the intertidal zone is squeezed into to the measuring platform ground, and the frame is firm. Adopt multi -segment type structure, can adjust the height of main support according to the demand, make the whole height of measuring platform can reach 10 meters, set up wind meter at the measuring platform top, realize measuring to the amount of wind of intertidal zone position. Dismantled and assembled platform, the simple installation is swift, is applicable to short characteristics of intertidal zone ebb activity duration. | eng_Latn | 29,039 |
Solar sail technology—A state of the art review | Abstract In this paper, the current state of the art of solar sail technology is reviewed. Solar sail research is quite broad and multi-disciplinary; this paper focuses mainly on areas such as solar sail dynamics, attitude control, design and deployment, and mission and trajectory analysis. Special attention is given to solar radiation pressure force modeling and attitude dynamics. Some basics of solar sailing which would be very useful for a new investigator in the area are also presented. Technological difficulties and current challenges in solar sail system design are identified, and possible ideas for future research in the field are also discussed. | Recent progress in long service life and maintenance-free technique for solid rocket motor of tactical missile was discussed. According to recent research trend and development of related technique, the prospect of developing tendency in this field is also pointed out. | eng_Latn | 29,040 |
Action of Imaginary Number on Physics | The imaginary number comes in physical world along with physical evolution. The complex number composes most based concept element on physics. | This paper presents experimental studies from the road profile measurements by employing accelerometers and international roughness index - IRI assessment tools and practical guidelines with respect to measured acceleration data processing in terms of digital filter design and conversion of vertical acceleration data into displacement data. In addition, it shows comparative analysis of measurements with accelerometers and Rod & Level (Leica TS 06 Plus), and gives some recommendations how to process measured data. | yue_Hant | 29,041 |
THE OPTICAL ELLIPSE AS AN AID IN THE DESIGN OF DRIVERS' SEATS | The statistical description of the driver's visual position in the vehicle in the form of an optical ellipse is, in principle, a valuable aid for the car manufacturer, as this makes it possible, even at the design phase, to make predictions regarding the minimum angle of vision for any particular percentage of drivers. The SAE optical ellipse used nowadays does not, however, adequately describe the ocular position, as the reference system related to the seat reference point and the line of the torso is faulty, the influence of the back rest inclination is not sufficiently taken into account and the influence of a back rest inclination which is adjustable by the driver is completely overlooked. (TRRL) | Some Orbit Determination (OD) of Low Earth Orbiters (LEOs) based on undifferenced spaceborne GPS data were ::: discussed firstly in this paper. Then the principle and mathematical models of two different types of reduced-dynamic ::: were present. After that, dual-frequency spaceborne GPS data of doy 89, 2004 from CHAMP and GRACE satellite were ::: computed using two types of reduced-dynamic POD and the OD results were analyzed. Our CHAMP orbiting results of ::: one day using two different reduced dynamic POD methods are within 7 centimeters compared with GFZ Post processed ::: Science Orbits (PSO) and the GRACE orbiting results are with 3 centimeters compared with JPL OD results. | yue_Hant | 29,042 |
Bar Mechanism Base MATLAB Movement Parameter Emulation | Constitute mathematic model of Plane bar mechanism,analyze trace and velocity acceleration of four bar mechanism port,basing Matlab software design language.Basing four bar mechanism,define new vector to six bar mechanism and emulate visualization to port trace,it is benefit to parameter design and control by emulating in this way. | Programming the DLL of multiple founction data acquisition and using the DLL in labview to control stepper motor. | kor_Hang | 29,043 |
An enhanced Cramér-Rao bound weighted method for attitude accuracy improvement of a star tracker. | This study presents a non-average weighted method for the QUEST (QUaternion ESTimator) algorithm, using the inverse value of root sum square of Cramér-Rao bound and focal length drift errors of the tracking star as weight, to enhance the pointing accuracy of a star tracker. In this technique, the stars that are brighter, or at low angular rate, or located towards the center of star field will be given a higher weight in the attitude determination process, and thus, the accuracy is readily improved. Simulations and ground test results demonstrate that, compared to the average weighted method, it can reduce the attitude uncertainty by 10%-20%, which is confirmed particularly for the sky zones with non-uniform distribution of stars. Moreover, by using the iteratively weighted center of gravity algorithm as the newly centroiding method for the QUEST algorithm, the current attitude uncertainty can be further reduced to 44% with a negligible additional computing load. | We establish mean curvature estimate for immersed hypersurface with nonnegative extrinsic scalar curvature in Riemannian manifold $(N^{n+1}, \bar g)$ through regularity study of a degenerate fully nonlinear curvature equation in general Riemannian manifold. The estimate has a direct consequence for the Weyl isometric embedding problem of $(\mathbb S^2, g)$ in $3$-dimensional warped product space $(N^3, \bar g)$. We also discuss isometric embedding problem in spaces with horizon in general relativity, like the Anti-de Sitter-Schwarzschild manifolds and the Reissner-Nordstr\"om manifolds. | eng_Latn | 29,044 |
Level/position sensor and related electronic circuitry for interactive toy | A sensor for an interactive electronic device. This sensor assembly comprises a base, at least one recess formed in the base assembly, the recess partially through its peripheral wall is defined. A peripheral wall is placed in the base assembly has at least one switch. In addition, being placed in this recess has at least one trigger ball, the ball can freely move triggered near the peripheral wall. The sensor may generate a reference sensor relative to the at least two different states corresponding to respective positions of the planes. Moving the sensor relative to the reference plane to achieve the trigger moving the ball in the recess, with the ball when the trigger switch contact generates a state, when the ball does not contact with the trigger switch to generate another state. | Aiming at the requirements in running efficiently and high accuracy handling data timely,the login system,communication module,database storage and reading,relay verification module,report generation and printing.were designed.The working principle and composition of the calibration device were introduced.The upper computer software of microprocessor relay protection's trippingoperation-board intelligent calibration device was developed based on graphical LabVIEW language and database programming technology to realize the function of each part.The results indicate that the software can complete returning the relay operating board calibration value and action value efficiently,and has a multi-function interface.The software has been successfully applied in this device. | eng_Latn | 29,045 |
Crane and superlift counterweight device thereof | The utility model discloses a superlift counterweight device of a crane. The superlift counterweight device comprises a superlift counterweight tray (11), a counterweight luffing device (12) and a pin shaft assembly (14), wherein the pin shaft assembly (14) comprises a central shaft (141) and a shaft sleeve (142) jacketing the central shaft (141), the counterweight luffing device (12) is rigidly connected with the shaft sleeve (142), and the superlift counterweight tray (11) is connected with the central shaft (141) exposed out of the shaft sleeve (142) through a first connecting plate (17) rigidly connected with the superlift counterweight tray (11). The device has the properties of reducing the luffing resistance and improving the luffing stability of the crane, and can avoid the shakes of a superlift counterweight during rotation, so as to reduce the damage to the crane; and the device is simple to mount and disassemble, and can save labor and material resources. | Abstract : This contract supported on-going as well as planned research into environments and spacecraft interactions in near space. The major projects are summarized in this report. Models and geophysical data bases were investigated for spacecraft charging, shuttle contamination, electrostatic particle pushing codes, beam-plasma interaction in emitting probes, magnetospheric dynamics. Adiabatic invariance of trapped particles, fluxgate magnetometer simulation and falling sphere accelerometers. In support of the CRRES project, a data management plan has been provided, and a graphics capability was developed for the SPAN network. Software development was involved in all phases, using CYBER, VAX and RIDGE computers. | eng_Latn | 29,046 |
Sports Grounds - Safety Certificate for a Regulated Stand | Information on obtaining a Regulated Stand Certificate for sports grounds that have covered accommodation for more than 500 persons. | This paper covers a batch plate local stability checking method.The method is used in grab shipunloader parameterized design and the comparison of the results and the ones from overseas shows that the method is correct and practical. | eng_Latn | 29,047 |
Improve engineering drawing teaching by combining modern methods and traditional methods | With the development of technology, computer multimedia assistant teaching methods are applied widely and have unexampled advantages compared to the traditional method. Teacher should know the characteristics of different teaching methods and combine the modern methods with the traditional one reasonably to do the teaching of Engineering Drawing. Teacher can not only give play to teach but also inspire student to study hard, so as to obtain good teaching effect. | During the whole life of the building, the common inclination and displacement of the buildings affect the use and safety of the buildings. These problems must be observed by tilt and displacement observation. This paper mainly introduces the building slant measurement method, and the measurement data is prepared to deal with the measurement data in order to obtain the correct data performance of the construction and the stability of the construction and use. | eng_Latn | 29,048 |
Noise properties and signal-dependent interpixel crosstalk of the detectors of the Near-Infrared Spectrograph of the James Webb Space Telescope | The Near-Infrared Spectrograph (NIRSpec) is one of the four science instruments of the James Webb Space Telescope. Its focal plane consists of two HAWAII-2RG sensors operating in the wavelength range of 0.6 to 5.0 μm and, as part of characterizing NIRSpec, the noise properties of these detectors under dark and illuminated conditions were studied. Under dark conditions, and as already known, 1∕f noise in the detector system causes somewhat higher noise levels than can be accounted for by a simple model that includes white read noise and shot noise on integrated charge. More surprisingly, for high levels of accumulated charge, significantly lower total noise than expected was observed. This effect is shown to be due to pixel-to-pixel correlations introduced by signal-dependent interpixel crosstalk, with an interpixel coupling factor, α, that ranges from ∼0.01 for zero signal to ∼0.03 close to saturation. © 2013 Society of Photo-Optical Instrumentation Engineers (SPIE) (DOI: 10.1117/1.OE.52.3.034001) | Diagnosing faults at an early stage plays a significant role in improving the reliability of wind energy conversion systems (WECS). In this paper we present a method to diagnose actuator and sensor faults based on an unknown input observer. The fault detection and isolation is performed using the unknown-input-observer-based residual generator, while the estimation of the bias fault is accomplished via the combination of the estimated states of a specific unknown input observer, the controlled input, and the measured variables of the WECS. The effectiveness of the aforesaid method has been simulated on a 2 MW wind turbine.
The study on applications of Large Aperture Scintillometer measuring large scale flux | As a new flux measuring instrument, the Large Aperture Scintillometer (LAS) has developed rapidly in recent years. It can measure sensible heat flux at large scales, from several hundred meters to several kilometers, even to ten kilometers. In other words, LAS matches well with remote sensing scales. | Aiming at the requirements of running efficiently and handling data with high accuracy and timeliness, the login system, communication module, database storage and reading, relay verification module, and report generation and printing were designed. The working principle and composition of the calibration device were introduced. The upper computer software of the microprocessor relay protection tripping-operation-board intelligent calibration device was developed based on the graphical LabVIEW language and database programming technology to realize the function of each part. The results indicate that the software can complete returning the relay operating board calibration value and action value efficiently, and has a multi-function interface. The software has been successfully applied in this device.
The Difference Between Probabilistic and Interval Reasoning Under Uncertainty | The purpose of this study is to provide a basic comparison of the probabilistic and interval methodologies for processing uncertain information in decision analysis. The awareness of fundamental differences between these methodologies is considered to be useful not only for better understanding their principles, but also for their appropriate application.Copyright © 2002 by ASME | In estimating the state of thrusting/ballistic endoatmospheric projectiles for the end purpose of impact point ::: prediction (IPP), the total observation time, the wind effect and the sensor accuracy significantly affect the IPP ::: performance. First the tracker accounting for the wind effect is presented. Following this, based on the multiple ::: interacting multiple model (MIMM) estimator developed recently, a sensitivity study of the IPP performance with ::: respect to the total observation time, the wind (strength and direction) and the sensor accuracy is presented. | eng_Latn | 29,051 |
Research on Three-dimensional Scanning Precision for Shoe Last | Basic techniques of using a three-dimensional scanner and Delcam PS-Shoemaker software to input the physical model of a shoe last were studied, and influencing factors of scanning precision, especially the techniques for drawing grids of the shoe last, were analyzed. The general technique for improving scanning precision was summed up. It has been proved simple and precise throughout practical operation. | Abstract : This contract supported on-going as well as planned research into environments and spacecraft interactions in near space. The major projects are summarized in this report. Models and geophysical data bases were investigated for spacecraft charging, shuttle contamination, electrostatic particle pushing codes, beam-plasma interaction in emitting probes, magnetospheric dynamics. Adiabatic invariance of trapped particles, fluxgate magnetometer simulation and falling sphere accelerometers. In support of the CRRES project, a data management plan has been provided, and a graphics capability was developed for the SPAN network. Software development was involved in all phases, using CYBER, VAX and RIDGE computers.
Are the range and angle adjustments separate, or does one adjustment affect range and angle simultaneously? Nice stats! | It's a very basic function; the varifocal lens zooms in and out. Angle is all on placement of the camera. This camera gives a better shot for longer distances. | The inner diffuser is removable, but I feel the unit is well designed as is. I measured the output of my Nikon SB-700 fired in manual with a PocketWizard AC3 and Mini TT1 triggering the flash via a Flex TT5. I used a Sekonic L-358 to measure the output of the speed light without the SMDV Diffuser 60 and then again with the Diffuser 60. From a distance of about 9 feet, I got a reading of f/9, (ISO 200, 1/125 sec.) without the soft box, and f/8, (ISO 200, 1/125 sec.) with the flash attached to the soft box. This test was with the inner diffuser attached as shipped from the factory.
Would this be good to use for whitewater rafting? I will have a helmet on. Can it mount to any helmet? | my camera broke after 2 months and Amazon will not replace it... I did buy a Gopro 3 and like that camera... pay a bit more and get a better camera! | The RFLKT+ is just a display unit and ANT+ bridge - it receives data from a compatible app such as Wahoo Fitness, Cyclemeter, or Strava. Because of the bridge, both BT and ANT+ devices can communicate with these apps, and all those three do support power meters. They all work fine with my Stages, but I don't have personal experience with a Vector - but I see no reason it would be a problem. Wahoo Fitness and Cyclemeter both give the usual power, average power and normalised power data, but the other parameters such as pedal smoothness, LR balance, and Torque effectiveness may not be supported by all apps (Wahoo Fitness is probably the best bet for power data actually, and of course was developed for use with the RFLKT+).
How do you adjust for elevation? I do not see an adjustment | It's in the center on top behind the lens, you can't see it in any of the photos. It's there. | Hi, Theresa! When measuring your sewing machine...please remove the embroidery unit and any accessory trays from the free arm area. This allows the machine to move up and down the platform and fit correctly. The custom insert (optional accessory) is used to cover the gap between the machine top and sewing surface of the machine. Hope this helps! If you'd like to order a Bertha, please go to: | eng_Latn | 29,055 |
cannot get below .89 volts when trying to install/calibrate. what am I doing incorrectly? | For me I started in the middle, then moved up a little, down a little, then kind of twisted it a bit, and ended up at .46. And went with that. I tightened up 1 side and left the other loose to try and get minimal change when I got close. I dont know if this will "help" u any. | Is it the right size for that car? Look at the size on it currently. If the numbers match, it will work, if not, it won't. | eng_Latn | 29,056 |
Is the Dual-Frequency transducer included in the deal or is it just the capability of having a Dual-Frequency Transducer connected to the GPS? | Yes, the one I ordered from Amazon had the transducer included. However, on the box it said it was not included, but it was in the box. And all works VERY nicely. | The advantage of this rig comes from the fact that you can both zoom and focus a dslr with the flick of a thumb (something that a person used to using an older ENG style camera might miss after switching). Doing time lapse work on this rig won't work. They are simple toggles that vary from speed/torque. Depending on the zoom, the lowest settings might not even turn the ring. You can tap the toggle for subtle focus control but only because the focus ring offers less resistance than the zoom ring. I think you should look for a different product, or modify a manual follow focus so that you can turn it mm by mm over the two hours.
Will the Tide clock correctly reflect the tide as I live inland on the Northeast Cape Fear River, which is subject to tide changes. | It will under normal conditions..depending on how far up the river you live, you may have more wind influence that can affect the tides. This clock doesn't automatically adjust. | The RFLKT+ is just a display unit and ANT+ bridge - it receives data from a compatible app such as Wahoo Fitness, Cyclemeter, or Strava. Because of the bridge, both BT and ANT+ devices can communicate with these apps, and all those three do support power meters. They all work fine with my Stages, but I don't have personal experience with a Vector - but I see no reason it would be a problem. Wahoo Fitness and Cyclemeter both give the usual power, average power and normalised power data, but the other parameters such as pedal smoothness, LR balance, and Torque effectiveness may not be supported by all apps (Wahoo Fitness is probably the best bet for power data actually, and of course was developed for use with the RFLKT+). | eng_Latn | 29,058 |
Do you have to rejet the carbs? | No. I thought that as well. But I did get the valve adjustment only because it was time. Do it. It was a very good improvement as far as horsepower and looks. I am very happy with it, Nate. | Idk? I tried it on a Harley Davidson and I didn't like it cause you're either speeding up or slowing down.
can the angle of the easel be adjusted from straight up and down to a 45 or 90 degree angle? also can you size it down to hold smaller canvas? | Hi, I think I can help. You can angle the easel in folded position. And yes, you can size it down. It's a great product. | I had some initial issues with the base. It felt like it was catching slightly and making my sweeping monopod pans have a slight resistance until I kicked it's ass with lemon pledge. Worked to take out restriction like a champ and now it is smooth as butter. Tried wd 40 and it didn't do the trick. The lemon pledge was the stuff! | eng_Latn | 29,060 |
If you are sitting on uneven ground, such as shooting on the side of a hill, is this tripod hard to adjust? Would it be better to use than a bipod? | In a word, yes. The tripod will allow far greater stability than a bipod. As for ease of adjustment and uneven ground, I have used this for several tactical matches and it works perfectly. | No, it takes some getting used to, but the screen is your viewfinder. Nice little camera for the price. Needs a tripod at 24x.
What is the maximum range of adjustment for elevation and windage? | I've found it to be a fantastic scope, very close to my nightforce nxs. I wouldn't hesitate to buy another. | I have used mine on my sailboat and have had very little lose as my boat moves. I have been as far away as 1 mile and get 100% signel. | eng_Latn | 29,062 |
does it only measure vertical angles or can i measure horizontal angles as well? | it measures with a reference to gravity, this does not work as a protractor. http://www.amazon.com/Wixey-WR410-8-Inch-Digital-Protractor/dp/B001PTGBSA may be what you want in that case. | It comes with a similar screen as the Go Pro. Tells battery, time remaining, time recorded, gps, frame rate, video resolution, wifi. You will need the case for it to mount onto a tripod. You can get a skeleton case if you don't want to use the waterproof case. | eng_Latn | 29,063 |
Autoalignment In Step-And-Repeat Wafer Printing | A fully automatic through-the-lens alignment method is described for use in step-and-repeat projection aligners. The method utilizes dark field illumination of the wafer, which is aligned to the reticle. It is shown that 2.5 μm wide alignment marks that are 2 mils in length provide an ample signal-to-noise ratio for 0.1 μm alignment accuracies. Various contributions to alignment error are examined such as the visibility of the alignment features, surface scatter on the wafer, and shot noise in the detected signal. Applications of these results to x, y, and theta alignment in step-and-repeat printing are offered. | The aim of this paper is to present a new and promising approach of the text-to-speech alignment problem. For this purpose, an original idea is developed: a high quality digital speech synthesizer is used to create a reference speech pattern used during the alignment process. The system has been used and tested to extract the prosodic features of read French utterances. The results show a segmentation error rate of about 8%. This system will be a powerful tool for the automatic creation of large prosodically labeled databases and for research on automatic prosody generation.
On Elkies subgroups of l-torsion points in elliptic curves defined over a finite field | As a subproduct of the Schoof-Elkies-Atkin algorithm to count points on elliptic curves defined over finite fields of characteristic p, there exists an algorithm that computes, for l an Elkies prime, l-torsion points in an extension of degree l-1 at cost O(l max(l, \log q)^2) bit operations in the favorable case where l < p/2. ::: We combine in this work a fast algorithm for computing isogenies due to Bostan, Morain, Salvy and Schost with the p-adic approach followed by Joux and Lercier to get for the first time an algorithm valid without any limitation on l and p but of similar complexity. | The statistical description of the driver's visual position in the vehicle in the form of an optical ellipse is, in principle, a valuable aid for the car manufacturer, as this makes it possible, even at the design phase, to make predictions regarding the minimum angle of vision for any particular percentage of drivers. The SAE optical ellipse used nowadays does not, however, adequately describe the ocular position, as the reference system related to the seat reference point and the line of the torso is faulty, the influence of the back rest inclination is not sufficiently taken into account and the influence of a back rest inclination which is adjustable by the driver is completely overlooked. (TRRL) | eng_Latn | 29,065 |
A calibration method for laser-triangulating 3D cameras | A laser-triangulating range camera uses a laser plane to light an object. If the position of the laser relative to the camera as well as certain properties of the camera is known, it is possible t ... | The authors present a fault tolerant algorithm for the solution of linear systems of equations using matrix triangularization procedures suitable for implementation on array architectures. Gaussian elimination with partial or pairwise pivoting and QR decomposition are made fault tolerant against two transient errors occurring during the triangularization procedure. The extended Euclidean algorithm is implemented to solve for the locations and values of the errors defined appropriately using the theory of error correcting codes. The Sherman-Morrison Woodbury formula is then used to obtain the correct solution vector to the linear system of equations without requiring a valid decomposition.
Accurately at distances of 2.5 km or less Why do you say his results are assumed? | Why do you say his result are assumed? | Isobar what does it mean when they are far apart? | eng_Latn | 29,067 |
Seismograms typically record motions in three Cartesian axes (x, y, and z), with the z axis perpendicular to the Earth's surface and the x- and y-axes parallel to the surface. | Seismometers record motions in three Cartesian axes (x, y, and z), with the z axis perpendicular to the Earth's surface and the x- and y-axes parallel to the surface. | Specifically, Michael Luby and Charles Rackoff analyzed the Feistel block cipher construction, and proved that if the round function is a cryptographically secure pseudorandom function, with Ki used as the seed, then 3 rounds is sufficient to make the block cipher a pseudorandom permutation, while 4 rounds is sufficient to make it a "strong" pseudorandom permutation (which means that it remains pseudorandom even to an adversary who gets oracle access to its inverse permutation).
Fellow geometry enthusiasts: I am studying a particular geometry and I would like to change from one set of coordinates to a new set that makes the metric unimodular (i.e. its determinant is one). The components of the metric are rather complicated functions of the coordinates, so I don't know how to solve the system of linear first order PDEs that I get from the Jacobian. Is anyone aware of any software that can give me the new coordinates in terms of the old ones given the Jacobian of the transformation? I have Maple 18, but I have not been able to find this capability in Maple. | What is some good free software for doing physics calculations? I'm mainly interested in symbolic computation (something like Mathematica, but free). | The new Top-Bar does not show reputation changes from Area 51.
I want to overlap both LiDAR data and SRTM data but they use different vertical coordinate systems. I am using ArcMap 10.2 and I firstly created a LAS dataset and original vertical coordinate system of LAS dataset was Unknown VCS from ArcInfo Workstation (I didn't choose any vertical coordinate system. And this one appeared). And then, I created a new LAS dataset and chose vertical coordinate system as EGM96 which SRTM data and Google Earth have. But elevation values of LiDAR points did not change. Still have different values from Google Earth and SRTM data. I need LiDAR to have same elevation values with SRTM for the same region. Is there a way to make LiDAR and SRTM data the same vertical values for the same point on the map? | How can one change a coordinate system for a .lasd (Lidar) dataset? The file is currently in UTM_Zone_15N and I am trying to convert it to IL_East. | The reason that I ask is because this is happening to LiDAR, and I am inclined to use lidar because it is easier to read. | eng_Latn | 29,070 |
I am trying to create a mathematical formula to convert meters to decimal degrees. Reading this article, I thought of this generic formula: x = (Value_in_meter * 0.00001)/1.1132 But I know it is not 100% correct; I should use the other values according to where my point is. I am using the Google Maps API, so how do I discover if my point is at 23N/S, 45N/S or 67N/S? To complement, I wrote this function: public static double convertMeterToDegrees(double meter, double latitude){ double quotient; double degree = Math.floor(latitude); double modDegree = Math.abs(degree); if (modDegree == 0){ quotient = 1.1132; } else if (modDegree <= 23){ quotient = 1.0247; } else if (modDegree <= 45){ quotient = 0.7871; } else { quotient = 0.43496; } return (meter * 0.00001)/quotient; } Is it correct? | I am wanting to find a latitude and longitude point given a bearing, a distance, and a starting latitude and longitude. This appears to be the opposite of this question (). I have already looked into the haversine formula and think its approximation of the world is probably close enough. I am assuming that I need to solve the haversine formula for my unknown lat/long, is this correct? Are there any good websites that talk about this sort of thing? It seems like it would be common, but my googling has only turned up questions similar to the one above. What I am really looking for is just a formula for this. I'd like to give it a starting lat/lng, a bearing, and a distance (miles or kilometers) and I would like to get out of it a lat/lng pair that represent where one would have ended up had they traveled along that route. | The new Top-Bar does not show reputation changes from Area 51.
How do you convert the WGS coordinate to latitude and longitude? | I'm working with that has X Y coordinates. They are akin to '3672187.92698000, 534175.72095400'. I would like to convert them to longitude latitude so they are more like '-90.097017, 29.963176'. I've seen this however I don't have that software. I was able to download and install but I am unfortunately perplexed by its complicated interface. Would like to do the conversion with it, if possible. | As my client software uses lat/lon coordinates when communicating with my (spherical mercator) postgis database i decided to ST_Transform every geometry to WGS-84. However i noticed that ST_Distance for WGS-84 returns units as degrees (i need meters). So i decided to use ST_DistanceSpheroid(geom1, geom2, 'SPHEROID["WGS 84",6378137,298.257223563]') but this method seems to be extremely slow. Therefor i switched back to using spherical mercator again and transforming my input and output to and from wgs-84 as that performs a lot better. Am i using the correct method or is this a known issue? | eng_Latn | 29,072 |
I have changed my touchpad configurations to: xinput set-prop "13" "Synaptics Finger" 50 50 255 xinput set-prop "13" "Synaptics Noise Cancellation" 20 20 true How can I save those configurations? Every time I reboot, they are set to default again. | I successfully followed to set different sensitivities for my touchpad and my USB mouse. Only problem is, once you unplug and replug the device or shutdown, restart etc., the settings reset. | OSVR has a right-handed system: x is right, y is up, z is near. I need to convert orientation data from a sensor in OSVR to a different right-handed system in which x is forward, y is left, and z is up. I need to calculate the transformation that will convert a quaternion from one system to the other. I naively tried: void osvrPoseCallback(const geometry_msgs::PoseStampedConstPtr &msg) { // osvr to ros geometry_msgs::PoseStamped outputMsg; outputMsg.header = msg->header; outputMsg.pose.orientation.x = msg->pose.orientation.y; outputMsg.pose.orientation.y = msg->pose.orientation.z; outputMsg.pose.orientation.z = msg->pose.orientation.x; outputMsg.pose.orientation.w = msg->pose.orientation.w; osvrPosePub.publish(outputMsg); ros::spinOnce(); } But that made things really weird. Facing north pitch is pitch, facing west, pitch is yaw, facing south, pitch is negative pitch... How can I convert my OSVR quaternion to the a corresponding quaternion in the new coordinate system? | eng_Latn | 29,073 |
What should I do to find the distance between two very far points? p1 = "POINT(6.2 61.1)" p2 = "POINT(6.1 61.0)" select ST_Distance( ST_Transform(ST_GeomFromWKT('$p1'), 'EPSG:4326', '$epsg'), ST_Transform(ST_GeomFromWKT('$p2'), 'EPSG:4326', '$epsg') ) as distance with EPSG:3857 this is 25550.25, but with EPSG:32632 I got 12382.11 and, leaning on Google, that's right. I'm using the GeoSpark library in my calculations. Update: My range of longitude is approximately from 20 to 190. Update 2: I can't use PostGIS; I'm using Hadoop and GeoSpark | I have a simple question about calculating distances in PostGIS. I would like to get the distance between two geometries. I am using this SRID: 4269, in meters. What I am doing now is this: ST_Distance((a.geom,b.geom)) FROM ... but I am getting the result in degrees. I think that I should work with geography, but how can I cast a geom to a geography? I tried with (a.geom::geography) but I am getting an error. What can I do to get my result in meters? Thank You | As my client software uses lat/lon coordinates when communicating with my (spherical mercator) PostGIS database, I decided to ST_Transform every geometry to WGS-84. However, I noticed that ST_Distance for WGS-84 returns units as degrees (I need meters). So I decided to use ST_DistanceSpheroid(geom1, geom2, 'SPHEROID["WGS 84",6378137,298.257223563]') but this method seems to be extremely slow. Therefore I switched back to using spherical mercator again, transforming my input and output to and from WGS-84, as that performs a lot better. Am I using the correct method or is this a known issue?
Is there a way to add keyframes (I key) with LINEAR interpolation as default instead of BEZIER? I often animate cyclic motion (e.g. planets orbiting around the sun or textures spinning about Z on a sphere) and the created f-curves have BEZIER as default interpolation. This requires me to immediately change the interpolation to LINEAR. I was wondering if there is a magic key that I can use to have the keyframes added as LINEAR interpolated, or at least have a global setting that I can set to have LINEAR interpolation used by default. | I've been working on some animations recently, and I've noticed that I'm having poor luck with certain things, namely the fact that I don't really use Blender's graph editor very well. One of the major issues that I have is pretty widespread among 3D graphics programs; when I create an animated object, it defaults to using a curved graph instead of a linear graph, which means that the most movement is at the middle of the animation, and stuff just practically stops near the beginning or the end. While this helps with organic camera movements or such, it's also not what I usually want. Can I set animation curves to default to linear? | The speed of light is the same in all inertial frames. Does it change from a non-inertial frame to another? Can it be zero? If it is not constant in non-inertial frames, is it still bounded from above? | eng_Latn | 29,075 |
Just moved from Unity to Gnome with the update to 17.10. Can the Workspaces be made to go left right / east west as in Unity, rather than north south / up down ? After three years of use, my fingers are really used to left / right. Does anybody know how to do this ? | I have just upgraded to Ubuntu 17.10 and found out there is no option to have 2 by 2 workspaces. I was able to configure 4 "static" workspaces using , but I would like to have them in 2 rows by 2. Any ideas? | I see questions come up quite often that have this underlying issue, but they're all caught up in the particulars of a given feature or tool. Here's an attempt to create a canonical answer we can refer users to when this comes up - with lots of animated examples! :) Let's say we're making a first-person camera. The basic idea is it should yaw to look left & right, and pitch to look up & down. So we write a bit of code like this (using Unity as an example): void Update() { float speed = lookSpeed * Time.deltaTime; // Yaw around the y axis using the player's horizontal input. transform.Rotate(0f, Input.GetAxis("Horizontal") * speed, 0f); // Pitch around the x axis using the player's vertical input. transform.Rotate(-Input.GetAxis("Vertical") * speed, 0f, 0f); } or maybe // Construct a quaternion or a matrix representing incremental camera rotation. Quaternion rotation = Quaternion.Euler( -Input.GetAxis("Vertical") * speed, Input.GetAxis("Horizontal") * speed, 0); // Fold this change into the camera's current rotation. transform.rotation *= rotation; And it mostly works, but over time the view starts to get crooked. The camera seems to be turning on its roll axis (z) even though we only told it to rotate on the x and y! This can also happen if we're trying to manipulate an object in front of the camera - say it's a globe we want to turn to look around: The same problem - after a while the North pole starts to wander away to the left or right. 
We're giving input on two axes but we're getting this confusing rotation on a third. And it happens whether we apply all our rotations around the object's local axes or the world's global axes. In many engines you'll also see this in the inspector - rotate the object in the world, and suddenly numbers change on an axis we didn't even touch! So, is this an engine bug? How do we tell the program we don't want it adding extra rotation? Does it have something to do with Euler angles? Should I use Quaternions or Rotation Matrices or Basis Vectors instead? | eng_Latn | 29,076 |
I have points in a txt file with lon, lat and elevation. The CRS is the National Grid of my country (metric system). However, when I add them in QGIS and then try to lay over them a map from OpenLayers plugin, the points are 50 meters off their true location. How to fix this? The national grid is Balkan Zone 7. | I have problem with CRS Bosnia and Herzegovina? MGI 1901 / Balkans zone 6 EPSG:3908 MGI 31276 / Balkans zone 6 EPSG:31276 Which of these two is OK? When I open google sattelite image with openlayers plugin, MGI 1901 match an image, when I change to MGI 31276 vector layer mismatch about 300m. When I import my layers to Global Mapper, datum of vector is "MGI Austria" I need this projection: | I am trying to save a GeoJSON file in QGIS as EPSG:4326, but it keeps being saved as CRS84. EPSG:4326 and CRS84 are the same except CRS84 saves coordinates as long, lat - and EPSG:4326 saves them as lat, long. When I open the saved file in QGIS it's read as EPSG:4326, but really the file is CRS84. I know this because when I open the GeoJSON file in notepad it's listed as CRS84, and when I load the file in Shapely it's recognized as CRS84. What's going on here? I've tried this on both QGIS 3.6.3 and QGIS 3.4.8 with the same result. | eng_Latn | 29,077 |
I have some raster data whose metadata defines its coordinate system as: rotated regular grid, virtual North Pole at 39.25 N, 162.00 W (rotated coordinates) Its spatial resolution is 0.044 degree. I am trying to get this data into regular longitude, latitude coordinates and to do so I have defined a custom projection with these parameters: +proj=longlat +datum=WGS84 +lat_0=39.25 +lon_0=-162 +no_defs. It doesn't really seem to help though, this is the raster layer and OSM basemap layer, both visualized in WGS84: and this is the same layers, both visualized with this defined coordinate system: I feel like I must be missing something obvious here. Is this not how you would define a coordinate system for a rotated regular grid? The source data, as well as the metadata PDF is | First I should clarify I don't have previous experience with the field, so I don't know the technical terminology. My question is as follows: I have two weather datasets: The first one has the regular coordinate system (I don't know if it has an specific name), ranging from -90 to 90 and -180 to 180, and the poles are at latitudes -90 and 90. In the second one, although it should correspond to the same region, I noticed something different: latitude and longitude were not the same, as they have another reference point (in the description is called a rotated grid). Together with the lat/lon pairs, comes the following information: southern pole lat: -35.00, southern pole lon: -15.00, angle: 0.0. I need to transform the second pair of lon/lat to the first one. It could be as simple as add 35 to the latitudes and 15 to the longitudes, since the angle is 0 and it seems a simple shifting, but I'm not sure. 
Edit: The information I have about the coordinates is the following Apparently, the second coordinate system is defined by a general rotation of the sphere "One choice for these parameters is: The geographic latitude in degrees of the southern pole of the coordinate system, thetap for example; The geographic longitude in degrees of the southern pole of the coordinate system, lambdap for example; The angle of rotation in degrees about the new polar axis (measured clockwise when looking from the southern to the northern pole) of the coordinate system, assuming the new axis to have been obtained by first rotating the sphere through lambdap degrees about the geographic polar axis, and then rotating through (90 + thetap) degrees so that the southern pole moved along the (previously rotated) Greenwich meridian." but still I don't know how to convert this to the first one. | The new Top-Bar does not show reputation changes from Area 51. | eng_Latn | 29,078 |
I'm implementing a touch screen interface whereby the user dragging their finger on the screen will rotate the camera in a sphere around the central point. My code works well for the horizontal swipe (rotation around the y axis), but only covers a smaller angle from the vertical swipe (rotation around the x axis) instead of the full 360. I used the article - Spherical coordinate system from Wikipedia (), specifically the Cartesian Coordinates section, to do this. I've posted my code below. Can anyone help get this working through the full 360 degrees on both axes? See how Google Earth works when you swipe the globe for what I want. This must have been done a billion times, but Google searches get confused with the phone's built-in camera, GPS and the like. Thanks! //Get current position oldX = getPosition().x; oldY = getPosition().y; oldZ = getPosition().z; //convert to inclination/azimuth camRadius = Math.sqrt((oldX * oldX) + (oldY * oldY) + (oldZ * oldZ)); inclinationTheta = Math.acos(oldZ / camRadius); azimuthPhi = Math.atan2(oldY,oldX); . . . //use touch screen changes (deltaTouchX and Y) inclinationTheta = (inclinationTheta + deltaTouchScreenX) % (2 * pi); azimuthPhi = (azimuthPhi + deltaTouchScreenY) % (2 * pi); //get new co-ordinates newX = camRadius * Math.sin(inclinationTheta) * Math.cos(azimuthPhi); newY = camRadius * Math.sin(inclinationTheta) * Math.sin(azimuthPhi); newZ = camRadius * Math.cos(inclinationTheta); //set new position setPosition(newX, newY, newZ); | I'm drawing a scene where the camera freely moves about the universe. The camera class keeps track of the view (or look at) point, the position of the camera, and the up vector. These vectors/points are then passed into gluLookAt. Pan and zoom are nearly trivial to implement. However, I'm finding rotation about the look at point to be much more of an issue.
I want to write a function Camera.rotate that takes 2 angles, one that rotates up/down and one that rotates left/right along an imaginary sphere that is centered about the look at point. Is there an easy way to do this? I've (briefly) read about quaternions, but I wanted to see if there was an easier solution given the relatively simple construction of my scene. | The new Top-Bar does not show reputation changes from Area 51. | eng_Latn | 29,079 |
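The usual bug in the acos/atan2 round-trip shown in the question above is wrapping inclination mod 2π: `acos()` only returns values in [0, π], so vertical motion has to mirror at the poles (and the caller should flip its up vector there) rather than wrap. A pure-Python sketch of one orbit update around the origin (function name hypothetical, not any engine's API):

```python
import math

def orbit_step(x, y, z, d_incl, d_azim):
    """One spherical-coordinate orbit update around the origin.

    Keeps inclination meaningful by mirroring at the poles instead of
    taking it mod 2*pi. Returns (new_x, new_y, new_z, crossed_pole);
    when crossed_pole is True the caller should also flip its up vector
    for a Google-Earth-like feel.
    """
    r = math.sqrt(x * x + y * y + z * z)
    incl = math.acos(z / r)          # inclination, always in [0, pi]
    azim = math.atan2(y, x)

    incl += d_incl
    azim += d_azim

    # Crossing a pole puts the camera on the opposite meridian.
    crossed = False
    if incl < 0.0:
        incl, azim, crossed = -incl, azim + math.pi, True
    elif incl > math.pi:
        incl, azim, crossed = 2.0 * math.pi - incl, azim + math.pi, True

    return (r * math.sin(incl) * math.cos(azim),
            r * math.sin(incl) * math.sin(azim),
            r * math.cos(incl),
            crossed)
```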
A quick question on how keyframes / rendering an animation works within Blender. I want to render multiple images with different settings within some materials, different shapekeys and poses, but no movement at all. I thought the best way to achieve this is to render an x-frame sequence and have drivers tap in or out with 100% on each frame. So my question is: How does Blender handle this? For each frame, will Blender first apply all keyframes and then render? Also can I deactivate the interpolation thing, or can it happen that something is not 100% on frame 2 when I have set it 0% on frame 1 and 100% on frame 2? A short answer is all I need, thanks in advance. Also on a side note: How is Blender handling resources? Said file is getting kinda large with many meshes/materials and I have experienced slower performance while doing the same tasks (like creating another mesh :D). Making other stuff invisible didn't seem to change that. Moving them to another layer seemed better (maybe that in fact is freeing all memory for those items?). It is still very workable, but in general, if there is a way to make Blender kinda ignore all the stuff I don't need right now entirely without removing it from the file, that would be great. Thank you in advance, again. Have a nice time, Aaron. | In order to animate a character so that it grabs an object and throws it, I added a Child-Of constraint and keyframed its influence. In the F-curve Editor I can't set the curve which controls the influence to interpolation-mode Constant without changing the other curves (which should be linear or bezier). How can the interpolation mode of a curve be changed independently? | When applying to a PhD program in the US, how does the admissions process work? If an applicant is weak in a particular area, is it possible to offset that by being strong in a different area? Note that this question originated from this . Please feel free to edit the question to improve it. | eng_Latn | 29,080 |
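On the interpolation worry in this pair: on a keyframed frame itself the keyed value is always hit exactly; interpolation only affects the in-between (fractional) frames, and Constant interpolation simply holds the previous key's value until the next key. A minimal plain-Python sketch of the difference (not the bpy API, just the evaluation idea, ignoring Bezier handles):

```python
def eval_channel(keys, frame, mode="LINEAR"):
    """Evaluate a keyframed value the way an F-curve would.

    keys: sorted list of (frame, value) pairs.
    mode: "LINEAR" or "CONSTANT". Illustrative sketch only.
    """
    if frame <= keys[0][0]:
        return keys[0][1]
    if frame >= keys[-1][0]:
        return keys[-1][1]
    for (f0, v0), (f1, v1) in zip(keys, keys[1:]):
        if f0 <= frame <= f1:
            if mode == "CONSTANT":
                # Hold the previous key until the next key is reached.
                return v0 if frame < f1 else v1
            t = (frame - f0) / (f1 - f0)
            return v0 + t * (v1 - v0)
```

So a driver keyed 0% on frame 1 and 100% on frame 2 is exactly 100% when frame 2 is rendered, whichever interpolation mode is used; the scene is evaluated at the frame first, then rendered.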
Help the mesh is moving at the wrong part when I move the bone! | I have an armature set up, but the bones aren't moving the mesh properly The mesh and armature: The vertex group of the bone I am going to move: As you can see, the circled region doesn't belong to the vertex group. But, when I move the bone look what happens: That part of the mesh has moved with the bone.. Why? | Here is my player code. Rigidbody rb; Vector3 currMovement; public float jumpSpeed = 10; public float moveSpeed = 10; public float rotSpeed = 180; float distToGround; public float posSmoothTime = 0.5f; float currPos; float targetPos; float posVel; public float rotSmoothTime = 0.5f; float currRot; float targetRot; float rotVel; void Start() { rb = GetComponent<Rigidbody>(); distToGround = GetComponent<Collider>().bounds.extents.y; } bool isGrounded() { return Physics.Raycast(transform.position, Vector3.down, distToGround + 0.1f); } void Update() { Move(); } void Move() { // Rotation smoothing. targetRot = Input.GetAxisRaw("Horizontal") * rotSpeed * Time.smoothDeltaTime; if (targetRot > 360) targetRot -= 360; if (targetRot < 0) targetRot += 360; currRot = Mathf.SmoothDampAngle(currRot, targetRot, ref rotVel, rotSmoothTime * Time.smoothDeltaTime); transform.eulerAngles += new Vector3(0, currRot, 0); // Movement smoothing. targetPos = Input.GetAxisRaw("Vertical") * moveSpeed; currPos = Mathf.SmoothDamp(currPos, targetPos, ref posVel, posSmoothTime * Time.smoothDeltaTime); currMovement = new Vector3(0, 0, currPos); currMovement = transform.rotation * currMovement; if (isGrounded()) { if (Input.GetButtonDown("Jump")) rb.velocity += Vector3.up * jumpSpeed; } rb.position += currMovement * Time.smoothDeltaTime; } I have a Rigidbody attached to my player. I think the problem is with my camera script. Here is my camera script. 
public Transform player; Quaternion targetLook; Vector3 targetMove; public float smoothLook = 0.5f; public float smoothMove = 0.5f; public float distFromPlayer = 5, heightFromPlayer = 3; Vector3 moveVel; void LateUpdate() { CameraMove(); } void CameraMove() { targetMove = player.position + (player.rotation * new Vector3(0, heightFromPlayer, -distFromPlayer)); transform.position = Vector3.SmoothDamp(transform.position, targetMove, ref moveVel, smoothMove); targetLook = Quaternion.LookRotation(player.position - transform.position); transform.rotation = Quaternion.Slerp(transform.rotation, targetLook, smoothLook); } } The player is not a parent of my camera. When I parent the player to my camera the shake stops. But I want a custom smooth camera movement with my custom script, so I can't make the player a parent of the camera. | eng_Latn | 29,081 |
I am attempting to create a 24x36 map of New York City, and overlay it with a shapefile from City Planning. I'm using the Openlayers plugin, and I've tried all of the different types of base street map layers. I have been able to successfully convert the shapefile from a NAD 83 projection (EPSG 2263) to a WGS 84 projection (EPSG 3857), but I'm still suffering from some type of shift whenever I draw the map out on the print composer on a large scale piece of paper. The larger I try to print, the more dramatic the shift the misalignment. | I'm having trouble with the OpenLayers Plugin of Qgis and the map composer: If I create an OSM-background layer and if I want to export this, the OSM-Layer looks perfectly all right in the normal qgis program window. But in the map composer and after export the layer has shifted relatively to my other shape layers (EPSG:32633 - WGS 84 / UTM zone 33N). The second thing is that the output resolution of the exported osm-layer is very, very poor. A really bad way of getting around this whole trouble would be to increase the screen resolution and make a screenshot of the map composition window of qgis. But I don't think this would be very professional. It also would cause a lot of pain :) I'm using Qgis 1.8.0-Lisboa under Linux. The openlayers plugin is version 0.92. | I'm having problems with a WFS GetFeature request. I'm using OpenLayers with an OSM map as base layers which use the Google projection. Overlaying the layer with WMS works perfect. When I do a request to return all features within the bounds of a polygon, no features are found. My layer uses EPSG:28992. the base layer uses: EPSG:900913 My layer configuration in GeoServer: When I set the declared SRS to EPSG:900913 and set "SRS handling" to "Reproject native to declared" it works fine, but I need OpenLayers to reproject it, not GeoServer. Below is an example request I would do. I think the problem is that GeoServer does not convert the spatial filter to the native SRS. 
http://mydomain.com/wfs?request=GetFeature&version=1.1.0& srsName=epsg:28992& typeName=Namespace:Layer1& propertyName=(*)& CQL_FILTER=year = 2011 and INTERSECTS(geometry,POLYGON((565585.61613069 6816272.786200799,565585.61613069 6817032.3791692,564362.62367835 6817032.3791692,564362.62367835 6816272.786200799,565585.61613069 6816272.786200799)))& outputFormat=GMl2& Update I just tried re-projecting some points from OpenLayers (EPSG:900913) via proj4js to EPSG:28992 (PostgreSQL and GeoServer). I took 3 points and compared them. They all showed an offset. Converted: POINT(233492,24349424947 582191,0681508556) In Database: POINT(233530,852147064 582315,680920578) Difference x: 38,60865281453 Difference y: 124,6127697224 Converted: POINT(133459,1980658749 455770,7481867363) In Database: POINT(133486,356985081 455879,67223342) Difference x: 27,1589192061 Difference y: 108,9240466837 Converted: POINT(176920,43262818802 317930,36182729155) In Database: POINT(176954,154083767 318023,515324756) Difference x: 33,72145557898 Difference y: 93,15349746445 If the offset was constant I could correct it. Now I'm not sure if the offset is caused by GeSserver or by proj4js. I'm guessing proj4js since the WMS layer does not show any offset. | eng_Latn | 29,082 |
I've been trying to create a 400m buffer in a polygon layer in QGIS 2.18 and it seems like I may be doing something wrong compared to how the tool used to work. Does the tool still use the units of the CRS? When I try to create a 400m buffer or even a 10m buffer I end up just getting a large buffer that definitely isn't either one of those distances. I'm going to Vector->Geoprocessing Tools->Fixed Distance Buffer and setting things as follows in the screen shot. GeoJSON is in EPSG2768. Seems like the buffer tool wasn't using the defined CRS units for my layers. Checked the output buffer layer and it was using WGS84 instead of EPSG2768. CRS for the project and all the layers was set to EPSG2768. Things didn't work until I changed the CRS for the buffer layer and then re-ran the buffer tool. I set up a couple virtual machines to test things out with mixed results. Fresh Xunbuntu VM with QGIS installed and the buffer tool worked right off the bat, but a fresh KDE Neon VM with QGIS had the same issue I initially posted about. Things are working now though. | I have had no luck getting the buffer tool to accept anything but degrees as units of measure. I have found lots of stuff saying the layer needs to be reprojected and saved but it hasn't worked at all for me. Is there a way I could create a buffer without using ftools or at least force the units to meters somehow? As a workaround I converted meters to degrees (lat) and used that but the final product needs to be as close to reality as possible. Things I've tried: setting every unit option I could find to meters (where possible). setting everything to NAD83/Maryland (data is for Washington, DC) and saving it as such (as layers in ESRI shape files). reimporting the reprojected layers setting relevant layers to Google Mercator The was tried followed by creating a buffer. Many were tried in combination. 
QGIS 1.7.3 Slackware64 current (qgis from SBo-13.37 repo, tried on multilib and plain 64it with same results) | The problem I have tens of thousands of small polygons for which I want to create buffer/doughnuts for, but I don’t get the units right. I want meters and not degrees, and changing to (what I think is) the right CRS my layer doesn’t longer show. The polygons are small farming plots. I work in QGIS 3.4.15. Let me take you through it I start by adding a vector layer. I opt for ‘Directory’ as a source type (Encoding: UTF-8 by default) and under Source (Type: UK. NTF2 by default) I point to the folder in which I store the shapefile. The default CRS is set to EPSG:4326 – WGS 84, for both layer and the project (checked in bottom right), and unit is thus in degrees. I add the layer world map (EPSG:4326 – WGS 84 by default) by taping “world” in the coordinate box in the middle bottom, I also add Open street map (EPSG:3857 – WGS 84 / Pseudo-Mercator by default) from XYZ Tiles. The polygons appear nicely on the map considering some are just adjacent to the nearby lake, many nicely adjacent to roads, etc. When attempting to do buffers the distance is obviously set to degrees. I have tried to change the CRS, both for the layer(s) and the projects, but without luck. After researching (e.g. here: ) I believe that the correct CSR would be Arc 1960 / UTM zone 36S EPSG:21036 (it is my understanding that either 36N or 36S would work as the polygons appear both south and north of, but close enough to the equator). When setting the mentioned CRS for my polygon layer, the polygons disappear from the map, and remain gone even after setting the project CSR (by bottom right) to the same. It seems that whenever I try with different CRS in UTM, the layer doesn’t show. Does this mean that the polygons are somehow not compatible with units in meter? Is there an issue with importing the .prj file? Other things I've tried: I have made a good attempt searching for solutions. 
I have ensured that the polygon layer is on top of the layers list. I have ensured the file names are identical and tried all different suitable CRS I can think of. I have seen similar questions been solved by saving the .shp 'as', but I don't think that's my issue as it's already a .shp file. Can it be the case that I need to perform a datum transformation? I gave it a shot using a transformation code (1122) provided by above link, but then the polygons were projected on the Antarctica on the world map, so obviously something went quite wrong... Note that when I have changed to what I believe is the right CRS I can execute buffers in units of meters, however, they do not appear on the window… | eng_Latn | 29,083 |
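A quick sanity check for the buffer rows above: when a layer is still in degrees, a metric buffer distance comes out as a tiny fraction of a degree, which is a fast way to tell whether a tool interpreted your distance as metres or degrees. A rough spherical conversion (approximation for checking only; for the real buffer, reproject the layer to a metric CRS first, as the answers above describe):

```python
import math

def buffer_degrees(meters, lat_deg, earth_radius=6371008.8):
    """Rough degree equivalents of a metric distance at a given latitude.

    Spherical approximation (mean Earth radius), good enough to sanity-check
    buffer sizes. Returns (degrees_latitude, degrees_longitude).
    """
    meters_per_deg_lat = math.pi * earth_radius / 180.0
    meters_per_deg_lon = meters_per_deg_lat * math.cos(math.radians(lat_deg))
    return meters / meters_per_deg_lat, meters / meters_per_deg_lon
```

At mid-latitudes a 400 m buffer is only about 0.004 degrees, so a buffer that visibly swallows a city was almost certainly built with the distance read as degrees.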
How to derive the formula for the uncertainty of opposite in TOA? (let the opposite be h and the adjacent be x) | As a lab assignment I have to replicate experiment with spring oscillations, and eventually calculate spring constant. The measurements I took are: spring displacement from equilibrium $x$, mass of weights $m$, and period $T$. And presumed uncertainties: $\Delta_\alpha(x)=1[mm]$, $\Delta_e(x)=2[mm]$, $\Delta_\alpha(t)=0.1[s]$, $\Delta_e(t)=1[s]$. Where $\alpha$ stands for uncertainty of used equipment, and $e$ is readings uncertainty made by me. I calculated the standard uncertainty using this formula: $$\sqrt\frac{\Delta_\alpha(x)^2+\Delta_e(x)^2}{3}$$ Which produced $u(x)=1.29[mm]$. Then using the following substitutions: $$F=kx$$ $$mg=kx$$ $$x=\frac{g}{k}m$$ I assert that: $$tan(a)=\frac{g}{k}$$ Because in this part of experiment I plotted $x=f(m)$, using excel's linest, I get the slope $s$ $$tan(a)=\frac{g}{k}=s$$ Now, my question is, since I had uncertainty for $x$ introduced, and it is used to compute $tan(a)$, then the latter is affected and should have uncertainty of itself? How in this case should I go about propagating uncertainty to $tan(a)$? | At page 17 of Munkres' Elements of Algebraic topology it says, referring to fig. 3.6, that "the diagram does not determine [the torus]. It does more than paste opposite edges together": Is it so? I tried to glue them together, but end up with a normal torus. | eng_Latn | 29,084 |
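Both halves of this pair are instances of first-order (Gaussian) uncertainty propagation, $u(f)^2=\sum_i \left(\partial f/\partial x_i\right)^2 u(x_i)^2$. For the right-triangle question, with the opposite side $h = x\tan\theta$:

```latex
u(h)^2 = \left(\frac{\partial h}{\partial x}\right)^{\!2} u(x)^2
       + \left(\frac{\partial h}{\partial \theta}\right)^{\!2} u(\theta)^2
       = \tan^2\!\theta\; u(x)^2 + \frac{x^2}{\cos^4\!\theta}\; u(\theta)^2
```

with $u(\theta)$ expressed in radians. For the spring experiment, $k = g/s$ with the fitted slope $s = \tan a$, so $u(k) = \left|\partial k/\partial s\right| u(s) = (g/s^2)\,u(s)$; equivalently the relative uncertainties match, $u(k)/k = u(s)/s$, which is how the uncertainty from the linest slope propagates into the spring constant.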
I want to create a custom axis for a bone to rotate around, which won't be along any one of the axis X, Y or Z. I want to use it for a concept car which uses some body panels to use it as an active air brake. So they'll be operated in an angled way. Can anyone help... | I've been modeling an F-15, but when I rotate the flaps on Y-Axis the Flaps offset from the wing and they look like they are about to fall out, so I end up correcting it by rotating in the X and Z-axis to align them correctly. However when I rotate them back from the Y-Axis they look distorted again because I did rotations on the X and Z axis. I tried riding it, no help. I saw something about custom orientation to fix it but that was not clearly explained and it was done on an ideal plane that was (0,0,0); mine is pre-rotated on all axis. I'll take scripts if needed and I'm using Blender 2.8. When I rotate it on the Y-Axis: Fixing it using X and Z rotation: Rotating it back on the Y distorted it again: Here's the wing with ailerons and flaps: | What is a general rule for use of auxiliary verbs in sentence? Should we duplicate it or not? For instance, It is available for every item and (is) used with . . . | eng_Latn | 29,085 |
Is it possible to move an object in the 3D View on a plane of two axis like XY, XZ, YZ? | I want to be able to lock my translation/scale operations to two axes at a time, say xy, xz or yz. It currently takes two operations (one for each axis). Example, say for a cylinder, I would like to reduce the radius. I currently have to scale on the x axis first, then on the y axis, or I have to manually input the scale values in the object properties panel. This option is easily accessible in 3dsmax via the axis constraints command bar. Or I can click the intersecting area between the 3d manipulator of the axes in question and move in that plane. | I want to be able to lock my translation/scale operations to two axes at a time, say xy, xz or yz. It currently takes two operations (one for each axis). Example, say for a cylinder, I would like to reduce the radius. I currently have to scale on the x axis first, then on the y axis, or I have to manually input the scale values in the object properties panel. This option is easily accessible in 3dsmax via the axis constraints command bar. Or I can click the intersecting area between the 3d manipulator of the axes in question and move in that plane. | eng_Latn | 29,086 |
I'm an archaeology student working on a recent project and I'm wondering how to set my arcGIS project. I'm used to work with geographical data in ArcGIS on a large scale, but the current project is different, let me explain. I want to display and work with dispersion of artefacts on an archaeological site. This site is subdivided in 1 square meter units or parcels in which I want to display a point layer composed of each artefacts having x-y coordinates (from an arbitrary x0-y0 itself on the site). So all my coordinates make sense in the context of the site, but they are not georeferenced in any way on a geographical level. What projection should I use to work with this kind of data, if I want my square meters units (which will be polygons) to be perfectly square without distortion (kind of a non-earth system if that's a thing). My guess is that, on this geographical scale, distortion and error will be inexistant in any projection, but I want to make sure before starting the project in ArcGIS. | I don't know so much about coordinate systems... In my office we use to deal with spatial data coming from archaeological sites. Each site has its own x-y-z coordinate system (GCS). Three simple ortogonal cartesian axis. In the last years we have been managing this spatial data through GIS software (ArcGIS), without using specific coordinate system (just leave it as "undefined") I'd like to know if there exisits any GCS designed to deal with such datasets using simple cartesian orthogonal axis, without grid distortions of the typical GCS. In addition, I'd like to know if this system is suitable for using it in an online mapping application. By the way, we manage 2D (ArcMap) and 3D (ArcScene) environments and work with "mm" as length base unit. If such a thing doesn't exists, maybe someone knows how to create it. | The new Top-Bar does not show reputation changes from Area 51. | eng_Latn | 29,087 |
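What the two questions above describe is usually called an engineering (local, or non-earth) coordinate reference system: plain Cartesian axes in metres with no geodetic meaning, so square excavation units stay perfectly square at any scale and no map projection distortion exists at all. A minimal WKT1 fragment of that kind (all names arbitrary) that GDAL/ArcGIS-style software can attach to such data looks like:

```
LOCAL_CS["Site grid (1 m units)",
    LOCAL_DATUM["Site datum", 0],
    UNIT["metre", 1.0],
    AXIS["X", EAST],
    AXIS["Y", NORTH]]
```

Millimetre-based recording can either be scaled to metres on import or declared directly with `UNIT["millimetre", 0.001]`.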
i have created a dithered and stepped distance fading effect similar to lod transitions in video games. the parameter is controlled by the camera's texture coordinates. (basically a gradient attached to camera) this unfortunately results in vertices closer to the camera fading in first. see image: is there any way to keep this "distance to camera" value consistent for the entire object? like calculate the distance of the object's origin from the camera? or sample the current texture coordinate system form one vertex for the entire object? one solution would be to use drivers. see image: the issue with this is that drivers don't change for object instances, like geometry nodes scattering. any help or ideas would be greatly appreciated SOLUTION the second answer in the linked duplicate question, camera x, y and z position are added with a driver since there is only one camera. | I have a blender file where a sphere is orbiting a force field. I want it to be more red when it's closer to the center, and more blue when it's further. I already have the math figured out, I just need to know how to get x/y coordinates in nodes. Here's the file: | At the very beginning of SG1, the Goa'uld invade Earth, and then return somehow. How did they manage to return? There was no DHD, and somehow they didn't let it be known where they were going, IE, there was no log of the return. This seems a bit odd to me... | eng_Latn | 29,088 |
I have some lat/long in EPSG:4326 coordinate system like Eiffel Tower position : ["lat" : 48.858274,"lon" : 2.2944] For automated tests I would like to get, for every point of my dataset, the x and y coordinates of the location 100 meters east of my dataset's point. How can I calculate it? PS : it'll be implemented in Java | Online calculators such as (view page source) use the below formulas to get meters per degree. I understand in general how distance per degree varies depending on latitude location, but I do not understand how that translates to the below. More specifically, where do the constants, the 3 "cos" terms in each formula, and coefficients (2, 4, 6; 3, and 5) for "lat" come from? // Set up "Constants" m1 = 111132.92; // latitude calculation term 1 m2 = -559.82; // latitude calculation term 2 m3 = 1.175; // latitude calculation term 3 m4 = -0.0023; // latitude calculation term 4 p1 = 111412.84; // longitude calculation term 1 p2 = -93.5; // longitude calculation term 2 p3 = 0.118; // longitude calculation term 3 // Calculate the length of a degree of latitude and longitude in meters latlen = m1 + (m2 * Math.cos(2 * lat)) + (m3 * Math.cos(4 * lat)) + (m4 * Math.cos(6 * lat)); longlen = (p1 * Math.cos(lat)) + (p2 * Math.cos(3 * lat)) + (p3 * Math.cos(5 * lat)); | I am trying to save a GeoJSON file in QGIS as EPSG:4326, but it keeps being saved as CRS84. EPSG:4326 and CRS84 are the same except CRS84 saves coordinates as long, lat - and EPSG:4326 saves them as lat, long. When I open the saved file in QGIS it's read as EPSG:4326, but really the file is CRS84. I know this because when I open the GeoJSON file in notepad it's listed as CRS84, and when I load the file in Shapely it's recognized as CRS84. What's going on here? I've tried this on both QGIS 3.6.3 and QGIS 3.4.8 with the same result. | eng_Latn | 29,089 |
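The two posts in this row combine directly: the quoted series gives metres per degree of longitude at a latitude, and "100 m east" is just that many degrees added to the longitude. A Python sketch using exactly the constants quoted above (note the cosine terms take latitude in radians; the calculator converts from degrees first). A Java port would be line-for-line the same:

```python
import math

# Series constants quoted above (metres per degree on the WGS84 ellipsoid).
M1, M2, M3, M4 = 111132.92, -559.82, 1.175, -0.0023
P1, P2, P3 = 111412.84, -93.5, 0.118

def point_100m_east(lat_deg, lon_deg, meters=100.0):
    """Shift a WGS84 point east by `meters` using the series above.

    Accurate to well under a metre for this use case; latitude is
    unchanged because the shift is due east.
    """
    lat = math.radians(lat_deg)
    meters_per_deg_lon = (P1 * math.cos(lat)
                          + P2 * math.cos(3 * lat)
                          + P3 * math.cos(5 * lat))
    return lat_deg, lon_deg + meters / meters_per_deg_lon
```

For the Eiffel Tower point this moves the longitude by roughly 0.00136 degrees; at the equator, where a degree of longitude is about 111319.5 m, the shift is about 0.0009 degrees.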
Problem reprojecting LIDAR data with Liblas I am currently trying to reproject a LIDAR file (.las) using the Open-Source LibLAS, compiled on a Scientific Linux server. The Original data is in Oregon Lambert, NAD83 (EPSG:2992), and I am trying to reproject it to UTM WGS 84, Zone 10 N (EPSG:32610). However, when I use the --t_srs option, I get the following error: "error: X scale and offset combination is insufficient to represent the data" I am suspecting it has to do with the integer values stored and the distance of translation. How do I get it to work? Is there anything I have to change in the header information? Regards, Edit: Here is the command line I am using (lasinfo detects the original EPSG, so no need to specify it): ./las2las -i path/to/input/folder/*.las -olas -odir path/to/output -v --t_srs EPSG:32610 Result: $ Setting output SRS to EPSG:32610 $ error: X scale and offset combination is insufficient to represent the data | Reprojecting LiDAR data with libLAS, error: "latitude or longitude exceeded limits"? 
I have a lidar file (file.las) with the following spatial reference: Spatial Reference: PROJCS["NAD83 / UTM zone 11N", GEOGCS["NAD83", DATUM["North_American_Datum_1983", SPHEROID["GRS 1980",6378137,298.2572221010002, AUTHORITY["EPSG","7019"]], AUTHORITY["EPSG","6269"]], PRIMEM["Greenwich",0], UNIT["degree",0.0174532925199433], AUTHORITY["EPSG","4269"]], PROJECTION["Transverse_Mercator"], PARAMETER["latitude_of_origin",0], PARAMETER["central_meridian",-117], PARAMETER["scale_factor",0.9996], PARAMETER["false_easting",500000], PARAMETER["false_northing",0], UNIT["metre",1, AUTHORITY["EPSG","9001"]], AUTHORITY["EPSG","26911"]] Then I reproject it to WGS84 with las2las: las2las --a_srs EPSG:26911 --t_srs EPSG:4326 -i file1.las -o output.las And I get: Spatial Reference: GEOGCS["WGS 84", DATUM["WGS_1984", SPHEROID["WGS 84",6378137,298.257223563, AUTHORITY["EPSG","7030"]], AUTHORITY["EPSG","6326"]], PRIMEM["Greenwich",0], UNIT["degree",0.0174532925199433], AUTHORITY["EPSG","4326"]] But when I try to revert the transformation: from WGS84 to NAD83/UTM 11N, I get an error: las2las --a_srs EPSG:4326 --t_srs EPSG:26911 -i output.las -o wgs2utm.las ERROR 1: latitude or longitude exceeded limits error: Could not project point for ReprojectionTransform::latitude or longitude exceeded limits0. Why I got this error, if I am just reversing the transformation (originally from NAD83 to WGS84, and then from WGS84 to NAD83)?. | Georeferencing old map in QGIS? I have an old map over a part of Laos and wondering how do I georeference the map with the help of the Grid lines and other Longitude and Latitude information on the scanned map? I have searched for information on the internet and tried many times. My scanned map wont overly correctly on the base map. Any base map would be fine. As long as I can check the result. Do I add X/East and Y/North values from the grid line crossings as marked and written in some parts of the map? How do I begin georeferencing this map? 
Do I have to add zeros after the two digits? For example if the Easting is 24 (its in KM right? Do I add like three/four zeros to convert it to meters? I georeferenced the map with the option "From Map Canvas" in the georeferencer tool and I succeeded but I want to know the coordinate of a point on my old scanned map and write them as X/East and Y/North to georeference the map. I hope I am not babbling and someone understands what I mean! We can set the CRS to WGS84, EPSG:4326. | eng_Latn | 29,090 |
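The las2las error in the first post of this row is about LAS integer encoding: the format stores each X/Y/Z as `round((value - offset) / scale)` in a signed 32-bit integer, so a header whose scale/offset was chosen for one CRS can overflow once the coordinates become UTM metres. A quick pure-Python check of the idea (illustrative, not the liblas API):

```python
INT32_MIN, INT32_MAX = -2**31, 2**31 - 1

def fits_las_header(values, scale, offset):
    """Check whether coordinate values survive LAS integer encoding.

    LAS stores each coordinate as round((v - offset) / scale) in a signed
    32-bit integer; the 'scale and offset combination is insufficient'
    error means this overflows for some point after reprojection.
    """
    raws = [round((v - offset) / scale) for v in values]
    return all(INT32_MIN <= r <= INT32_MAX for r in raws)
```

The usual fixes are a coarser scale (e.g. 0.01 instead of a very fine one) or an offset re-centred near the data's minimum; check your las2las build's help for the header scale/offset options it exposes.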
Send parameters to the same page with a link and get them without a page refresh in PHP I have a page (showmarker.php) that is used with Google Maps and markers: marker.set("type", "point"); marker.set("id", c); google.maps.event.addListener(marker, 'click', (function(marker, c) { return function() { window.location.href = "showmarker.php?location=" + c; canter=myLatLng; } })(marker, c)); This function runs when each marker is clicked. I want to send the c parameter to showmarker.php without a page refresh and read it with PHP. Please help me... | HTTP GET request in JavaScript? I need to do an HTTP GET request in JavaScript. What's the best way to do that? I need to do this in a Mac OS X dashcode widget. | Defining coordinate reference system with rotation in GeoServer? I am using GeoServer and have a layer in EPSG:900913 ("Google Mercator"). I need to "rotate" the map around a certain point (say, 1500000, 7000000) by a certain angle (say, 30 degrees clockwise). How could I define such a coordinate system based on EPSG:900913? GeoServer's does not work for my purposes as I need to tile the map later on. As far as I understand this, my only option is to define my own coordinate system. For GeoServer I'd need to . The configuration seems to be straightforward, but I have a difficulty defining my rotated CRS in .
I am wondering how to apply a rotation around certain point onto a CRS like Google Mercator: PROJCS["WGS84 / Google Mercator", GEOGCS["WGS 84", DATUM["World Geodetic System 1984", SPHEROID["WGS 84", 6378137.0, 298.257223563, AUTHORITY["EPSG","7030"]], AUTHORITY["EPSG","6326"]], PRIMEM["Greenwich", 0.0, AUTHORITY["EPSG","8901"]], UNIT["degree", 0.017453292519943295], AXIS["Longitude", EAST], AXIS["Latitude", NORTH], AUTHORITY["EPSG","4326"]], PROJECTION["Mercator_1SP"], PARAMETER["semi_minor", 6378137.0], PARAMETER["latitude_of_origin", 0.0], PARAMETER["central_meridian", 0.0], PARAMETER["scale_factor", 1.0], PARAMETER["false_easting", 0.0], PARAMETER["false_northing", 0.0], UNIT["m", 1.0], AXIS["x", EAST], AXIS["y", NORTH], AUTHORITY["EPSG","900913"]] My questions, specifically: How to write a WKT which transform an existing CRS? My guess would be that I need a new PROJCS wrapping an existing one and adding a PROJECTION clause. How would I found out the projection id (like Mercator_1SP above) and the required parameters (the PARAMETER clauses)? Can I "reference" EPSG:900913 in CRS WKT instead of copy-pasting the whole PROJCS clause? | eng_Latn | 29,091 |
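One practical step toward the rotated-CRS question above: whichever rotated definition is chosen (an oblique Mercator with an azimuth parameter is the usual candidate for "Mercator rotated about a point"), projection parameters are specified in geographic coordinates, not in EPSG:900913 metres, so the rotation centre (1500000, 7000000) first has to be inverted back to lat/lon. The spherical-Mercator formulas used by EPSG:900913 are simple enough to invert by hand:

```python
import math

R = 6378137.0  # sphere radius used by EPSG:900913 / EPSG:3857

def mercator_inverse(x, y):
    """EPSG:900913 metres -> (lat, lon) in degrees, spherical formulas."""
    lon = math.degrees(x / R)
    lat = math.degrees(2.0 * math.atan(math.exp(y / R)) - math.pi / 2.0)
    return lat, lon

def mercator_forward(lat, lon):
    """(lat, lon) in degrees -> EPSG:900913 metres, spherical formulas."""
    phi, lam = math.radians(lat), math.radians(lon)
    return R * lam, R * math.log(math.tan(math.pi / 4.0 + phi / 2.0))
```

For the point in the question this yields roughly (53.1 N, 13.5 E), which is what would go into the rotated projection's centre parameters.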
Units of angular speed vs linear speed I'm studying pendulums, and the units have me a bit confused. In a pendulum, Iw^2=mgL. Putting together the units of the left side of the equation, we get gm^2 (rads^-1)^2=g(m^2)(s^-2)(rad^2). But the units of the right side of the equation equal g(m^2)(s^-2) Where does the unit radian fit in? How do I deal with the unit radian when dealing with measures of linear speed? | How do the radian have a unit? The radian is defined as the ratio of the circumference and the radius. Both are measured in meters. So there should not be a unit for that. But we use 'rad' as the unit of the radian value. The coefficient of static/kinetic friction also the same, it is a ratio of both forces. Therefore it doesn't have a unit. So, is there a special reason to have a unit for the radian value? | What units does the rotation argument expect when creating objects? I want to create a lamp in blender using python. I want to give the lamp some rotation. As per the I can set the initial rotation using a tuple of numbers in the function call like this rotation=(0.0, 0.0, 0.0) this is the specifc documentation for the rotation parameter: rotation: (float array of 3 items in [-inf, inf], (optional)) – Rotation, Rotation for the newly added object What units is this parameter specified with? It does not seem to be in degrees or radian. When I try with my example code here where I use rotation=(0, 1, 0) to get 1 degree on the y axis. I end up getting 57.296 degrees after running the code. import bpy bpy.ops.object.lamp_add(type='AREA', view_align=False, location=(0, 0, 0), rotation=(0, 1, 0), layers=(True, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False)) | eng_Latn | 29,092 |
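The resolution to this pair is that the radian is a ratio of two lengths (arc length over radius), so it is dimensionless and can be inserted or dropped freely in unit bookkeeping. Checking the pendulum relation $I\omega^2 = mgL$ term by term (with the moment of inertia in $\mathrm{kg\,m^2}$):

```latex
[I\omega^2] = \mathrm{kg\,m^2}\left(\frac{\mathrm{rad}}{\mathrm{s}}\right)^{\!2}
            = \mathrm{kg\,m^2\,s^{-2}}\cdot\mathrm{rad}^2 ,
\qquad
[mgL] = \mathrm{kg}\cdot\mathrm{m\,s^{-2}}\cdot\mathrm{m}
      = \mathrm{kg\,m^2\,s^{-2}}
```

Since $\mathrm{rad} = \mathrm{m}/\mathrm{m} = 1$, the two sides agree. The same bookkeeping links angular and linear speed: $v = \omega r$ gives $(\mathrm{rad\,s^{-1}})\cdot\mathrm{m} = \mathrm{m\,s^{-1}}$, the radian simply vanishing.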
Mapping Node, Box projection, and rotation Goal I'm trying to map a texture on a cube; Since my texture is not exactly fitting the correct orientation, I need to rotate it. The results are not as expected. Setup description I'm using the setup shown in the following pictures. I've set the projection mode to Box have the texture visible on all faces. Texture mode is set on Repeat to avoid undesired clamping (default value). What I'm trying to achieve is to have the planks horizontal instead of vertical on the front face (rotate around X axis). The rotation is applied in a Mapping Node. Results The face I need is properly rotated, but the others get this blurry effect, which reminds me of the "Clamp" mode in OpenGL. Is this expected behavior ? How to solve this ? **Additional note (edits) ** Both the cube and the material are simplified versions of the original problem. The meshes I have are more complex, a few thousands faces, and with complex shapes (such as switches or sockets outlets). Very few faces are not aligned with the world axes, and thus they can be ignored. The material needs to be applied on various shapes, so UV map can't be made manually. The purpose of this is to automatically create a diffuse map of each textured item, where every face with a similar direction/normal would have a similar rotation applied. For simplificity sake, the cube seems to be a good sample | Is there implementation of tri-planar mapping in Cycles renderer? Is it possible to have tri-planar mapping in cycles? Like in corona renderer or fstorm? | How do I make 'X-Axis Mirror' available in Pose Mode? More often that not, while posing an armature, I want to mirror the pose across the X-axis, so that when I move "brow.L", "brow.R" also moves in the same way. You can copy and paste a mirrored pose, or select both bones prior to performing a transform, but both of these options aren't particularly good workflow wise. 
In edit mode, in the tools panel, under options you can enable "X-Axis Mirror" to copy all transforms across the X-axis in real time, but this option is not available in pose mode. I'm not very experienced with Python, but edit source reveals: col.prop(ob.data, "use_mirror_x") My understanding is that "use_mirror_x" refers to a boolean property, but I have no idea how to find where that boolean property is defined or whether it would be possible to modify it to get the same result in pose mode. Any advice on this is appreciated. Maybe there's a better way to achieve this? Thanks.
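The mirroring itself is simple math that a script could apply to selected pose bones even without a built-in toggle: reflect across the YZ plane by negating the X component of the location and the Y/Z components of the rotation quaternion. A plain-Python sketch of that arithmetic (no bpy needed to test the idea; a real add-on would read and write pose_bone.location and pose_bone.rotation_quaternion, names assumed from the Blender Python API):

```python
def mirror_pose_x(location, quaternion):
    """Mirror a bone transform across the X axis (i.e. across the YZ plane)."""
    x, y, z = location
    w, qx, qy, qz = quaternion
    return (-x, y, z), (w, qx, -qy, -qz)

# A 90-degree twist about Z becomes a -90-degree twist on the mirrored side.
c = 0.7071067811865476  # cos(45 deg) == sin(45 deg)
print(mirror_pose_x((0.1, 0.0, 0.0), (c, 0.0, 0.0, c)))
```

Applying the function twice returns the original transform, as a mirror should.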
align objects on a surface I have two cube objects and I want to align their bottoms to sit on one surface, like they are buildings on the ground. I don't want to change the objects' centers | Snap face to grid I need to move an entire object, so that one of the faces is exactly at Z=0. What is the best way to do this? Basically, if I could select Grid or 3D Cursor in the Snap To menu, that would be perfect: My current workaround is to select the face, copy its Z coordinate, select all, move it all down by that same amount. Is there a faster way? | Orthographic scale of camera in Blender Could somebody explain how orthographic scale is calculated in Blender? I am trying to match an orthographic view that I get from another piece of software with that of Blender, but I am not able to. The camera gets into the correct position but the zoom level is wrong. Thanks, byte
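One scriptable route, sketched here with pure Python: find how far the lowest point of each object sits above Z = 0 and shift the whole object by that amount. Translating the object (rather than editing its mesh) keeps each origin in the same place relative to the geometry:

```python
def drop_to_ground(points):
    """Translate a set of (x, y, z) points so the lowest z lands on 0."""
    lowest = min(z for _, _, z in points)
    return [(x, y, z - lowest) for x, y, z in points]

cube_corners = [(0, 0, 2), (1, 0, 2), (0, 1, 4), (1, 1, 4)]
print(drop_to_ground(cube_corners))   # lowest corners now at z = 0
```

In Blender the same offset would be applied to the object's location, using the world-space bounding box to find the lowest point.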
I am trying to get an orthographic render out of Blender with precise accuracy. The scale that I would like is 1"=1'. To prepare my scene, I have changed units to Imperial with separate values. I then changed the output dimensions to 7200x5400 - so that I can get a 24"x18" output at 300ppi. My problem now is setting the orthographic scale. I have tried 12 (for 12 inches in a foot), but the resulting render is too small. After fiddling with the scale manually, I figure that I'd need something less than 7.4, but I don't know how to get a precise scale for my use case. | I'm using 3D to speed up production of 2D graphics of machines and structures (which have a lot of straight lines, objects and little details repeated all over the place). The goal is to have an isometric background image that repeats seamlessly. I set the camera with orthographic projection and 45° of rotation, and manually cut the vertical borders in an image editor at the position where it repeats. Then I can do something like this: Instead of cutting it manually every time I do a test, I would like to set the camera view position and height from Blender. Then I can render an already seamlessly repeating image. Those two Empty objects at the boundaries are where the camera borders should be. However, I have no clue how to accurately set up the camera in this case. How can I do this? | Set the Max depth parameter of both the Reflection and Refraction roll-out to 10. I am not sure whether roll-out should be plural or not in this particular case.
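For the first question the scale can be derived rather than found by fiddling. At 1" = 1', a 24"-wide sheet depicts 24 ft of scene, and Blender's orthographic scale is that covered width expressed in scene units (meters internally) — which lands almost exactly on the "something less than 7.4" found by eye:

```python
paper_width_in = 24                     # long side of the 24" x 18" output
scene_feet_shown = paper_width_in * 1   # 1 inch of paper = 1 foot of scene
FT_TO_M = 0.3048                        # international foot in meters
ortho_scale = scene_feet_shown * FT_TO_M
print(ortho_scale)                      # 7.3152
```

The 7200x5400 pixel setting only controls resolution; the orthographic scale alone fixes how much of the scene the 24" dimension covers.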
Rectangular/Portrait orientation. Is there any way to change the size of the camera? I'm trying to make a rectangular/portrait-orientation-style image for the camera in Blender 2.8. I've tried adjusting the aspect ratio, but all it does is stretch the image. Is there any way you can adjust the size of the camera? How it is now. How I want it to be. | How to change the camera from Landscape to Portrait I need a way to render a picture that is longer height-wise than width-wise. Changing the scale of the camera does nothing. I've been told to look in the scene focus settings, and I didn't see anything. Is there a way to change the shape of the rendered image? | How can I rotate about an arbitrary point in 3D (instead of the origin)? I have some models that I want to rotate using quaternions in the normal manner, except instead of rotation about the origin, I want it to be offset slightly. I know that you don't say, in 3D space, that you rotate about a point; you say you rotate about an axis. So I'm visualizing it as rotating about a vector whose tail is positioned not at the local origin. All affine transformations in my rendering / physics engine are stored using SQT (scale, quaternion, translation; an idea borrowed from the book Game Engine Architecture). So I build a matrix each frame from these components and pass it to the vertex shader. In this system, translation is applied, then scale, then rotation. In one specific case, I need to translate an object in world space, scale it, and rotate it about a vertex not centered at the object's local origin. Question: Given the constraints of my current system described above, how can I achieve a local rotation centered about a point other than the origin? Automatic upvote to anyone who can describe how to do this using only matrices as well :)
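For the third question (rotation about an arbitrary point), the standard matrix answer is to conjugate the rotation with translations: M = T(p) · R · T(−p), i.e. move the pivot to the origin, rotate, then move back. A minimal 4×4 sketch — row-major, column vectors, rotation about Z only for brevity:

```python
import math

def mat_mul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def translation(tx, ty, tz):
    return [[1, 0, 0, tx], [0, 1, 0, ty], [0, 0, 1, tz], [0, 0, 0, 1]]

def rotation_z(theta):
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0, 0], [s, c, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]

def rotate_about(pivot, theta):
    """Rotation about the Z axis through an arbitrary pivot: T(p) * R * T(-p)."""
    px, py, pz = pivot
    return mat_mul(translation(px, py, pz),
                   mat_mul(rotation_z(theta), translation(-px, -py, -pz)))

def apply(m, v):
    x, y, z = v
    return tuple(m[i][0] * x + m[i][1] * y + m[i][2] * z + m[i][3] for i in range(3))

# Rotate (2, 1, 0) by 90 degrees around pivot (1, 1, 0): expect (1, 2, 0).
print(apply(rotate_about((1.0, 1.0, 0.0), math.pi / 2), (2.0, 1.0, 0.0)))
```

In an SQT pipeline the same effect can be had by folding T(p)·R·T(−p) into the per-frame matrix build, leaving the stored quaternion untouched.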
Can't access GeoServer after CORS filter To enable the CORS filter I added the following to the web.xml file in webapps/Geoserver/WEB-INF:

<filter>
  <filter-name>cross-origin</filter-name>
  <filter-class>org.eclipse.jetty.servlets.CrossOriginFilter</filter-class>
...
<filter-mapping>
  <filter-name>cross-origin</filter-name>
  <url-pattern>/*</url-pattern>
</filter-mapping>

After adding the above, I restarted GeoServer. Now, when I access GeoServer, I get the following error: HTTP ERROR: 503 Problem accessing /geoserver/. Reason: Service Unavailable | Enabling CORS in GeoServer (jetty)? I hope somebody has already figured this one out. I just installed GeoServer 2.9 on a vanilla Ubuntu 16.04 distro. The GeoServer 2.8 method of enabling CORS with the shanbe.hezoun class no longer works with Jetty 9.2.13. There are mentions that CORS support is already packaged with Jetty 9.2.13 in the jetty-servlets.jar. The Jetty lib which is compiled with GeoServer contains a jetty-servlet-9.2.13.v20150730.jar in geoserver/lib but not jetty-servlets.9.2.13.v20150730.jar. Are these supposed to be the same jar with a different name? It should be possible to enable CORS either in geoserver/etc/webdefault.xml or in geoserver/webapps/geoserver/WEB-INF/web.xml. My understanding is that webdefault.xml is applied first and web.xml thereafter. I have tried the following filter in both XML files. I haven't got as far as adding a filter mapping. Adding the filter alone will cause the GeoServer/Jetty service to not start properly.

<filter>
  <filter-name>cross-origin</filter-name>
  <filter-class>org.eclipse.jetty.servlets.CrossOriginFilter</filter-class>
</filter>
The final output needs to be in geocentric coordinates and I can during initialization get any number of points on the map such as the corners, center, western most along the center etc. During run time I would like to have a transformation matrix that can be applied to a position in local space expressed in meters and get back an approximate GCC coordinate. The center of the point in GCC: -1645380 -4885138 3752889 The bottom left corner of the map in GCC: -1644552, -4881054, 3749220 During run time I need to be able to multiply (-250, -250, 0) with the transformation matrix and get something close to (-1644552, -4881054, 3749220). I've spent more than a week trying to research a solution for this and so far nothing. Given an identity matrix as our starting position and orientation. Then using geotrans and the known lat, long, and height of the starting position I get an x,y,z geocentric coordinate. A vector from the origin of (0,0,0) gives both the up and translation for the matrix. However, I need the forward and right so that I can pass a distance in meters from the origin into the transformation matrix and get a roughly accurate GCC. Do I have all of the inputs I need to calculate the right and forward? Inputs Origin: 0, 0, 0 Global: -1645380, -4885138, 3752889 Up (Normalized): Global - Origin Desired Outputs Right: ? ? ? Forward: ? ? ? So with the right and forward added to the up and translation I already calculated I would have a transformation matrix. I could then apply the matrix to a vector of say (50, 50, 0) and get something within about 0-3 cm of the output if I feed the lat/long back into geotrans. This matrix would only ever be used small maps of about 500m x 500m so the curvature of the earth is negligible. In reply to whuber, I don't know your level of experience so I will start with the very basics. An identity matrix is a 4x4 matrix that you can multiply by a vector and it returns the vector unchanged. Such as below. 
1 0 0 x=0 0 1 0 y=0 0 0 1 z=0 0 0 0 1 The x, y, and z of the matrix are how much to translate the vector by and the first 3 rows and columns are the rotation. Below is a tranformation matrix that does nothing to the orientation of a vector but does translate. This is what you would want for two axis aligned worlds in different locations. 1 0 0 13 0 1 0 50 0 0 1 -7 0 0 0 1 If you were to multiply a vector of (10, 10, 10) with the above transformation matrix you would get an output of (23, 60, 3). My problem is the axes are not aligned and the "plane" of the world I am trying to get the local coordinate converted to is projected on the ellipsoid of the earth. Geotrans is library that you can use to pass a coordinate from one system such as geodetic or GDC (lat, long, height) and get back another such as geocentric or GCC (x,y,z). For example: If my game world was representing a map centered exactly on the prime meridian and equator the transformation matrix would look something like below. I am just guesstimating the Y and Z rotations and might have them flipped and/or the wrong sign. 0 0 1 6378137 0 1 0 0 1 0 0 0 0 0 0 1 | eng_Latn | 29,097 |
The chain stay angle is the angle between chain stay and seat tube? I need a new front derailleur for my Shimano Acera T3000 gear shift. I have to choose between: They differ only in the chain stay angle they are made for. The info I find on what the "chain stay angle" is seems partly contradictory or unclear. So I want to make sure I've got this: the chain stay angle is the angle between the chain stay and the seat tube - is that correct? Or, more precisely, the angle between AB and AC with: A: bottom bracket axis B: axis of rear wheel C: "middle point" at the top of the seat tube Here's a picture with measured angles: of the angles in question, only the one between seat tube and chain stay (68°) falls into the interval [63,69] ... so, it has to be that one, I suppose. From my mechanical, geometrical intuition I don't see why that angle is relevant for the FD. It should be ∠(KK',KC) in my opinion!? | Selecting a front derailleur: parameters The spec sheet for the Shimano Alivio FD-M430 front derailleur contains a number of points that I don't understand. What does "Maximum capacity: 22 teeth" mean? Why are "Top gear teeth" constrained to "44/48T"? Is there anything wrong with 46? What is "Cable Routing: dual-pull type"? And most puzzling, what is "Chain Stay Angle 63-66 / 66-69" and how do I measure mine? | Why does a delta/wye transformer make a 30-degree phase shift? I've heard that delta/delta and wye/wye transformers do not introduce any phase shift. In delta/wye or wye/delta transformers, there is a 30-degree phase shift between the primary and secondary coils. So, why is there a phase shift, and why is it 30 degrees? Why not 60 or 120? I googled and found calculations with phasor diagrams that prove the 30-degree phase shift, but I'm confused because of too many calculations and diagrams; I didn't get it. Would you give me a simple answer, please? I prefer the physical meanings and concepts rather than equations and mathematics. Thank you very much,
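For completeness, the angle in question is plain vector geometry: the angle at the bottom bracket A between rays AB (chain stay) and AC (seat tube). A sketch with made-up but plausible frame coordinates in millimeters — rear axle 430 mm back and 60 mm up, seat tube rising at 73° from horizontal:

```python
import math

def angle_at(a, b, c):
    """Angle at vertex a between rays a->b and a->c, in degrees."""
    abx, aby = b[0] - a[0], b[1] - a[1]
    acx, acy = c[0] - a[0], c[1] - a[1]
    cos_ang = (abx * acx + aby * acy) / (math.hypot(abx, aby) * math.hypot(acx, acy))
    return math.degrees(math.acos(cos_ang))

A = (0.0, 0.0)                                   # bottom bracket axis
B = (-430.0, 60.0)                               # rear axle
C = (500.0 * math.cos(math.radians(107.0)),      # top of seat tube (73 deg slope)
     500.0 * math.sin(math.radians(107.0)))
print(round(angle_at(A, B, C), 1))               # ~65.1, inside [63, 69]
```

With numbers like these a measured value of 68° falling inside the catalog's [63, 69] range is entirely plausible, supporting the reading above.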
Displaying rotated grid data in Leaflet with GeoServer WMS I am trying to display rotated grid data from NetCDF in Leaflet. When I try to use the "WMS angle parameter" () I get some strange results like this: Any idea why this happens? I have already seen that there is the possibility to create a custom projection which allows adding rotation parameters, but I could not figure out what the correct usage is with my own projection. Currently I am using (EPSG:25832) () It would be nice to find a solution to display the rotation without manipulating the raw NetCDF data. | Defining coordinate reference system with rotation in GeoServer? I am using GeoServer and have a layer in EPSG:900913 ("Google Mercator"). I need to "rotate" the map around a certain point (say, 1500000, 7000000) by a certain angle (say, 30 degrees clockwise). How could I define such a coordinate system based on EPSG:900913? GeoServer's does not work for my purposes as I need to tile the map later on. As far as I understand this, my only option is to define my own coordinate system. For GeoServer I'd need to . The configuration seems to be straightforward, but I have difficulty defining my rotated CRS in .
I am wondering how to apply a rotation around a certain point onto a CRS like Google Mercator:

PROJCS["WGS84 / Google Mercator",
  GEOGCS["WGS 84",
    DATUM["World Geodetic System 1984",
      SPHEROID["WGS 84", 6378137.0, 298.257223563, AUTHORITY["EPSG","7030"]],
      AUTHORITY["EPSG","6326"]],
    PRIMEM["Greenwich", 0.0, AUTHORITY["EPSG","8901"]],
    UNIT["degree", 0.017453292519943295],
    AXIS["Longitude", EAST],
    AXIS["Latitude", NORTH],
    AUTHORITY["EPSG","4326"]],
  PROJECTION["Mercator_1SP"],
  PARAMETER["semi_minor", 6378137.0],
  PARAMETER["latitude_of_origin", 0.0],
  PARAMETER["central_meridian", 0.0],
  PARAMETER["scale_factor", 1.0],
  PARAMETER["false_easting", 0.0],
  PARAMETER["false_northing", 0.0],
  UNIT["m", 1.0],
  AXIS["x", EAST],
  AXIS["y", NORTH],
  AUTHORITY["EPSG","900913"]]

My questions, specifically: How do I write a WKT which transforms an existing CRS? My guess would be that I need a new PROJCS wrapping an existing one and adding a PROJECTION clause. How would I find out the projection id (like Mercator_1SP above) and the required parameters (the PARAMETER clauses)? Can I "reference" EPSG:900913 in CRS WKT instead of copy-pasting the whole PROJCS clause? | How to superimpose LaTeX on a picture? I know how to add a picture to my LaTeX file, e.g. \includegraphics[width=7cm]{curve.pdf}. What I'd now like to do is place some LaTeX symbols onto the picture, like a \gamma to name a curve. I do not want to do that in the graphics editing program, e.g. because I might want to rename that curve later and don't want to repeat the editing process. I know I can do a \put in a picture environment. But this environment forces me to specify its size at the beginning. That would be cumbersome since I'd have to go open the graphics file and do some calculations. Is there a way to get the dimensions (or aspect ratio) of a graphics file within LaTeX, so I might use it there?
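If the custom-CRS route proves fragile, note that the underlying operation is just an affine rotation of projected coordinates about the pivot, which could instead be applied to the data (or to request coordinates) before serving. A sketch of that step, clockwise positive as in the question:

```python
import math

def rotate_cw(x, y, cx, cy, angle_deg):
    """Rotate (x, y) clockwise by angle_deg around the pivot (cx, cy)."""
    t = math.radians(angle_deg)
    dx, dy = x - cx, y - cy
    return (cx + dx * math.cos(t) + dy * math.sin(t),
            cy - dx * math.sin(t) + dy * math.cos(t))

# A point 100 m east of the pivot, rotated 90 degrees clockwise, ends up 100 m south.
print(rotate_cw(1500100.0, 7000000.0, 1500000.0, 7000000.0, 90.0))
```

Whatever a rotated-CRS WKT would express, it must reduce to this transformation of EPSG:900913 coordinates, so the function doubles as a check on any candidate definition.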
But even if I could do these calculations automatically, I still wouldn't like the redundancy of specifying the width for \includegraphics again, but I could live with that. Also, is it possible to specify the size in cm to the picture environment? I just tried and got some errors. So basically what I am looking for is a solution as simple as possible which: a) lets me specify a width or a height and a file, and displays the image scaled while keeping its aspect ratio and then b) lets me put LaTeX stuff inside the image in relative coordinates (percentages or 0...1), so I won't have to redo anything should I decide that the image needs to be a bit larger or smaller. (b is kind of like setting \unitlength to the whole width=height for the 'picture' environment. But of course most images are not square and I also don't appreciate the redundancy (but could live with it).) | eng_Latn | 29,099 |
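One common way to get exactly (a) and (b) — sketched here on the assumption that the overpic package is available: with its percent option the \put coordinates run 0–100 across the image regardless of its printed size, so labels survive later resizing:

```latex
% Preamble:
\usepackage[percent]{overpic}

% In the document: same interface as \includegraphics, plus \put in
% relative coordinates (0-100 of the image width/height).
\begin{overpic}[width=7cm]{curve.pdf}
  \put(55,40){$\gamma$}   % label the curve at 55% across, 40% up
\end{overpic}
```

The grid option of the overpic environment temporarily prints a coordinate grid over the image, which makes finding the right \put coordinates quick; remove it for the final version.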