Looping for importing and interpolating a lot of LIDAR files?

I'm quite new to GRASS GIS and I'm trying to make a DEM (digital elevation model) for modelling flooding. I have 170 *.xyz files. Each needs to be scanned for its extent (r.in.xyz -s input=… output=filename fs=…), then the region needs to be set with the values just obtained (g.region n=… s=… e=… w=… res=2), the lidar data imported (r.in.xyz input=… output=filename fs=…), and finally that data interpolated (r.surf.idw input=filename output=filenameidw).

I did it successfully with one file, but it would be a waste of time to do it manually for 170 files. Can anyone recommend a loop I could use?

Here's a snippet of a bash shell script I use:

```shell
for f in precip_accum*.txt; do
    precip_rast=$(basename ${f} .txt)
    precip_recl=${precip_rast}_recl
    reg=$(r.in.xyz -s -g input=${f} output=dummy fs=, | awk '{print $1" "$2" "$3" "$4}')
    g.region --quiet $reg
    # Also set resolution to 1/30 degree (about 3 km)
    g.region --quiet res=0.033
    r.in.xyz --quiet --overwrite input=${f} output=${precip_rast} fs=, method=mean
    r.null --quiet $precip_rast setnull=0
    # … more lines with additional processing …
done
```

I'd like to add that with dense lidar point data you might not need to interpolate at all, which gives a big saving in time. If you have at least one lidar point per raster cell, then just use (as above) method=mean and each raster cell will get the mean height of the lidar points in that cell.

For a python script you would start with something like:

```python
import os
import grass.script as grass

for f in os.listdir("… directory of your files …"):
    if f.endswith(".xyz"):
        f_out, ext = os.path.splitext(f)
        grass.run_command('r.in.xyz', input=f, output=f_out)
```

and similarly call the rest of the GRASS modules you need.
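Putting the pieces together, here is a hedged sketch of a loop over all the tiles. The module names (r.in.xyz for scanning/importing, r.surf.idw for interpolation), the pipe field separator, and the sample filenames are assumptions based on the snippets above; adapt them to your data. Setting GRASS=echo makes it a dry run that only prints each command.

```shell
# Sketch only: module names, fs separator and filenames are assumptions.
# GRASS=echo gives a dry run; make it empty and run inside a GRASS session.
GRASS=echo

for f in tile_001.xyz tile_002.xyz; do   # in practice: for f in *.xyz; do
    base=$(basename "$f" .xyz)
    # 1. scan the file for its extent (in a real run, reg holds "n=… s=… e=… w=…")
    reg=$($GRASS r.in.xyz -s -g input="$f" output=dummy fs="|")
    # 2. set the region to that extent at 2 m resolution
    $GRASS g.region $reg res=2
    # 3. import the lidar points
    $GRASS r.in.xyz input="$f" output="$base" fs="|"
    # 4. interpolate the imported points with IDW
    $GRASS r.surf.idw input="$base" output="${base}idw"
done
```

With the dry-run variable removed, the loop should walk through all 170 files unattended.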

Can an Interpolation function be 'saved'? [duplicate]

I am importing tables with sizes of 100k - 500k rows and 4 columns. The larger the table, the slower my Mathematica runs for all computations after the import. I import the table and then apply Interpolation to them. After that the tables are no longer required and I use just the Interpolation functions.

So my question is: can I somehow restart the kernel to get rid of the large data tables but keep the interpolation functions calculated from them? Or is some other solution possible? Thanks!

How can I create a smooth looping animation for cloth simulation?

I have created a waving flag simulation by pinning a set of points, adding cloth physics, and setting up wind force fields. The result looks fine when I render the scene, except that I want the resulting PNG files to form a continuously looping animation. Currently the animation jumps when the loop transitions from the last frame back to the first.

I have tried exporting to lightwave format, importing it back in and inserting appropriate shapekeys to get an interpolation between the last and first frame, (as described here), but it seems the export-import step messes up both the orientation and the basis for the shape keys. I tried exporting-importing twice and I also tried using a custom plug-in that fixes the orientation issue, but my shape keys at 1.0 value are always stretched out and rotated compared to the same shape key at 0.0. The result is that after interpolation, the flag not only waves, but also rotates and stretches.

I am thinking about trying one of the following routes:

Somehow edit the necessary shape keys at value 1.0 so that they match the orientation and rotation at 0.0.

Look into the export-orientation problem and see if I can fix it myself.

Try to morph from last frame to first frame of the rendered result using some other tool.

What do you think is the easiest and best way to go to get a nice loop transition?

After trying sambler's suggestion of using two modifiers with varying influence across the transition between the last and first frames, the mesh appears rotated and scaled in the frames where the modifier influence is interpolated.

Utilizing GIS Data for Asset Management

Surveyor verifying data within the coded GIS framework.

Photo Courtesy of Metro Consulting Associates

Loading survey parcel geometries into the GIS database for secure web access.

Photo Courtesy of Metro Consulting Associates

In the last few years, the utility industry has been migrating towards Geographic Information Systems (GIS) as their go-to asset management system. This technology is a game-changer for asset management as it not only stores geographically referenced geometries for utility features like welds, valves, substations, transformers, and towers/poles, but acts as a database system for applying domains and coded values to standardize utility attributes. Recently, this has led to the development of utility as-built standards delivered within a fully digital GIS, enabling organizations to publish survey quality as-built data across multiple platforms. In addition to the traditional uses of this high-quality data by engineering and surveying departments, the user-friendly platforms hosted via secure websites provide instant access for real estate and upper management decision makers.

To establish a baseline for the discussion of the growth of GIS in the last few years – specifically in the multiple web-based platforms that provide user-friendly environments to analyze, edit and view essential data – the following offers a brief overview.

Growth of Secure Web Services

At its core, GIS is a system designed to capture, store, manipulate, analyze, manage, and present spatial or geographic data with attributes. The need for a tool that can communicate with the high-level GIS software suite, yet allow accessibility and interoperability across platforms and organizations, has driven innovation. This tool should be able to provide secure access from anywhere and everywhere to high-quality, spatially referenced data. A tool with this capability could transform the way organizations operate internally. Enter the service-centric, secure, web-based GIS.

Advancements in database connections through web services enable key players involved with a project to co-exist in the same database in real time.

When used as an asset management tool, GIS has a proven track record of success – fostering solid communication amongst teams to positively transform corporate culture and management. The advancements in database connections through web services have pushed the system to the cutting edge of technology, allowing access via smartphones, tablets, and the more traditional laptop/desktop environment. These advancements enable key players involved with a project to co-exist in the same database – engineers, surveyors, consultants, real-estate professionals, and management staff – in real time. The multiple platform/multi-user environment of GIS delivers vital information that removes the guesswork, with detailed property data, survey grade locations of utility features and other infrastructure inventory items, operations and maintenance data and logs, general aerial images, and more at your fingertips.

Let’s take a closer look at how each department within the organization can utilize this multi-centric, accessible environment.

Departmental Efficiency Through Communication

Communication is the fundamental keystone of any relationship. A breakdown in communication costs organizations money. The basic principle behind the multi-centric environment of a modern GIS is to avoid breakdowns in communication by fostering it throughout organizations. And it starts at the survey level.

GIS Integration with Surveying

The process of building a useable GIS management hub begins on the GIS database side. The three major tasks within this process are:

  1. Standardizing feature names with all data owners and users
  2. Establishing attributes and coded domain values for the newly minted features
  3. Interpolating this information into a usable data dictionary for the surveyor's data collection device.

The feature standardization process requires the input of all data owners, as well as the end users who will be accessing the information via the secure web services. This includes receiving input from GIS, engineering, surveying, construction, real estate, and project management staff. Each group may have opinions based on regulatory requirements or industry specific terminology that will need to be considered to build the most accurate and useful feature framework.

Upon completion, these same stakeholders need to agree upon the required feature attributes and their corresponding domain values for each one in the newly created framework. These attributes and domain values are the meat and potatoes of the GIS database system. They will provide the necessary data, as collected in the field, to perform routine and complex analysis of assets.

The next piece of the puzzle is integration of the GIS database feature domains into a survey-grade GNSS unit. This can be achieved by exporting the database schema. There are a few programs on the market that can easily interpolate between differing data dictionary formats. Having a great interpolation program can save a lot of headaches and help avoid duplicate work. This process will not only ensure field data is collected with the best attribute data possible, but, because it utilizes the feature framework with coded domain values created in a GIS database environment, it will create efficiency when importing the survey-grade data back into GIS.

Additionally, it is a useful practice to convert any office engineering or surveying data (parcels and easements) into GIS from CAD. This can be accomplished via multiple workflows. If the proper workflow is chosen, you can interpolate all owner information out of CAD with the legal descriptive parcel shapes. The same can be done with any engineering data that is essential to O&M tasks. This is an invaluable approach for communication between maintenance, real estate and asset management staff.

After-Survey Processing

By investing heavily in a GIS environment from the beginning of the process, you will have the necessary elements to mold a customizable, user-friendly, secure application for your organization. The multi-centric experience starts here.

At this point, all data collected in the field and developed by engineering and survey office staff is converted into a GIS database format. As web services are built, database administration via cloud or in-house storage units is taking place. From within a versioned database workspace, with edit tracking enabled, a living, breathing feature layer is born that has the ability to be utilized in the multi-centric environment of GIS.

Now that we have our baseline data in a GIS environment, we can start to add in supporting information. One example of an auxiliary information source is UAV/UAS time-lapse imagery of a sub-surface construction project. This imagery can be utilized as a historical basemap of the subsurface features as installed in-situ. Likewise, UAV/UAS has the ability to obtain up-to-date imagery of organizational managed rights-of-way to identify potential encroachments.

Another new data collection method that has been growing exponentially in the last few years is mobile LiDAR scanners with orbital imagery capability. These datasets can be added into the growing multi-centric, GIS milieu to provide a 360-degree virtual scan of topographic features. This is extremely useful in a dense, urban environment where landscapes can change on a near monthly basis. Having historical orbital imagery can make it easy for real estate employees to perform analysis. And where exurban landscapes exist, this data is vital for taking snapshots in time of an organization’s rights-of-way to identify potential encroachments, perform overhead line sag analysis, and take inventory of natural features.

Moreover, this 3D UAV and LiDAR information can be represented in a 2D space within GIS. A specific case to showcase the value of having 3D information represented in a 2D space is engineering stationing of pipeline assets. Once field data with XYZ information is imported into a GIS, there are special linear reference tools that enable 3D distances on a 2D line representing a pipeline route. This can be extremely useful to aid O&M staff field services.

Asset Management: Real Estate and Beyond

Now that we’ve built a data-dense, multi-user environment with survey grade data – supplemented with new data collection techniques and engineering information – let’s take a look at a case study in how the multi-centric environment fosters effective communication.

Case Study in Real Estate Management

As a high voltage, overhead transmission line is in the first stages of planning, the GIS department takes the initial design from the engineering staff and loads the proposed route into a GIS database. A selection based on known easement widths from the centerline is made from county parcel and assessor information. This will form the initial GIS database structure used to track landowner contacts.

Once this GIS parcel, centerline, and easement information is enabled in the multi-centric environment, real estate professionals can begin to access the system via a secure web portal to update landowner contact reports and parcel acquisition status. The process of negotiating with landowners for access is the first piece of essential field work that needs to be communicated across the organization. As this information is edited in the secure environment by real estate professionals, a viewer-only version can be analyzed by management staff and accessed in the field by surveyors to see which landowners have agreed to access rights. This is a nonverbal communication technique that creates a smart, real-time hub that can help prevent landowner customer service issues from occurring.

The Right Team

As organizations begin to buy into GIS software as an asset management solution, it is essential to have the proper GIS professionals and consultants in place from the very beginning. The exponential growth of GIS as a multi-centric environment has made having the appropriate staffing in place even more critical. The GIS teams around the world working on these complex systems need to have team members with a solid knowledge base in all disciplines within utility companies. Additionally, the web GIS environment has added “web developer” to the GIS analyst job description. The ultimate goal of the multi-centric GIS environment is to enable secure access from anywhere and by all interested parties to high-quality, spatially referenced data. Once this system becomes enabled within an organization, GIS will become synonymous with communication and create efficiencies in field work, business analysis, asset management, and beyond.

2 Answers

interpolate $f_1$ and $f_2$ : linear interpolation on irregular grid,

evaluate $f_1$ and $f_2$ on the entire grid (interpolate missing data):

plots of the interpolated functions in the style of @kickert's solution:

The Predict function can provide you the information you need.

Start by importing your data into Mathematica. For me, it was easiest to change the file extensions to .txt and use SemanticImport.

Then pull out the subset with i2=4.

You can now thread your (x0, y0) values to the f1 values:

At this point you have some choices to make around the Method and PerformanceGoal options you use for the Predict function. We could go deep into the weeds on this, but I created some training and test data, ran through all the options, and found that GradientBoostedTrees was the best compromise between quality and computational time.

With the Predictor you just created, you can run the missing data through it.

Then combine the inputs and outputs and Join the lists

Using a ListDensityPlot, you get this:

Looking at the ListPointPlot3D you can see it isn't perfect, but it is very close.

If you want to use this for f2, then follow the same process, pulling your data from subset[[All, {1, 2, 4}]] and creating a new predictor.

This import file is an export done with a tool such as exp73, exp or expdp.

The manner of importing you are referring to applies when you have data in an Excel sheet with comma-separated values: SQL Developer will create a large insert statement with the data for you and run it, which will populate a table.

What you want to do is use Oracle import tools such as impdp or imp.

Do you happen to have the export's log? There might be a file with the same name that ends with .log.

Here is an example of syntax on how to import specific tables of an Oracle export

You will want to run the following query to find where your data pump directory location is, and place the export in the corresponding folder:
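As a hedged sketch of both pieces (the user/password, dump file name, and table names are placeholders, and DATA_PUMP_DIR is the usual default directory object; adjust to your database):

```shell
# Placeholders throughout; requires a reachable Oracle instance.

# 1. Find the filesystem folder the data pump directory object points to:
sqlplus -s system/password <<'SQL'
SELECT directory_name, directory_path FROM dba_directories;
SQL

# 2. Copy the export (.dmp) file into that folder, then import only the
#    tables you need:
impdp system/password directory=DATA_PUMP_DIR dumpfile=export.dmp \
      tables=SCOTT.EMP,SCOTT.DEPT logfile=import.log
```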

4 Answers

64-bit Windows only

Note for Mathematica 11.3: There is a potential conflict between MathMF and the built-in MediaTools package. See here for details and here for an example of how to use MediaTools in place of MathMF.

Note for Mathematica version 10: The Wolfram Library has been updated in version 10 and you will need to recompile the MathMF DLL. This is most easily accomplished by evaluating "MathMF"//FindLibrary//DeleteFile prior to loading the package.

I have written a package called MathMF which uses a LibraryLink DLL to do frame-by-frame video import and export with Windows Media Foundation. It should be able to read a reasonable variety of movie files, including AVI, WMV and MP4. Exporting is currently limited to WMV and MP4 formats (AVI encoding is not natively supported by Media Foundation)

Here is the sort of code you can write with it. The code first opens a video file for reading, and creates a new video file for writing to. It then runs a loop in which each frame is sequentially read from the input stream, processed in Mathematica and then written to the output stream. So Mathematica is effectively being used as a video filter.

The package can be downloaded from the GitHub link at the top of this post, it is too large to include in full here.

The package includes the library source code, and on first use will attempt to compile the library locally. I believe the compilation should work if you have Visual Studio 2010 or later installed, and probably won't work if you use a different compiler. There is a pre-built DLL available if the compilation fails (see the readme on GitHub for more details)

I hope some people find this useful, it has been hovering in my mind as something to try to do for quite some time, hindered mainly by my total lack of experience with C++ and COM programming.

Instead of directly importing an .mdd file you may want to use a mesh cache modifier.

A mesh cache modifier can read .mdd and .pc2 files and then replaces the object's mesh data with the file's mesh for a given frame. Modifier properties allow adjustments for how the animation is played back.

If the .mdd option for importing isn't available, load the add-on in User Preferences:

File >> User Preferences >> Add-ons >> Import-Export: NewTek MDD format (Enable the add-on)

I quit using After Effects a while ago; I think the last time I used it was about 3 years back, but I remember how to do this.

Press Ctrl/Command + Shift + C.

This command will make one layer (a composition in After Effects; a MovieClip in Flash).

Move the playhead (the pointer on the timeline) to the end of your animation, at the last frame where you want the animation to stop.

This is called a "freeze frame" in After Effects (and "stop()" in Flash).

Goto "Edit>Split Layer". it will trim the layer and create a duplicate layer above your selected layer. and that layer would begin right after the playhead.

Select the newly created upper layer, right-click and select Freeze Frame, or use the menu "Layer > Time > Freeze Frame".

Your layer will freeze at that point; you can extend or trim this layer as long as you want.

4 Answers

If you want to say OR, use a double pipe (||).

(The original OP code using | was simply piping the output of the left side to the right side, in the same way any ordinary pipe works.)

After many years of comments and misunderstanding, allow me to clarify.

Whether you use [ or [[ or test or (( all depends on what you need on a case by case basis. It's wrong to say that one of those is preferred in all cases. Sometimes [ is right and [[ is wrong. But that's not what the question was. OP asked why | didn't work. The answer is because it should be || instead.
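As a minimal sketch of the difference (POSIX shell; the variable names are just illustrative):

```shell
x=5

# || is a logical OR between commands: the right-hand side runs only
# if the left-hand side fails.
if [ "$x" = 4 ] || [ "$x" = 5 ]; then
    result="match"
else
    result="no match"
fi
echo "$result"            # prints: match

# A single | is a pipe: it feeds the left command's stdout into the
# right command's stdin, which is why it cannot express OR.
piped=$(echo "hello" | tr 'a-z' 'A-Z')
echo "$piped"             # prints: HELLO
```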
