Conversion of CSV file in MyGeoData Converter - XY columns not recognized

Recently I have tried to import various CSV files with XY coordinates into MyGeoData Converter. In some cases I got the message:

Selected data does not have a spatial part!

For example, this file works fine:

id,name,coordx,coordy
1,Start,18.63,49.5
2,End,18.65,49.42

With this file I can see the correct extent on the map, assign a coordinate system, and transform to the desired coordinate system and format. But I get a wrong result with this file (no spatial part detected):

id,name,koordinatenx,koordinateny
1,Start,18.25,41.16
2,End,18.26,41.12

It seems the columns with XY coordinates were not recognized.

Any idea?


There is a rule for detecting columns containing coordinates, based on the attribute name. A coordinate column is detected if the attribute name of the X coordinate is one of:

x, xcoord, xcoordinate, coordx, coordinatex, longitude, long

or the attribute name contains:

x_*, *_x

Similarly for the Y coordinate:

y, ycoord, ycoordinate, coordy, coordinatey, latitude, lat

or the attribute name contains:

y_*, *_y

Rename the koordinatenx and koordinateny attributes, for example to x, y or long, lat - then it should work…
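The detection rule above can be sketched as a simple name-matching heuristic (a hypothetical re-implementation for illustration, not MyGeoData's actual code):

```python
X_NAMES = {"x", "xcoord", "xcoordinate", "coordx", "coordinatex", "longitude", "long"}
Y_NAMES = {"y", "ycoord", "ycoordinate", "coordy", "coordinatey", "latitude", "lat"}

def is_x_column(name: str) -> bool:
    # Exact match against the known names, or the x_* / *_x pattern.
    n = name.lower()
    return n in X_NAMES or n.startswith("x_") or n.endswith("_x")

def is_y_column(name: str) -> bool:
    n = name.lower()
    return n in Y_NAMES or n.startswith("y_") or n.endswith("_y")

print(is_x_column("coordx"))        # True
print(is_x_column("koordinatenx"))  # False - no rule matches, so no spatial part
```

Under this rule, koordinatenx and koordinateny match nothing, which explains the "no spatial part" message.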


Adding GeoPandas Dataframe to PostGIS table?

I would like to upload this GeoDataFrame to a PostGIS table. I have a database set up with the PostGIS extension already, but can't seem to add this DataFrame as a table.

I have tried the following:

4 Answers

Recently, geopandas gained a to_postgis method. Woohoo!

Note: you will need psycopg2-binary, sqlalchemy, and geoalchemy2 installed.
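A minimal sketch of that approach; the connection URL and table name here are hypothetical placeholders, and the tiny GeoDataFrame just mirrors the data from the first question:

```python
import geopandas as gpd
from shapely.geometry import Point
from sqlalchemy import create_engine

def upload_gdf(gdf, db_url, table="my_table"):
    """Write a GeoDataFrame to PostGIS via geopandas' to_postgis."""
    engine = create_engine(db_url)
    # if_exists accepts "fail", "replace", or "append", as in pandas to_sql.
    gdf.to_postgis(table, engine, if_exists="replace")

# A small GeoDataFrame like the CSV data above.
gdf = gpd.GeoDataFrame(
    {"name": ["Start", "End"]},
    geometry=[Point(18.63, 49.5), Point(18.65, 49.42)],
    crs="EPSG:4326",
)
# upload_gdf(gdf, "postgresql://user:password@localhost:5432/gis")  # hypothetical URL
```

The upload call is commented out because it needs a live PostGIS database.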

Answered 3 months ago by Brylie Christopher Oxley with 4 upvotes

I have a solution that requires only psycopg2 and shapely (in addition to geopandas, of course). It is generally bad practice to iterate over (Geo)DataFrame objects because it is slow, but for small frames or one-off tasks it will still get the job done.

Basically, it works by dumping the geometry to WKB format in a separate column and then re-casting it to the GEOMETRY type on insert.

Note that you will have to create the table ahead of time with the right columns.
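A sketch of that WKB round trip; the table and column names are made up, and the table must already exist (say, with name text and geom geometry columns):

```python
from shapely.geometry import Point
import geopandas as gpd

def rows_with_wkb(gdf):
    """Return (name, hex-encoded WKB) tuples ready for insertion."""
    return [(row["name"], row.geometry.wkb_hex) for _, row in gdf.iterrows()]

def insert_gdf(gdf, conn, table="points"):
    """Insert a GeoDataFrame into an existing PostGIS table via a psycopg2 connection."""
    with conn.cursor() as cur:
        for name, geom_hex in rows_with_wkb(gdf):  # slow, but fine for small frames
            # PostGIS parses hex-encoded WKB directly as geometry input.
            cur.execute(
                "INSERT INTO " + table + " (name, geom) VALUES (%s, %s)",
                (name, geom_hex),
            )
    conn.commit()

gdf = gpd.GeoDataFrame({"name": ["Start"]}, geometry=[Point(18.25, 41.16)])
print(rows_with_wkb(gdf)[0][0])  # Start
```

The insert_gdf call itself needs a live database connection, so only the WKB-dumping half is exercised here.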

Answered 3 months ago by wfgeo with 1 upvote

I have also had the same question you've asked and have spent many, many days on it (more than I care to admit) looking for a solution. Assuming the following PostgreSQL table with the PostGIS extension,

this is what I finally got working:

I can't say whether my database connection logic is the best, since I basically copied it from another link and was just happy that I was able to successfully automap (reflect) my existing table with the geometry definition recognized. I've been writing Python-to-SQL spatial code for only a few months, so I know there is much to learn.

Answered 3 months ago by user1745564 with 5 upvotes

Using Pandas' to_sql method and SQLAlchemy, you can store a dataframe in Postgres. And since you're storing a GeoDataFrame, GeoAlchemy2 will handle the geom column for you. Here's a code sample:

It's worth noting that the if_exists parameter allows you to control how the dataframe will be added to your Postgres table:
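The if_exists values are "fail" (the default), "replace", and "append". Their behavior can be demonstrated with an in-memory SQLite database standing in for Postgres (no PostGIS involved, just the plain to_sql mechanics):

```python
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("sqlite://")  # in-memory stand-in for your Postgres engine
df = pd.DataFrame({"id": [1, 2], "name": ["Start", "End"]})

df.to_sql("points", engine, index=False)                       # creates the table
df.to_sql("points", engine, index=False, if_exists="append")   # now 4 rows
df.to_sql("points", engine, index=False, if_exists="replace")  # dropped, recreated: 2 rows

print(pd.read_sql("SELECT COUNT(*) AS n FROM points", engine)["n"][0])  # 2
```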


CSV (Document)

File extension .csv
Category Document File
Description A CSV file is a way to collect the data from any table so that it can be conveyed as input to another table-oriented application such as a relational database application. Microsoft Excel, a leading spreadsheet application, can read CSV files. A CSV file is sometimes referred to as a flat file.
Actions CSV to XLS - Convert file now
View other document file formats
Technical Details In computers, a CSV file stores the values in a table as a series of ASCII (American Standard Code for Information Interchange) text lines, organised so that each column value is separated by a comma from the next column's value and each row starts a new line. CSV is one example of a delimited text file; many implementations allow different separators, such as semicolons or tabs, to be used. However, CSV differs from other delimiter-separated file formats in using a double-quote character around fields that contain reserved characters (such as commas or newlines). The benefit of this approach is that it allows the transfer of data across different applications.
Associated programs The CSV file format is very simple and supported by almost all spreadsheets and database management systems
Developed by Microsoft
MIME type text/comma-separated-values
Useful links More detailed information on CSV files
How to use CSV files
Programs that open CSV files
CSV Converter
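The quoting rule described under Technical Details can be demonstrated with Python's standard csv module:

```python
import csv
import io

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["id", "note"])
# Fields containing reserved characters (commas, quotes, newlines) are
# wrapped in double quotes, and any embedded quotes are doubled.
writer.writerow([1, 'contains, a comma and a "quote"'])
print(buf.getvalue())  # 1,"contains, a comma and a ""quote"""
```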

XLS (Document)

File extension .xls
Category Document File
Description Microsoft Excel is a commercial spreadsheet application written and distributed by Microsoft for Microsoft Windows and Mac OS X. Pre-2007 versions of Excel use XLS as the primary format for saving files. It features calculation, graphing tools, pivot tables, and a macro programming language called Visual Basic for Applications. It has been a very widely applied spreadsheet for these platforms, especially since version 5 in 1993, and it has almost completely replaced Lotus 1-2-3 as the industry standard for spreadsheets. Excel forms part of Microsoft Office. The current versions are 2010 for Microsoft Windows and 2011 for Mac OS X.
Actions XLS Converter
View other document file formats
Technical Details Microsoft Excel up until 2007 version used a proprietary binary file format called Binary Interchange File Format (BIFF) as its primary format. Used as the basis for XLS files it is a persistence format that supports authoring and manipulating content in workbooks and workbook templates. Most versions of Microsoft Excel can read CSV, DBF, SYLK, DIF, and other legacy formats.
Associated programs Microsoft Excel
Microsoft Excel Viewer
OpenOffice
Developed by Microsoft
MIME type application/vnd.ms-excel
Useful links More detail about the XLS format
How to open an XLS file without Microsoft Excel
Microsoft Office binary file format specifications
XLS Converter

Convert CSV file

Using Zamzar it is possible to convert CSV files to a variety of other formats


How to import a CSV file with transactions into Xero

At this step, you should already have created a CSV file. You can download it from your online banking, or you can create it with a ProperSoft converter, for example the Bank2CSV converter. IMPORTANT: Bank2CSV has now been replaced by the Transactions app, which converts from and to more formats.

When you convert to CSV, select Xero as the 'CSV Target' for your CSV file, so the file will be properly formatted for Xero. If your CSV file was downloaded from your online banking, you may need to map it manually.

First, click 'Accounting' and then click 'Bank accounts'. Now select the bank account or credit card account to import transactions into.

For example, say you have a checking account. Look for 'Manually import a statement', or click 'Manage Account' and look for the 'Import a Statement' link. In your accounting software, look for 'Upload a bank file' or a similar link to upload the created CSV file.

Click the 'Browse' button and select a CSV file.

Then click the 'Import' button.

Map your CSV file columns to Xero transaction fields. Make sure the CSV file was converted correctly. You can review the transactions before importing them into the Xero account.

The names 'Date', 'Amount', 'Payee', 'Description', 'Reference', and 'Check number' come from the CSV file. When you create a CSV file with the ProperSoft converter and select Xero as the 'CSV Target', it adds these header column names to the CSV file, which is easier than mapping them yourself when you import into Xero. It also names the columns so that Xero can map them automatically: in this case, Date is automatically mapped to the transaction 'Date', Amount to 'Amount', Payee to 'Payee', and Description to 'Description'.

Only two columns, 'Reference' and 'Check number', are not mapped automatically. They will be in your CSV file if you convert with the ProperSoft converter; if the source file has checks with check numbers, the 'Check number' column will be filled. Any additional information from your OFX or PDF file goes into the 'Reference' column, which you can map as well. QuickBooks doesn't have this extra column, but Xero does, which is very convenient when you want to store additional details about transactions.

Map 'Reference' and 'Check Number', and then you can review the transactions if you like. Click 'Next' and you can see the dates and amounts as they will appear in Xero.

And click 'Ok'. Now your transactions are imported.

The next step is to map payee names to vendor records. Say your payee is something like 'Shell 1', 'Shell 2', or just 'Shell', and you have a vendor called Shell in your Xero accounting; you would then select that vendor record in Xero for the transaction. Categorize transactions by selecting the account, and the transactions will appear under that account.

Related articles

Copyright © 2008-2021 ProperSoft Inc. All rights reserved. Privacy Policy


SSIS Data Import from CSV to SQL Table Failing [closed]

I have been trying to import data from a CSV file into a SQL table. I have 7 columns:

  1. Refresh Date - DB_Date
  2. Report Date - DB_Date
  3. Report Period - Four-byte signed integer
  4. Participated - Four-byte signed integer
  5. Organized - Four-byte signed integer
  6. Peer-To-Peer - Four-byte signed integer

I am using Flat File Source --> Data Conversion (to convert the data in my CSV to match the data types in the table above) --> OLE DB Destination.

I am getting the following errors:

[Data Conversion [2]] Error: Data conversion failed while converting column "". Report Refresh Date"" (37) to column "Copy of ". Report Refresh Date"" (11). The conversion returned status value 2 and status text "The value could not be converted because of a potential loss of data.".

[Data Conversion [2]] Error: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "Data Conversion.Outputs[Data Conversion Output].Columns[Copy of ". Report Refresh Date"]" failed because error code 0xC020907F occurred, and the error row disposition on "Data Conversion.Outputs[Data Conversion Output].Columns[Copy of ". Report Refresh Date"]" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure.

[SSIS.Pipeline] Error: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "Data Conversion" (2) failed with error code 0xC0209029 while processing input "Data Conversion Input" (3). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information about the failure.

I am also attaching a screenshot of what my CSV file looks like.

Any help would be appreciated, as I have a ton of data that needs to be loaded into SQL tables from CSV files.


How can I convert a CSV file to XML?

On the community wiki page about converting there is a link to a command-line tool called csv2xml. Since it is unmaintained, you might want to choose another option.

There is also mention of a Java tool called csv2xml (warning: the website is in German) and a command-line tool called ff-extractor.

The page also has references to Python, Perl, PHP, and XSLT approaches, but those mean you need to code the converter yourself.

When you know the format of the CSV file and the structure you need in the XML file, it's fairly straightforward to write a script that handles the conversion.

You can create the following xml file:

With the following script:

Even if you have never coded before, I think this should be easy to use and modify. The file is read line-by-line in the while loop.

IFS is the internal field separator. The IFS=
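The answer's shell script is not reproduced here, but the same line-by-line idea can be sketched in Python; the column names and XML element names below are invented for the example:

```python
import csv
from xml.sax.saxutils import escape

def csv_to_xml(rows, root="records", record="record"):
    """Convert an iterable of CSV rows (first row = header) to an XML string."""
    reader = iter(rows)
    header = next(reader)
    parts = [f"<{root}>"]
    for row in reader:
        fields = "".join(
            f"<{name}>{escape(value)}</{name}>" for name, value in zip(header, row)
        )
        parts.append(f"  <{record}>{fields}</{record}>")
    parts.append(f"</{root}>")
    return "\n".join(parts)

rows = csv.reader(["id,name", "1,Start", "2,End"])
print(csv_to_xml(rows))
```

This reads the header once, then emits one record element per data row, escaping any reserved XML characters in the values.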

Keeping the file open longer than needed

Once you've run fastaLine = file.readlines(), you're done with the file and can close it; that is, you're done with the with open block and can un-indent the code after it. Of course, as Reinderien pointed out, it's probably better to process each line as you iterate over it. (This is especially important with huge files, where you might want to load a single line at a time into memory rather than the entire file.) So you'll want that with open block to look more like this:

Note that I've un-indented the amino acid count comment (and everything following it).
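As a sketch of the suggested structure, with a made-up count_lines_starting_with helper standing in for the reviewed counting logic:

```python
def count_lines_starting_with(path, prefix=">"):
    """Count lines beginning with a prefix, streaming one line at a time."""
    count = 0
    with open(path) as f:   # the file is open only inside this block
        for line in f:      # no readlines(): one line in memory at a time
            if line.startswith(prefix):
                count += 1
    return count            # the file is already closed here
```

Iterating over the file object directly keeps memory usage flat even for very large FASTA files.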


Project description

simpledbf is a Python library for converting basic DBF files (see Limitations) to CSV files, Pandas DataFrames, SQL tables, or HDF5 tables. This package is fully compatible with Python >=3.4, with almost complete Python 2.7 support as well. The conversion to CSV and SQL (see to_textsql below) is written entirely in Python, so no additional dependencies are necessary. For other export formats, see Optional Requirements. This code was designed to be simple, fast, and memory-efficient for convenient interactive or batch file processing; therefore, it lacks many features, such as the ability to write DBF files, that other packages might provide.

Bug fixes, questions, and update requests are encouraged and can be filed at the GitHub repo.

This code is derived from an ActiveState DBF example that works with Python2 and is distributed under a PSF license.

Optional Requirements

  • Pandas >= 0.15.2 (Required for DataFrame)
  • PyTables >= 3.1 (with Pandas required for HDF tables)
  • SQLalchemy >= 0.9 (with Pandas required for DataFrame-SQL tables)

Installation

The most recent release of simpledbf can be installed using pip or conda, if you happen to be using the Anaconda Python distribution.

The development version can be installed from GitHub:

As an alternative, this package only contains a single file, so in principle, you could download the simpledbf.py file from Github and put it in any folder of your choosing.

DBF File Limitations

This package currently supports a subset of dBase III through 5 DBF files. In particular, support is missing for linked memo (i.e. DBT) files. This is mostly due to limitations in the types of files available to the author. Feel free to request an update if you can supply a DBF file with an associated memo file. DBF version 7, the most recent DBF file spec, is not currently supported by this package.

Python 2 Support

Except for HDF file export, this code should work fine with Python >=2.7. However, HDF files created in Python3 are compatible with all Python2 HDF packages, so in principle, you could make any HDF files in a temporary Python3 environment. If you are using the Anaconda Python distribution (recommended), then you can make a small Python3 working environment as follows:

HDF file export is currently broken in Python2 due to a limitation in Pandas HDF export with unicode. This issue may be fixed in future versions of Pandas/PyTables.


Export Excel to CSV with UTF-8 or UTF-16 encoding

If your Excel spreadsheets contain special symbols, foreign characters (tildes, accents, etc.) or hieroglyphs, then converting Excel to CSV in the way described above won't work.

The point is that the Save As CSV command distorts any characters other than ASCII (American Standard Code for Information Interchange). And if your Excel file has smart quotes or long dashes (e.g. inherited from an original Word document that was copied and pasted into Excel), these characters will be mangled too.

An easy alternative is saving the Excel workbook as a Unicode (.txt) file and then converting it to CSV. This way you keep all non-ASCII characters undamaged.

Before we proceed further, let me briefly point out the main differences between UTF-8 and UTF-16 encodings so that you can choose the right format in each particular case.

UTF-8 is a more compact encoding since it uses 1 to 4 bytes for each symbol. Generally, this format is recommended if ASCII characters are most prevalent in your file because most such characters are stored in one byte each. Another advantage is that a UTF-8 file containing only ASCII characters has absolutely the same encoding as an ASCII file.

UTF-16 uses 2 to 4 bytes to store each symbol. However, a UTF-16 file does not always require more storage than UTF-8. For example, Japanese characters take 3 to 4 bytes in UTF-8 and 2 to 4 bytes in UTF-16. So, you may want to use UTF-16 if your Excel data contains any Asian characters, including Japanese, Chinese or Korean. A noticeable disadvantage of this format is that it's not fully compatible with ASCII files and requires Unicode-aware programs to display it. Please keep this in mind if you are going to import the resulting file somewhere outside of Excel.
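The byte counts above are easy to verify in Python (the sample characters are my own):

```python
# "日" takes 3 bytes in UTF-8 but only 2 in UTF-16; "A" takes 1 byte in UTF-8.
for ch, label in [("A", "ASCII letter"), ("é", "accented Latin"), ("日", "Japanese kanji")]:
    print(label, len(ch.encode("utf-8")), "bytes in UTF-8,",
          len(ch.encode("utf-16-le")), "bytes in UTF-16")
```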

How to convert Excel to CSV UTF-8

Suppose you have an Excel worksheet with some foreign characters, Japanese names in our case:

To export this Excel file to CSV keeping all the hieroglyphs intact, follow the steps below:

  1. In your Excel worksheet, go to File > Save As.
  2. Name the file and choose Unicode Text (*.txt) from the drop-down list next to "Save as type", and then click Save.

If you do want a comma-delimited CSV file, proceed with Notepad in the following way:

  • Select a tab character, right click it and choose Copy from the context menu, or simply press CTRL+C as shown in the screenshot below.
  • Press CTRL+H to open the Replace dialog and paste the copied tab ( CTRL+V ) in the Find what field. When you do this, the cursor will move rightwards indicating that the tab was pasted. Type a comma in the Replace with field and click Replace All.

In Notepad, the resulting file should look similar to this:

How to convert an Excel file to CSV UTF-16

Exporting an Excel file as CSV UTF-16 is much quicker and easier than converting to UTF-8. This is because Excel automatically employs the UTF-16 format when saving a file as Unicode (.txt).

So, what you do is simply click File > Save As in Excel, select the Unicode Text (*.txt) file format, and then change the file extension to .csv in Windows Explorer. Done!

If you need a comma-separated or semicolon-separated CSV file, replace all tabs with commas or semicolons, respectively, in a Notepad or any other text editor of your choosing (see Step 6 above for full details).
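The save-as-Unicode, then replace-tabs procedure can also be scripted; here is a sketch in Python, assuming the tab-delimited Unicode (.txt) export described above (the sample data is invented):

```python
import csv
import io

def unicode_txt_to_csv(txt_bytes: bytes) -> str:
    """Turn Excel's tab-delimited Unicode Text export into comma-separated text."""
    text = txt_bytes.decode("utf-16")         # the Unicode (.txt) format is UTF-16
    out = io.StringIO()
    writer = csv.writer(out)
    for line in text.splitlines():
        writer.writerow(line.split("\t"))     # csv quoting protects embedded commas
    return out.getvalue()

sample = "name\tcity\n山田\t東京".encode("utf-16")
print(unicode_txt_to_csv(sample))
```

Unlike a blind find-and-replace, routing the fields through a csv writer quotes any values that themselves contain commas.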

