[Solved] Loading very large CSV files into Base

dBase, Calc, CSV, MS Access, MySQL, PostgreSQL, OTHER
laughterbob
Posts: 2
Joined: Sun Apr 04, 2010 4:49 am

[Solved] Loading very large CSV files into Base

Post by laughterbob »

Hello,
I am trying to load a large CSV file (>10MB) into Base. The file is too large for Calc (too many records).
I have studied the posts in this area but cannot find a solution to my problem.
Everything I try either goes through the Calc import filter (which cannot handle the number of records and leaves me with a truncated dataset) or does nothing at all.

The first line of the file has Header labels.
Once the data has been loaded, I want to set the type of some of the fields (e.g. change from text to numeric) and perform the kind of data sorting and enquiries that are common with spreadsheets.

I figure I am missing something obvious but can't seem to find an answer.
Thanks, Bob.
Last edited by laughterbob on Sun Apr 04, 2010 9:58 am, edited 1 time in total.
OpenOffice 3.0.1 on Ubuntu 9.04
FJCC
Moderator
Posts: 9625
Joined: Sat Nov 08, 2008 8:08 pm
Location: Colorado, USA

Re: Loading very large CSV files into Base

Post by FJCC »

This tutorial describes how to create a table in Base and then load it with data from a CSV file.
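For reference, one approach the embedded HSQLDB engine in Base supports (run from Tools > SQL) is to link the CSV file as a TEXT TABLE and then copy it into a permanent table. This is only a sketch: the file name, field separator, and column definitions below are assumptions and must be adjusted to match your actual data.

```sql
-- Stage the CSV as a linked text table.
-- Hypothetical columns; replace with your own names and types.
CREATE TEXT TABLE "Stage" ("ID" INTEGER, "Name" VARCHAR(100), "Amount" DECIMAL(10,2));

-- Attach the file: fs is the field separator, ignore_first skips the header row.
-- The path is resolved relative to the database's working directory.
SET TABLE "Stage" SOURCE "data.csv;fs=,;ignore_first=true";

-- Copy everything into a regular cached table so the data lives inside the database.
SELECT * INTO CACHED "MyData" FROM "Stage";
```

After the copy you can drop the text table and add a primary key to the new table. Choose View > Refresh Tables so it appears in the Base table list.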
OpenOffice 4.1 on Windows 10 and Linux Mint
If your question is answered, please go to your first post, select the Edit button, and add [Solved] to the beginning of the title.
laughterbob
Posts: 2
Joined: Sun Apr 04, 2010 4:49 am

Re: [Solved] Loading very large CSV files into Base

Post by laughterbob »

Thank you. I had a bit of a problem until I realised the table did not have a primary key field; after that, a little adjustment of the string lengths and whoopee!
Cheers, Bob.
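For anyone hitting the same two snags: a primary key and column sizes can be adjusted after the import with SQL (Tools > SQL). The table and column names here are hypothetical, and the exact ALTER syntax depends on the embedded HSQLDB version shipped with your release.

```sql
-- Add a primary key on an existing column (values must already be unique and non-null).
ALTER TABLE "MyData" ADD PRIMARY KEY ("ID");

-- Widen a text column that was defined too short for the data.
ALTER TABLE "MyData" ALTER COLUMN "Name" VARCHAR(250);
```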
OpenOffice 3.0.1 on Ubuntu 9.04