Mara,
Besides SQL Server, I have the following suggestions:
1) The ff package in R (http://www.bnosac.be/index.php/blog/22-if-you-are-into-large-data-and-work-a-lot-package-ff)
2) HDF5 seems like a decent option, though I have not used it. Link to rhdf5 (http://bioconductor.org/packages/release/bioc/html/rhdf5.html). Also, SFCTA has some code for getting data into and out of HDF5 (https://github.com/sfcta/TAutils/tree/master/hdf5).
3) I have found TransCAD to be efficient in processing large datasets.
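In case a concrete sketch of the SQL route helps: the usual pattern is to bulk-load the raw tables into a database in batches and then do the joins there, where a 10 GB table is no problem. Below is a minimal illustration using Python's built-in sqlite3 module as a stand-in for SQL Server; the table name, columns, and sample rows are made up for illustration, and for real data you would point the reader at the actual CSV and the connection at a file (or a SQL Server connection via a driver such as pyodbc).

```python
import csv
import io
import sqlite3

# Stand-in for a multi-gigabyte extract; in practice this would be
# open("part1.csv") on the raw Maryland file. Columns are hypothetical.
raw = io.StringIO("id,county,value\n1,Baltimore,10\n2,Howard,20\n3,Carroll,30\n")

conn = sqlite3.connect(":memory:")  # use a file path for real data
conn.execute("CREATE TABLE part1 (id INTEGER, county TEXT, value REAL)")

reader = csv.reader(raw)
next(reader)  # skip the header row

# Insert in batches so memory use stays flat no matter how big the file is.
BATCH_SIZE = 2  # tens of thousands of rows in practice
batch = []
for row in reader:
    batch.append(row)
    if len(batch) >= BATCH_SIZE:
        conn.executemany("INSERT INTO part1 VALUES (?, ?, ?)", batch)
        batch = []
if batch:
    conn.executemany("INSERT INTO part1 VALUES (?, ?, ?)", batch)
conn.commit()

# Joins and aggregations that choke Access run fine once the data is
# in a real database (add indexes on the join keys for speed).
total = conn.execute("SELECT SUM(value) FROM part1").fetchone()[0]
print(total)  # 60.0
```

The same batched-insert idea carries over unchanged to SQL Server or PostgreSQL; only the connection line and the parameter placeholder syntax differ.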
Hope this helps.
Krishnan
I downloaded the Maryland state raw data (the whole enchilada) that Penelope was good enough to provide me. It came with documentation that clearly explains what needs to be done, but I am being hampered by the sheer size of the dataset: it's 10 GB, and that's before joining tables, transposing them to meet my needs, etc. Even split into separate databases, it can't be handled in Access. I can fit Part 1 into an ESRI geodatabase, but that doesn't give me the flexibility in linking tables that Access has.

Does anyone have any suggestions for dealing with large databases? SQL Server is one option. Are there others?
Mara Kaminowitz, GISP
GIS Coordinator
.........................................................................
Baltimore Metropolitan Council
Offices @ McHenry Row
1500 Whetstone Way
Suite 300
Baltimore, MD 21230
410-732-0500 ext. 1030
mkaminowitz@baltometro.org
www.baltometro.org