RODBC Query Tuning

You're caught in a gap I've struggled with as well. I can't opine on what is "right" or "best"; I can only describe what I've done in the past.

I usually do what you did in the first example and just deal with type changes once they get into R. If you wanted to do the latter method, you could convert the date once it's in R. My Oracle systems always seem to be set up to return dates in the "22-NOV-10" format, which is annoying as heck to parse. So I use the Oracle to_char() function in my query to format my dates into something R can easily recognize.

So, for example, I might have this in my SELECT statement:

to_char(myDate, 'yyyy-mm-dd') as myDate

then I pull that into a data frame called myData and do this:

myData$properDate <- strptime(myData$myDate, "%Y-%m-%d")
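Putting the pieces together, a minimal sketch of the whole round trip might look like the following; the DSN name "myDSN" and table name "myTable" are made up for illustration, and as.Date() is just an alternative if you only need the date part:

library(RODBC)

ch <- odbcConnect("myDSN")

# to_char() hands the date back as plain text, so R won't try to guess its type
# (note: depending on the driver, an unquoted alias may come back upper-cased, e.g. MYDATE)
sql <- "select to_char(myDate, 'yyyy-mm-dd') as myDate from myTable"
myData <- sqlQuery(ch, sql, stringsAsFactors = FALSE)

# parse the text into a proper date/time class
myData$properDate <- strptime(myData$myDate, "%Y-%m-%d")
# or, if only the date part matters:
# myData$properDate <- as.Date(myData$myDate)

close(ch)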

Whether it's easier to fix the dates or to fix the other fields really depends on how many date fields you have and how many non-date fields the first method messes up. But in my experience I end up fixing one or the other.

Something you might consider when using method 1: try using cast() in your SQL to force a field into a particular type. The only times I've had trouble with RODBC mangling my data types is when the type is ambiguous on the server side, usually as the result of CASE statements or somesuch on the SQL end.
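For example, something along these lines (the column and table names are hypothetical) pins the type down before RODBC ever sees it:

# cast() removes the ambiguity introduced by the CASE expression
sql <- "select
          cast(case when status = 1 then 'open' else 'closed' end as varchar2(10)) as statusText,
          cast(amount as number(12, 2)) as amount
        from myTable"
myData <- sqlQuery(ch, sql, stringsAsFactors = FALSE)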


The as.is argument can be a logical vector.

So if your result set consists of, say, two date columns followed by one character column, you can do:

uapp <- sqlQuery(ch, SQL, stringsAsFactors = FALSE, as.is = c(FALSE, FALSE, TRUE))

EDIT: as suggested by Kalin, you can also refer by index to the columns that should be left "as is". For example

uapp <- sqlQuery(ch, SQL, stringsAsFactors = FALSE, as.is = c(2, 4))

will leave columns two and four "as is".
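And if those "as is" columns are the to_char()-formatted dates from above, you can convert them yourself afterwards; the column positions and format string here are just an illustration:

uapp <- sqlQuery(ch, SQL, stringsAsFactors = FALSE, as.is = c(2, 4))

# columns 2 and 4 come back as character, so parse them explicitly
uapp[[2]] <- as.Date(uapp[[2]], "%Y-%m-%d")
uapp[[4]] <- as.Date(uapp[[4]], "%Y-%m-%d")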

Tags: r, rodbc