SSIS 2008 Rows per batch and Maximum insert commit size
I find this useful as a guide: Top 10 SQL Server Integration Services Best Practices
Simply because I don't use SSIS often enough. However, as HLGEM said, you'll just have to give it a shot yourself and see what happens...
I found that if the database is using the simple or bulk-logged recovery model, and you have truncated the table or dropped its indexes, and you select the fast load option, then the rows should not be logged as individual inserts and the impact on the transaction log should be minimal. In that case these settings can be left at their defaults, with a single commit at the end, and any index rebuild happens only once when the load finishes.
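For reference, here is a rough T-SQL sketch of those same conditions outside SSIS; the StagingDB database, dbo.SalesStaging table and C:\loads\sales.csv file are all hypothetical names for illustration. It checks the recovery model, switches to bulk-logged, and runs the BULK INSERT equivalent of a fast load (TABLOCK and no BATCHSIZE, so everything commits once at the end):

```sql
-- Minimal logging needs the SIMPLE or BULK_LOGGED recovery model.
SELECT name, recovery_model_desc
FROM sys.databases
WHERE name = 'StagingDB';          -- hypothetical database name

-- Switch to BULK_LOGGED for the duration of the load, if your backup strategy allows it.
ALTER DATABASE StagingDB SET RECOVERY BULK_LOGGED;

-- Rough equivalent of an SSIS fast load: TABLOCK enables the bulk-load path,
-- ROWS_PER_BATCH is only a hint to the optimizer, and with no BATCHSIZE
-- the whole load is a single commit at the end.
BULK INSERT dbo.SalesStaging       -- hypothetical target table
FROM 'C:\loads\sales.csv'          -- hypothetical source file
WITH (TABLOCK,
      ROWS_PER_BATCH  = 100000000,
      FIELDTERMINATOR = ',',
      ROWTERMINATOR   = '\n');
```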
If you leave the default settings, all 100 million rows will be inserted before the batch is committed, which can cause your transaction log to grow very large. If you want to keep that growth down, set the maximum insert commit size to a number lower than the default.
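As a sketch of what a smaller commit size does (same hypothetical table and file as above): in the SSIS OLE DB Destination you would lower "Maximum insert commit size" (the FastLoadMaxInsertCommitSize property); with plain BULK INSERT the analogous knob is BATCHSIZE, which commits each batch separately so the log can be truncated or reused between batches, subject to your recovery model.

```sql
-- Commit every 500,000 rows instead of one giant transaction.
-- Each committed batch lets the transaction log be reused sooner
-- (after a checkpoint under SIMPLE, or a log backup under BULK_LOGGED).
BULK INSERT dbo.SalesStaging       -- hypothetical target table
FROM 'C:\loads\sales.csv'          -- hypothetical source file
WITH (TABLOCK,
      BATCHSIZE       = 500000,
      FIELDTERMINATOR = ',',
      ROWTERMINATOR   = '\n');
```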
There is no single best value; it depends greatly on the design of the database, the number of users, the kind of hardware you are operating on, etc. That is why you need to test it for yourself on your own system.
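A couple of standard queries you can run while testing different commit sizes to see the effect on the log (StagingDB is again just a placeholder name):

```sql
-- Percentage of transaction log space used, per database.
DBCC SQLPERF(LOGSPACE);

-- What, if anything, is currently preventing log truncation.
SELECT name, log_reuse_wait_desc
FROM sys.databases
WHERE name = 'StagingDB';          -- hypothetical database name
```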