Speaking from personal experience, it is easy to corrupt a file geodatabase (GDB) if you manipulate its files individually outside of ArcGIS (i.e. ArcMap or ArcCatalog), for example with Windows Explorer. The individual files you describe make up the guts of a GDB. Instead of adding them individually to a folder, you should be able to locate, view, and use the GDB directly as long as you are working in ArcMap or ArcCatalog. A folder containing the files you mentioned but lacking a .gdb extension is likely a corrupted GDB.
You are not giving a whole lot of information here...
What version of the ArcGIS Editor for OpenStreetMap are you using? The latest version has seen major improvements in processing speed and added much better multipolygon handling, meaning you get more accurate polygon output, close to what osm2pgsql achieves. Still, multipolygon creation in particular remains a major technical hurdle and processing burden for any application working with OSM data.
What data do you need? The editor by default converts all data in an .osm file, so if that data includes multipolygons as well, it may take a while... If you are just interested in a road network, it might be better to filter out the line data first using one of the open source tools for that, and then convert the remaining data to a File Geodatabase using the Load OSM File tool.
That said, 100 GB is a lot. If you are not prepared to wait a couple of days for certain processes to finish, then you should probably not be handling datasets of this size at all (unless you intend to do Big Data processing in a cluster and have the facilities to shorten processing time; Esri has tools for putting data into a Big Data cluster).
Anyway, I have processed uncompressed OSM XML files of up to 25 GB using the Editor's Load OSM File tool, and if I remember correctly, with the latest release it took at most around 3-4 days on a quad-core Core i5 processor.
Best Answer
I connected to the .gdb folder as a generic folder (not as a .gdb) and it's working perfectly.