I have to write huge data to a text [csv] file. I used BufferedWriter to write the data and it took around 40 seconds to write 174 MB of data. Is this the fastest speed Java can offer?
bufferedWriter = new BufferedWriter(new FileWriter("fileName.csv"));
Note: These 40 seconds include the time of iterating and fetching the records from the result set as well. :) 174 MB is for 400,000 rows in the result set.
You might try removing the BufferedWriter and just using the FileWriter directly. On a modern system there is a good chance you are just writing to the drive's cache memory anyway.
It takes me in the range of 4-5 seconds to write 175 MB (4 million strings) - this is on a dual-core 2.4 GHz Dell running Windows XP with an 80 GB, 7200-RPM Hitachi disk.
Can you isolate how much of the time is record retrieval and how much is file writing?
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileWriter;
import java.io.IOException;
import java.io.Writer;
import java.util.ArrayList;
import java.util.List;

public class FileWritingPerfTest {

    private static final int ITERATIONS = 5;
    private static final double MEG = Math.pow(1024, 2);
    private static final int RECORD_COUNT = 4000000;
    private static final String RECORD = "Help I am trapped in a fortune cookie factory\n";
    private static final int RECSIZE = RECORD.getBytes().length;

    public static void main(String[] args) throws Exception {
        List<String> records = new ArrayList<String>(RECORD_COUNT);
        int size = 0;
        for (int i = 0; i < RECORD_COUNT; i++) {
            records.add(RECORD);
            size += RECSIZE;
        }
        System.out.println(records.size() + " 'records'");
        System.out.println(size / MEG + " MB");

        for (int i = 0; i < ITERATIONS; i++) {
            System.out.println("\nIteration " + i);
            writeRaw(records);
            writeBuffered(records, 8192);
            writeBuffered(records, (int) MEG);
            writeBuffered(records, 4 * (int) MEG);
        }
    }

    private static void writeRaw(List<String> records) throws IOException {
        File file = File.createTempFile("foo", ".txt");
        try {
            FileWriter writer = new FileWriter(file);
            System.out.print("Writing raw... ");
            write(records, writer);
        } finally {
            // comment this out if you want to inspect the files afterward
            file.delete();
        }
    }

    private static void writeBuffered(List<String> records, int bufSize) throws IOException {
        File file = File.createTempFile("foo", ".txt");
        try {
            FileWriter writer = new FileWriter(file);
            BufferedWriter bufferedWriter = new BufferedWriter(writer, bufSize);
            System.out.print("Writing buffered (buffer size: " + bufSize + ")... ");
            write(records, bufferedWriter);
        } finally {
            // comment this out if you want to inspect the files afterward
            file.delete();
        }
    }

    private static void write(List<String> records, Writer writer) throws IOException {
        long start = System.currentTimeMillis();
        for (String record : records) {
            writer.write(record);
        }
        // closing the writer flushes any output still sitting in the buffer
        writer.flush();
        writer.close();
        long end = System.currentTimeMillis();
        System.out.println((end - start) / 1000f + " seconds");
    }
}
Try memory-mapped files (it takes ~300 ms to write 174 MB on my machine: Core 2 Duo, 2.5 GB RAM):
import java.io.RandomAccessFile;
import java.nio.ByteBuffer;
import java.nio.channels.FileChannel;

byte[] buffer = "Help I am trapped in a fortune cookie factory\n".getBytes();
int number_of_lines = 400000;

FileChannel rwChannel = new RandomAccessFile("textfile.txt", "rw").getChannel();
// map one region large enough for all the lines; the file is extended to this size
ByteBuffer wrBuf = rwChannel.map(FileChannel.MapMode.READ_WRITE, 0, buffer.length * number_of_lines);
for (int i = 0; i < number_of_lines; i++) {
    wrBuf.put(buffer);
}
rwChannel.close();
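Two caveats with this approach: mapping a READ_WRITE region past the current end of the file extends the file to the mapped size, so the total output size must be known (or bounded) up front; and if you need a guarantee that the bytes have actually reached the disk, you can call force() on the returned MappedByteBuffer before closing the channel.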
Just for the sake of statistics:
The machine is an old Dell with a new SSD
CPU: Intel Pentium D 2.8 GHz
SSD: Patriot Inferno 120 GB SSD
4000000 'records'
175.47607421875 MB
Iteration 0
Writing raw... 3.547 seconds
Writing buffered (buffer size: 8192)... 2.625 seconds
Writing buffered (buffer size: 1048576)... 2.203 seconds
Writing buffered (buffer size: 4194304)... 2.312 seconds
Iteration 1
Writing raw... 2.922 seconds
Writing buffered (buffer size: 8192)... 2.406 seconds
Writing buffered (buffer size: 1048576)... 2.015 seconds
Writing buffered (buffer size: 4194304)... 2.282 seconds
Iteration 2
Writing raw... 2.828 seconds
Writing buffered (buffer size: 8192)... 2.109 seconds
Writing buffered (buffer size: 1048576)... 2.078 seconds
Writing buffered (buffer size: 4194304)... 2.015 seconds
Iteration 3
Writing raw... 3.187 seconds
Writing buffered (buffer size: 8192)... 2.109 seconds
Writing buffered (buffer size: 1048576)... 2.094 seconds
Writing buffered (buffer size: 4194304)... 2.031 seconds
Iteration 4
Writing raw... 3.093 seconds
Writing buffered (buffer size: 8192)... 2.141 seconds
Writing buffered (buffer size: 1048576)... 2.063 seconds
Writing buffered (buffer size: 4194304)... 2.016 seconds
As you can see, the raw method is slower than the buffered one.
Your transfer speed will likely not be limited by Java. Instead I would suspect (in no particular order) the speed of transfer from the database and the speed of transfer to the disk.
If you read the complete dataset and then write it out to disk, that will take longer, since the JVM will have to allocate the memory and the db read / disk write will happen sequentially. Instead, I would write out to the buffered writer for every read that you make from the db, so the operation will be closer to a concurrent one (I don't know if you're doing that or not); see the sketch after the link below.
For these big reads from the DB you may want to tune your Statement's fetch size. It might save a lot of round trips to the DB.
http://download.oracle.com/javase/1.5.0/docs/api/java/sql/Statement.html#setFetchSize%28int%29
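To illustrate, here is a minimal sketch of that streaming approach, assuming a JDBC Connection named conn; the table and column names (big_table, line) and the fetch size are made up for the example:

import java.io.BufferedWriter;
import java.io.FileWriter;
import java.io.IOException;
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class StreamingExport {
    // writes each row to the buffered writer as it is fetched,
    // instead of materializing the whole result set in memory first
    public static void export(Connection conn, String fileName)
            throws SQLException, IOException {
        try (Statement stmt = conn.createStatement()) {
            stmt.setFetchSize(1000); // fewer round trips to the DB
            try (ResultSet rs = stmt.executeQuery("SELECT line FROM big_table");
                 BufferedWriter out = new BufferedWriter(new FileWriter(fileName))) {
                while (rs.next()) {
                    out.write(rs.getString(1));
                    out.newLine();
                }
            }
        }
    }
}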
package all.is.well;
import java.io.IOException;
import java.io.RandomAccessFile;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import junit.framework.TestCase;

/**
 * @author Naresh Bhabat
 *
 * The following implementation helps to deal with extra-large files in Java.
 * This program has been tested with a 2 GB input file.
 * There are some points where extra logic can be added in the future.
 * Please note: to deal with a binary input file, we would need to read bytes
 * from the file object instead of reading lines.
 * It uses a RandomAccessFile, which is almost like a streaming API.
 * ****************************************
 * Notes regarding the executor framework and its timings.
 * Please note: ExecutorService executor = Executors.newFixedThreadPool(10);
 *
 * For 10 threads: total time required for reading and writing the text: 349.317 seconds
 * For 100 threads: 464.042 seconds
 * For 1000 threads: 466.538 seconds
 * For 10000 threads: 479.701 seconds
 */
public class DealWithHugeRecordsinFile extends TestCase {

    static final String FILEPATH = "C:\\springbatch\\bigfile1.txt.txt";
    static final String FILEPATH_WRITE = "C:\\springbatch\\writinghere.txt";
    static volatile RandomAccessFile fileToWrite;
    static volatile RandomAccessFile file;
    static volatile int position = 0;

    public static void main(String[] args) throws IOException, InterruptedException {
        long start = System.currentTimeMillis();
        try {
            fileToWrite = new RandomAccessFile(FILEPATH_WRITE, "rw"); // for random write
            file = new RandomAccessFile(FILEPATH, "r"); // for random read
            seriouslyReadProcessAndWriteAsynch();
        } catch (IOException e) {
            e.printStackTrace();
        }
        System.out.println(Thread.currentThread().getName());
        double timeSeconds = (System.currentTimeMillis() - start) / 1000.0;
        System.out.println("Total time required for reading and writing the text in seconds " + timeSeconds);
    }

    /**
     * Reads the input file line by line and hands each line to a worker thread.
     *
     * @throws IOException
     * @throws InterruptedException
     */
    public static void seriouslyReadProcessAndWriteAsynch() throws IOException, InterruptedException {
        ExecutorService executor = Executors.newFixedThreadPool(10); // see the timing notes in the class comment
        while (true) {
            final String readLine = file.readLine();
            if (readLine == null) {
                break;
            }
            executor.execute(new Runnable() {
                @Override
                public void run() {
                    // do the hard processing here in this thread; this demo burns
                    // some time and swallows an exception in the write method
                    writeToFile(FILEPATH_WRITE, readLine);
                }
            });
        }
        executor.shutdown();
        // wait for all submitted tasks to finish instead of busy-waiting
        executor.awaitTermination(Long.MAX_VALUE, TimeUnit.DAYS);
        System.out.println("Finished all threads");
        file.close();
        fileToWrite.close();
    }
    /**
     * @param filePath destination path (the shared fileToWrite is used for the actual write)
     * @param data     the line to process and append
     */
    private static void writeToFile(String filePath, String data) {
        try {
            data = "\n" + data;
            // filter: only lines containing "Randomization" are written
            if (!data.contains("Randomization")) {
                return;
            }
            System.out.println("Let us do something time consuming to make this thread busy " + (position++) + " :" + data);
            System.out.println("Let's consume time through this loop");
            // burn a little CPU to simulate per-record processing
            int i = 1000;
            while (i > 0) {
                i--;
            }
            // note: this write is not synchronized across worker threads
            fileToWrite.write(data.getBytes());
            // deliberately throw to show that a failed record does not stop the run
            throw new Exception();
        } catch (Exception exception) {
            System.out.println("exception was thrown but still we are able to proceed further"
                    + "\n This can be used for marking failure of the records");
        }
    }
}
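One caveat about the code above: several worker threads call writeToFile on the same RandomAccessFile without synchronization, so output from different records can interleave. A common alternative, sketched below on the assumption that record order does not matter (the file name and queue capacity are made up for the example), is to let the workers hand finished lines to a single writer thread through a BlockingQueue:

import java.io.BufferedWriter;
import java.io.FileWriter;
import java.io.IOException;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class SingleWriterSketch {
    private static final String POISON = "\u0000EOF"; // sentinel marking end of input

    public static void main(String[] args) throws Exception {
        BlockingQueue<String> queue = new LinkedBlockingQueue<>(10000);

        // the only thread that touches the file, so no locking is needed
        Thread writer = new Thread(() -> {
            try (BufferedWriter out = new BufferedWriter(new FileWriter("out.txt"))) {
                String line;
                while (!(line = queue.take()).equals(POISON)) {
                    out.write(line);
                    out.newLine();
                }
            } catch (IOException | InterruptedException e) {
                e.printStackTrace();
            }
        });
        writer.start();

        // worker threads (or the processing loop above) just enqueue results
        for (int i = 0; i < 1000; i++) {
            queue.put("record " + i);
        }
        queue.put(POISON); // signal end of input
        writer.join();
    }
}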