FieldsMapping createFieldMapping( String fileName, CsvInputMeta csvInputMeta ) throws KettleException {
  FieldsMapping mapping = null;
  if ( csvInputMeta.isHeaderPresent() ) {
    String[] fieldNames = readFieldNamesFromFile( fileName, csvInputMeta );
    mapping = NamedFieldsMapping.mapping( fieldNames, fieldNames( csvInputMeta ) );
  } else {
    int fieldsCount = csvInputMeta.getInputFields() == null ? 0 : csvInputMeta.getInputFields().length;
    mapping = UnnamedFieldsMapping.mapping( fieldsCount );
  }
  return mapping;
}
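The branch above chooses between two mapping strategies: name-based mapping when the file has a header row, and positional (identity) mapping when it does not. A minimal self-contained sketch of that idea follows; the types and method names here are hypothetical simplifications, not Kettle's actual `NamedFieldsMapping`/`UnnamedFieldsMapping` implementations:

```java
import java.util.Arrays;

/**
 * Sketch of the two field-mapping strategies (hypothetical, simplified):
 * named mapping resolves actual file columns to meta fields by header name,
 * unnamed mapping simply feeds column i into field i.
 */
class FieldMappingSketch {

  static final int FIELD_DOES_NOT_EXIST = -1;

  /** Maps each actual column (by its header name) to the index of the matching meta field. */
  static int[] namedMapping( String[] headerNames, String[] metaFieldNames ) {
    int[] mapping = new int[ headerNames.length ];
    for ( int i = 0; i < headerNames.length; i++ ) {
      // indexOf returns -1 when the header has no matching meta field,
      // which doubles as our FIELD_DOES_NOT_EXIST sentinel
      mapping[ i ] = Arrays.asList( metaFieldNames ).indexOf( headerNames[ i ] );
    }
    return mapping;
  }

  /** Identity mapping for header-less files: column i feeds meta field i. */
  static int[] unnamedMapping( int fieldsCount ) {
    int[] mapping = new int[ fieldsCount ];
    for ( int i = 0; i < fieldsCount; i++ ) {
      mapping[ i ] = i;
    }
    return mapping;
  }
}
```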
@Override
public StepInterface getStep( StepMeta stepMeta, StepDataInterface stepDataInterface, int cnr, TransMeta tr,
  Trans trans ) {
  return new CsvInput( stepMeta, stepDataInterface, cnr, tr, trans );
}
@Override
public int fieldMetaIndex( int index ) {
  if ( index >= size() || index < 0 ) {
    return FIELD_DOES_NOT_EXIST;
  }
  return actualToMetaFieldMapping[index];
}
private CsvInputMeta createStepMeta( final String testFilePath, final String encoding ) {
  final CsvInputMeta meta = new CsvInputMeta();
  meta.setFilename( testFilePath );
  meta.setDelimiter( "\t" );
  meta.setEncoding( encoding );
  meta.setEnclosure( "\"" );
  meta.setBufferSize( "50000" );
  meta.setInputFields( getInputFileFields() );
  meta.setHeaderPresent( true );
  return meta;
}
/**
 * Initialize for processing the specified file.
 */
protected void init( String file ) throws Exception {
  meta.setFilename( getFile( file ).getURL().getFile() );
  step = new CsvInput( stepMeta, null, 1, transMeta, trans );
  step.init( meta, data );
  step.addRowListener( rowListener );
}
private StepMetaDataCombi createBaseCombi( File sharedFile, boolean headerPresent, String delimiter ) {
  StepMetaDataCombi combi = new StepMetaDataCombi();
  CsvInputData data = new CsvInputData();
  CsvInputMeta meta =
    createMeta( sharedFile, createInputFileFields( "Field_000", "Field_001" ), headerPresent, delimiter );
  CsvInput csvInput = createCsvInput();
  csvInput.init( meta, data );
  combi.step = csvInput;
  combi.data = data;
  combi.meta = meta;
  return combi;
}
@Override
public void modify( StepMetaInterface someMeta ) {
  if ( someMeta instanceof CsvInputMeta ) {
    ( (CsvInputMeta) someMeta ).allocate( 5 );
  }
}
@Override
public CsvInputMeta getNewMetaInstance() {
  return new CsvInputMeta();
}
private int createAndRunOneStep( File sharedFile, int stepNr, int totalNumberOfSteps, boolean headersPresent,
  String delimiter ) throws Exception {
  StepMetaDataCombi combiStep1 = createBaseCombi( sharedFile, headersPresent, delimiter );
  configureData( (CsvInputData) combiStep1.data, stepNr, totalNumberOfSteps );
  return processRows( combiStep1 );
}
@Override
public StepDataInterface getStepData() {
  return new CsvInputData();
}
@Override
public boolean hasHeader() {
  return isHeaderPresent();
}
/**
 * This method should be used very carefully. Moving the pointer without increasing the total number of bytes read
 * can lead to data corruption.
 */
boolean moveEndBufferPointer( boolean increaseTotalBytes ) throws IOException {
  endBuffer++;
  if ( increaseTotalBytes ) {
    totalBytesRead++;
  }
  return resizeBufferIfNeeded();
}
/**
 * Describe the metadata attributes that can be injected into this step metadata object.
 *
 * @throws KettleException
 */
@Override
public List<StepInjectionMetaEntry> getStepInjectionMetadataEntries() throws KettleException {
  return getStepInjectionMetadataEntries( PKG );
}
/**
 * Moves the endBuffer pointer by one.<br>
 * If there is not enough room in the buffer to go there, resize the byte buffer and read more data.<br>
 * If there is no more data to read and the endBuffer pointer has reached the end of the byte buffer, we return
 * true.<br>
 *
 * @return true if we reached the end of the byte buffer.
 * @throws IOException
 *           In case we get an error reading from the input file.
 */
boolean moveEndBufferPointer() throws IOException {
  return moveEndBufferPointer( true );
}
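The two `moveEndBufferPointer` overloads advance an end pointer through a byte buffer one position at a time, refilling the buffer when the pointer runs off its end and reporting end-of-input via the return value. A minimal self-contained sketch of that pattern, with a byte array standing in for the input file (hypothetical, simplified; not Kettle's actual buffering code):

```java
/**
 * Sketch (hypothetical, simplified) of an end-pointer over a byte buffer
 * that refills on demand and signals end-of-input with its return value.
 */
class BufferPointerSketch {
  private byte[] byteBuffer = new byte[ 0 ];
  private int endBuffer = 0;
  private long totalBytesRead = 0;
  private final byte[] source; // stands in for the input file
  private int sourcePos = 0;
  private final int chunkSize;

  BufferPointerSketch( byte[] source, int chunkSize ) {
    this.source = source;
    this.chunkSize = chunkSize;
    refill();
  }

  /** Advances the pointer by one byte. Returns true when the input is exhausted. */
  boolean moveEndBufferPointer() {
    endBuffer++;
    totalBytesRead++;
    return resizeBufferIfNeeded();
  }

  /** Refills when the pointer has run off the buffer; true means no data was left. */
  private boolean resizeBufferIfNeeded() {
    if ( endBuffer >= byteBuffer.length ) {
      return refill();
    }
    return false;
  }

  private boolean refill() {
    int n = Math.min( chunkSize, source.length - sourcePos );
    if ( n <= 0 && endBuffer >= byteBuffer.length ) {
      return true; // nothing left to read: end of input
    }
    // grow the buffer and append the next chunk from the "file"
    byte[] grown = new byte[ byteBuffer.length + n ];
    System.arraycopy( byteBuffer, 0, grown, 0, byteBuffer.length );
    System.arraycopy( source, sourcePos, grown, byteBuffer.length, n );
    sourcePos += n;
    byteBuffer = grown;
    return false;
  }

  long getTotalBytesRead() {
    return totalBytesRead;
  }
}
```

Note how the end-of-input signal only fires once the pointer has consumed every byte the final refill produced, mirroring the contract described in the Javadoc above.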
@Override
public void loadXML( Node stepnode, List<DatabaseMeta> databases, IMetaStore metaStore ) throws KettleXMLException {
  readData( stepnode );
}
TextFileInputField[] createInputFileFields( String... names ) {
  TextFileInputField[] fields = new TextFileInputField[ names.length ];
  for ( int i = 0; i < names.length; i++ ) {
    fields[ i ] = createField( names[ i ] );
  }
  return fields;
}
private TextFileInputField[] getInputFileFields() {
  return createInputFileFields( "Header1", "Header2" );
}
}
private TextFileInputField[] getInputFileFields() {
  return createInputFileFields( "Header1", "Header2" );
}
}
private CsvInputMeta createStepMeta( final String testFilePath, final String encoding, final String delimiter,
  final boolean useHeader ) {
  final CsvInputMeta meta = new CsvInputMeta();
  meta.setFilename( testFilePath );
  meta.setDelimiter( delimiter );
  meta.setEncoding( encoding );
  meta.setEnclosure( "\"" );
  meta.setBufferSize( "50000" );
  meta.setInputFields( getInputFileFields() );
  meta.setHeaderPresent( useHeader );
  return meta;
}
private CsvInput createCsvInput() {
  return new CsvInput( stepMockHelper.stepMeta, stepMockHelper.stepDataInterface, 0, stepMockHelper.transMeta,
    stepMockHelper.trans );
}