
Recent Activity

Augusto Santos
Hi, I'm trying to access our YouTube channel's data through the YouTube API from Pentaho Data Integration, but I have no idea how to access it. I tried the JavaScript step, for example, but it didn't work. Does anyone have an idea how I could access the YouTube API with PDI? Thanks.
in Data Integration
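One option worth trying before scripting steps is PDI's REST Client step pointed at the YouTube Data API v3, with the response parsed by a JSON Input step. Below is a minimal Python sketch of how such a request URL is assembled; the channel id and `YOUR_API_KEY` are placeholders, not real credentials:

```python
from urllib.parse import urlencode

# Base endpoint of the YouTube Data API v3 "channels" resource.
BASE = "https://www.googleapis.com/youtube/v3/channels"

def build_channel_stats_url(channel_id: str, api_key: str) -> str:
    """Return the URL the REST Client step would call for channel statistics."""
    query = urlencode({"part": "statistics", "id": channel_id, "key": api_key})
    return f"{BASE}?{query}"

print(build_channel_stats_url("UCxxxxxxxx", "YOUR_API_KEY"))
```

The same URL can be built in PDI with a Calculator or UDJC step and fed into the REST Client step's URL field.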
saket Maheshwary
I am using PDI 8.1 CE on Windows. I am trying to access some network folders using UNC paths but get the error below: the file is there, but PDI is not able to see it. One thing I have observed is that where we get the error, instead of accessing the location with \\ at the beginning, we have only one slash…
in Data Integration
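If the path really is arriving with a single leading backslash, one workaround is to normalize it before the file step consumes it (in PDI this could live in a Modified Java Script Value or User Defined Java Class step). A sketch of the normalization in Python, for illustration only:

```python
def restore_unc(path: str) -> str:
    """Re-add the second leading backslash if a UNC path lost one.

    '\\DevServer\\Dev\\test.csv' (single leading slash) becomes
    '\\\\DevServer\\Dev\\test.csv'; a proper UNC path is left untouched.
    """
    if path.startswith("\\\\"):
        return path            # already a well-formed UNC path
    if path.startswith("\\"):
        return "\\" + path     # restore the missing leading backslash
    return path
```

This only masks the symptom; it is still worth checking where the first backslash is being stripped (variable substitution and repository export are common culprits).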
Sam Alexander
I'm using Data Integration 8.1 on Windows with the CSV Input step, and the CSV is on a network drive - \\DevServer\Dev\test.csv, for example. When I create the CSV input and put the path in Filename, clicking Get Fields loads the fields just fine. But when I Preview, or use the CSV Input to feed a table, I get this error: 2019/04/23 13:30:11 -…
in Data Integration
Stephane TRAORE
Good evening! How can I calculate the number of working days between two dates, taking public holidays into account, in Spoon?
in Data Integration
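The underlying algorithm is simple enough to embed in a User Defined Java Class or scripting step: walk the date range, counting days that are neither weekends nor in a holiday set. A minimal Python sketch (the holiday dates are supplied by the caller):

```python
from datetime import date, timedelta

def working_days(start: date, end: date, holidays: set) -> int:
    """Count working days in [start, end], skipping weekends and holidays."""
    count = 0
    d = start
    while d <= end:
        if d.weekday() < 5 and d not in holidays:  # Mon=0 .. Fri=4
            count += 1
        d += timedelta(days=1)
    return count

# Easter Monday 2019 (22 April) excluded as a holiday:
working_days(date(2019, 4, 22), date(2019, 4, 26), {date(2019, 4, 22)})
```

In a transformation, the holiday set could come from a lookup table of public holidays joined against a generated date range.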
Virgilio Pierini
Hi all, I'm interested in setting up different delivery semantics when retrieving data from Kafka with PDI: at most once, at least once, exactly once. I read [PDI-17272] Implement explicit commit in Kafka Consumer - Pentaho Platform Tracking, and since there are no comments I'm a bit puzzled.   What am I doing? I have one transformation with Kafka…
in Data Integration
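The heart of at-least-once delivery (what PDI-17272's explicit commit would enable) is the ordering: process the record first, commit the offset only afterwards. A crash between the two steps means the record is redelivered on restart - a possible duplicate, never a loss. This is an in-memory sketch of that ordering, not the real Kafka Consumer step:

```python
def at_least_once(records, process, commit, crash_before_commit_at=None):
    """Process each record, then commit its offset (at-least-once ordering)."""
    for offset, record in enumerate(records):
        process(record)                       # side effect happens first
        if offset == crash_before_commit_at:  # simulated failure window
            raise RuntimeError("crashed after processing, before commit")
        commit(offset)                        # offset acknowledged last
```

Reversing the two calls (commit first, then process) gives at-most-once instead; exactly-once additionally needs the processing side effect and the offset commit to share one transaction.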
Smith Hutchings
We've run into a problem with inconsistent data typing from the Database Lookup step between versions 7.1 and 8.2.   Using a PostgreSQL database, any integer/bigint key lookup value typed as Integer in the lookup step is output as a BigNumber in the data stream. This lookup pattern is one we've used consistently throughout our ETL process, and now…
in Data Integration
Wenxing Xu
I want to integrate Kettle into my Java application with Maven, but many dependencies in Repository - Nexus Repository Manager are unreachable, and the Nexus repository returns a 500 status code.     I hope someone can help me with this.   Cheers, Wenxing
in Data Integration
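Kettle artifacts are not on Maven Central, so the Pentaho group repository has to be declared explicitly in the POM. A sketch of the fragment, assuming the historically published repository URL - verify it is still current, since Pentaho's hosting has moved over the years:

```xml
<!-- Assumed Pentaho public group repository; confirm the URL before use. -->
<repositories>
  <repository>
    <id>pentaho-public</id>
    <url>https://nexus.pentaho.org/content/groups/omni/</url>
  </repository>
</repositories>
```

A persistent 500 from the server itself is on the repository side, but a stale or redirected URL in the POM produces very similar symptoms, so it is worth ruling out first.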
Matthew Kennedy
I have a transformation that has been tested locally (Windows environment) and runs successfully. When I promote it to the repository and try to run it from the Pentaho Server, I receive errors about not being able to open the zip entry source stream.   Has anyone seen this before, and do you have tips on fixing it?   The Excel file is on the local file system in…
in Data Integration
Huriye Akbas
Hello,   one of our use cases is to provide a data source for a BI tool, where that data source is supplied by PDI.   Some of the data is in Hadoop and part is in the DWH. If the data needed while a report is being generated is in Hadoop, PDI must extract it from Hadoop; if the data is in the DWH, PDI must pull it from the DWH.   Do…
in Data Integration
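One common way to structure this is a small catalog recording where each dataset currently lives, so a parent job can branch (for example with a Simple Evaluation job entry) to the Hadoop or DWH extraction transformation. A minimal Python sketch of the dispatch; the dataset names and backend labels are invented for illustration:

```python
# Hypothetical routing table: which backend each dataset lives in today.
CATALOG = {"clickstream": "hadoop", "orders": "dwh"}

def pick_source(dataset: str) -> str:
    """Tell the parent job which backend to extract this dataset from."""
    if dataset not in CATALOG:
        raise ValueError(f"unknown dataset: {dataset}")
    return CATALOG[dataset]
```

Keeping the mapping in a table rather than hard-coding it in the job lets datasets migrate between Hadoop and the DWH without editing the ETL.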
Shankar Panda
Hi All,   I need urgent help. My project is entirely in a big data environment dealing with Hive tables, and I have used the Hadoop Copy Files step to place files in the HDFS system.   When I trigger the job directly from Spoon, it works fine: the file is successfully copied to HDFS.   But when I run the same job using Kitchen.bat, the Hadoop…
in Data Integration