Getting error while generating the report on Spark #161
Unanswered
AmiableAnil asked this question in Installation
1 comment · 3 replies
@AmiableAnil Can you please post more details on this issue, such as the complete ingestion spec JSON? Is it being run against Azure cloud storage?
@sowmya-dixit @manjudr The Cloudify team is getting this error when executing the report in Spark. Your inputs would be helpful here.
```json
{"eid":"JOB_LOG","ets":1713958810926,"ver":"3.0","mid":"2D150932122DEE5C768C806C34B5ED81","actor":{"id":"","type":"System"},"context":{"channel":"in.ekstep","pdata":{"id":"AnalyticsDataPipeline","ver":"1.0","pid":"CollectionSummaryJobV2","model":null},"env":"analytics","sid":null,"did":null,"cdata":null,"rollup":null},"object":null,"edata":{"message":"Ingestion Task Id: Map(error -> Could not resolve type id 'static-azure-blobstore' as a subtype of org.apache.druid.data.input.FirehoseFactory: known type ids = [clipped, combining, fixedCount, http, ingestSegment, inline, local, receiver, sql, static-s3, timed] (for POJO property 'firehose')\n at [Source: (org.eclipse.jetty.server.HttpInputOverHTTP); line: 233, column: 17] (through reference chain: org.apache.druid.indexing.common.task.IndexTask[\"spec\"]->org.apache.druid.indexing.common.task.IndexTask$IndexIngestionSpec[\"ioConfig\"]->org.apache.druid.indexing.common.task.IndexTask$IndexIOConfig[\"firehose\"]))","class":"org.sunbird.core.util.CourseUtils","level":"INFO"},"@timestamp":"2024-04-24T11:40:10+00:00"}
```
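For context: the error says Druid could not resolve `static-azure-blobstore` as a `FirehoseFactory` type, and the list of known type ids includes `static-s3` but no Azure entry. That pattern usually means the Druid cluster receiving the ingestion task does not have the Azure extension loaded. A minimal sketch of the relevant setting, assuming a standard Druid deployment (the exact `loadList` contents will differ per cluster):

```properties
# common.runtime.properties on every Druid node (hypothetical minimal example);
# a real deployment will list its other extensions here as well
druid.extensions.loadList=["druid-azure-extensions"]
```

All Druid services would need a restart after changing the extension list; if the extension is already listed, it is worth checking that its jars are actually present in the extensions directory.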
cc: @rhwarrier @mohitga