trait ReportsSourceMetrics extends SparkDataStream
A mix-in interface for SparkDataStream streaming sources to signal that they can report
metrics.
- Annotations
- @Evolving()
- Since
3.2.0
Linear Supertypes
- SparkDataStream
- AnyRef
- Any
Abstract Value Members
- abstract def commit(end: Offset): Unit
Informs the source that Spark has completed processing all data for offsets less than or equal to end and will only request offsets greater than end in the future.
- Definition Classes
- SparkDataStream
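The commit contract above means a source may safely discard any data at or below the committed offset. A minimal sketch of that idea, using hypothetical stand-ins (LongOffset and BufferedSource are illustrative, not Spark classes):

```scala
// Illustrative stand-in for org.apache.spark.sql.connector.read.streaming.Offset.
abstract class Offset { def json(): String }
case class LongOffset(value: Long) extends Offset {
  override def json(): String = value.toString
}

// Hypothetical source that buffers records in memory, keyed by offset.
class BufferedSource {
  private var buffer: Map[Long, String] = Map(1L -> "a", 2L -> "b", 3L -> "c")

  // Once Spark commits `end`, offsets <= end will never be requested again,
  // so the buffered data for them can be released.
  def commit(end: Offset): Unit = {
    val e = end.asInstanceOf[LongOffset].value
    buffer = buffer.filter { case (off, _) => off > e }
  }

  def buffered: Set[Long] = buffer.keySet
}
```

After `commit(LongOffset(2L))`, only the record at offset 3 remains buffered.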
- abstract def deserializeOffset(json: String): Offset
Deserialize a JSON string into an Offset of the implementation-defined offset type.
- Definition Classes
- SparkDataStream
- Exceptions thrown
IllegalArgumentException if the JSON does not encode a valid offset for this reader
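Because the offset type is implementation-defined, each source pairs its own json() encoding with a matching deserializeOffset. A hedged sketch of such a round-trip, with LongOffset and OffsetCodec as illustrative stand-ins rather than Spark APIs:

```scala
// Illustrative stand-in for Spark's Offset; real sources define their own type.
abstract class Offset { def json(): String }
case class LongOffset(value: Long) extends Offset {
  override def json(): String = value.toString
}

object OffsetCodec {
  // Rebuild the implementation-defined offset from its JSON form, throwing
  // IllegalArgumentException for input this reader cannot decode.
  def deserializeOffset(json: String): Offset =
    try LongOffset(json.trim.toLong)
    catch {
      case _: NumberFormatException =>
        throw new IllegalArgumentException(
          s"Not a valid offset for this reader: $json")
    }
}
```

The invariant to preserve is `deserializeOffset(o.json()) == o` for every offset the source can produce.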
- abstract def initialOffset(): Offset
Returns the initial offset for a streaming query to start reading from. Note that the streaming data source should not assume that it will start reading from its initial offset: if Spark is restarting an existing query, it will restart from the checkpointed offset rather than the initial one.
- Definition Classes
- SparkDataStream
- abstract def metrics(latestConsumedOffset: Optional[Offset]): Map[String, String]
Returns the metrics reported by the streaming source with respect to the latest consumed offset.
- latestConsumedOffset
the end offset (exclusive) of the latest triggered batch.
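A possible shape for a metrics() implementation is sketched below. The metric names ("latestConsumedOffset", "estimatedBacklog") and the backlog accounting are hypothetical choices for illustration, not Spark-defined keys; the real method returns a java.util.Map of string pairs that Spark surfaces in query progress reports.

```scala
import java.util.Optional
import scala.jdk.CollectionConverters._

// Illustrative stand-in for Spark's Offset type.
abstract class Offset { def json(): String }
case class LongOffset(value: Long) extends Offset {
  override def json(): String = value.toString
}

// Hypothetical source that knows its latest available offset and reports
// how far behind the query's latest consumed offset is.
class MeteredSource(latestAvailable: Long) {
  def metrics(latestConsumedOffset: Optional[Offset]): java.util.Map[String, String] = {
    val consumed =
      if (latestConsumedOffset.isPresent)
        latestConsumedOffset.get.asInstanceOf[LongOffset].value
      else 0L
    Map(
      "latestConsumedOffset" -> consumed.toString,
      "estimatedBacklog"     -> (latestAvailable - consumed).toString
    ).asJava
  }
}
```

Note that latestConsumedOffset is the exclusive end offset of the latest triggered batch, and may be empty before any batch has completed, which is why the parameter is an Optional.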
- abstract def stop(): Unit
Stop this source and free any resources it has allocated.
- Definition Classes
- SparkDataStream
Concrete Value Members
- final def !=(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def ##: Int
- Definition Classes
- AnyRef → Any
- final def ==(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def asInstanceOf[T0]: T0
- Definition Classes
- Any
- def clone(): AnyRef
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.CloneNotSupportedException]) @native()
- final def eq(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- def equals(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef → Any
- def finalize(): Unit
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.Throwable])
- final def getClass(): Class[_ <: AnyRef]
- Definition Classes
- AnyRef → Any
- Annotations
- @native()
- def hashCode(): Int
- Definition Classes
- AnyRef → Any
- Annotations
- @native()
- final def isInstanceOf[T0]: Boolean
- Definition Classes
- Any
- final def ne(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- final def notify(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
- final def notifyAll(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
- final def synchronized[T0](arg0: => T0): T0
- Definition Classes
- AnyRef
- def toString(): String
- Definition Classes
- AnyRef → Any
- final def wait(): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
- final def wait(arg0: Long, arg1: Int): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
- final def wait(arg0: Long): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException]) @native()