Samuel Lissner (slissner)

Failed to instantiate [ch.qos.logback.classic.LoggerContext]
Reported exception:
ch.qos.logback.core.LogbackException: Failed to initialize or to run Configurator: ch.qos.logback.classic.util.DefaultJoranConfigurator
    at ch.qos.logback.classic.util.ContextInitializer.invokeConfigure(ContextInitializer.java:133)
    at ch.qos.logback.classic.util.ContextInitializer.autoConfig(ContextInitializer.java:103)
    at ch.qos.logback.classic.util.ContextInitializer.autoConfig(ContextInitializer.java:66)
    at ch.qos.logback.classic.spi.LogbackServiceProvider.initializeLoggerContext(LogbackServiceProvider.java:52)
    at ch.qos.logback.classic.spi.LogbackServiceProvider.initialize(LogbackServiceProvider.java:41)
    at org.slf4j.LoggerFactory.bind(LoggerFactory.java:199)
    at org.slf4j.LoggerFactory.performInitialization(LoggerFactory.java:186)
slissner / aws-cdk-permissions.json
Created August 27, 2024 09:23
AWS CDK IAM Permissions
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "StsAccess",
      "Effect": "Allow",
      "Action": [
        "sts:AssumeRole",
        "iam:*Role*"
      ],
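The preview cuts off after the Action list. A statement of this shape would normally close with a Resource scope and the matching brackets; the wildcard Resource below is an illustrative assumption, not the gist's actual content:

      "Resource": "*"
    }
  ]
}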
slissner / BarChart.jsx
Created December 21, 2017 11:13
Bar Chart (d3+react) – the flawed approach
// @flow
import React from 'react';
import {descending, max} from "d3-array";
import {entries} from "d3-collection";
import {select} from "d3-selection";
import {scaleLinear} from "d3-scale";
import "./BarChart.scss";
import ChartTooltip from "./components/ChartTooltip";
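Only the imports survive in the preview. Judging from the d3-selection import (which the "right way" version below drops), the flawed approach presumably lets D3 build the SVG behind React's back. A hypothetical reconstruction of that pattern, with all names and props assumed (tooltip wiring omitted):

class BarChart extends React.Component {
  componentDidMount() { this.renderChart(); }
  componentDidUpdate() { this.renderChart(); } // re-runs the D3 code on every update

  renderChart() {
    const {data, width} = this.props;
    const items = entries(data);
    const x = scaleLinear()
      .domain([0, max(items, d => d.value)])
      .range([0, width]);
    // D3 mutates DOM nodes that React knows nothing about, so React's
    // reconciliation and D3's selections can silently conflict.
    const bars = select(this.node).selectAll('rect').data(items);
    bars.enter().append('rect')
      .merge(bars)
      .attr('y', (d, i) => i * 20)
      .attr('height', 18)
      .attr('width', d => x(d.value));
    bars.exit().remove();
  }

  render() {
    // React renders an empty <svg>; everything inside it is invisible to React.
    return <svg ref={node => { this.node = node; }} width={this.props.width} height={this.props.height} />;
  }
}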
slissner / BarChart.jsx
Created December 21, 2017 11:11
Bar Chart (d3+react) – the right way
// @flow
import React from 'react';
import {descending, max} from "d3-array";
import {entries} from "d3-collection";
import {scaleLinear} from "d3-scale";
import "./BarChart.scss";
import ChartTooltip from "./components/ChartTooltip";
type Props = {
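The preview stops at the opening of the Props type. Note this version drops d3-selection entirely: the usual "right way" is to use D3 only for the layout math and let React render the SVG. A plausible continuation, with field names and the component body assumed (tooltip wiring omitted):

  data: {[key: string]: number},
  width: number,
  height: number,
};

// D3 computes scales and ordering; React owns the DOM.
const BarChart = ({data, width, height}: Props) => {
  const items = entries(data).sort((a, b) => descending(a.value, b.value));
  const x = scaleLinear()
    .domain([0, max(items, d => d.value)])
    .range([0, width]);
  return (
    <svg width={width} height={height}>
      {items.map((d, i) => (
        <rect key={d.key} y={i * 20} height={18} width={x(d.value)} />
      ))}
    </svg>
  );
};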
slissner / kafka_spark_integration_06.scala
Created November 9, 2017 10:11
Integrating Kafka with Spark Streaming - Example 6
stream.foreachRDD { rdd =>
  val offsetRanges = rdd.asInstanceOf[HasOffsetRanges].offsetRanges
  // DO YOUR STUFF with DATA
  stream.asInstanceOf[CanCommitOffsets].commitAsync(offsetRanges)
}
slissner / kafka_spark_integration_05.scala
Created November 9, 2017 10:10
Integrating Kafka with Spark Streaming - Example 5
streamingContext.start()
streamingContext.awaitTermination()
slissner / kafka_spark_integration_04.scala
Created November 9, 2017 10:08
Integrating Kafka with Spark Streaming - Example 4
stream.map(record => record.value)
  .flatMap(line => line.split("[ ,\\.;:\\-]+"))
  .map(word => word.toLowerCase)
  .filter(_.size > 0)
  .map(word => (word, 1))
  .reduceByKey(_ + _)
  .repartition(1)
  .transform(rdd => rdd.sortBy(-_._2))
  .saveAsTextFiles("./output/words")
slissner / kafka_spark_integration_03.scala
Created November 9, 2017 10:08
Integrating Kafka with Spark Streaming - Example 3
val stream = KafkaUtils.createDirectStream[String, String](
  streamingContext,
  PreferConsistent,
  Subscribe[String, String](topics, kafkaParams)
)
slissner / kafka_spark_integration_02.scala
Created November 9, 2017 10:07
Integrating Kafka with Spark Streaming - Example 2
val topics = Array("text")
slissner / kafka_spark_integration_01.scala
Created November 9, 2017 10:06
Integrating Kafka with Spark Streaming - Example 1
val kafkaParams = Map[String, Object](
  "bootstrap.servers" -> "localhost:9092",
  "key.deserializer" -> classOf[StringDeserializer],
  "value.deserializer" -> classOf[StringDeserializer],
  "group.id" -> "kafka_demo_group",
  "auto.offset.reset" -> "earliest",
  // must be false: Example 6 commits offsets manually via commitAsync
  "enable.auto.commit" -> (false: java.lang.Boolean)
)
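The series never shows the imports or the StreamingContext that Examples 3, 5 and 6 rely on. A minimal setup sketch, assuming Spark 2.x with the spark-streaming-kafka-0-10 artifact (app name, master and batch interval are illustrative):

import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe
import org.apache.spark.streaming.kafka010.{CanCommitOffsets, HasOffsetRanges}

// One-second micro-batches; local[*] runs the demo on a single machine.
val conf = new SparkConf().setAppName("kafka_demo").setMaster("local[*]")
val streamingContext = new StreamingContext(conf, Seconds(1))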