Words
-
Tesla interview for senior software engineer role
It has been a two-month journey with my recent interview process at Tesla Berlin, for a senior software engineer role. I received the initial call at the end of February and went through rounds with HR and the hiring manager until April 20th. Each round took about two weeks, and for each one I had to sign a new NDA with a different HR interview coordinator. The process was cancelled just before the last technical interview with another hiring manager, even though that interview had been confirmed a week earlier. I had been practicing DSA daily, but the email I received from HR made it clear that hiring is currently frozen, due to the Tesla layoff situation.
-
Recap for advanced Python training from David Beazley
Back in August 2022 I attended an extensive training by Python expert David Beazley, but I never really went back to review what I had learnt. Now, sitting in a quiet coffee shop in Helsinki, I am restructuring this article. As I wrote previously, I like words with depth, built from curiosity, so I like using words to understand the universe, for they are the language of the creator. Maybe this is also why I like programming: it keeps me wondering at the beauty of mathematics, philosophy, and logic. I still keep that sense of wonder.
-
The Journey
I looked out the window of the coffee shop while waiting for my food, and in that moment I felt the world is so amazing and beautiful. In that moment I was grateful to tell myself that I am glad for where I have travelled. I can call it a journey I had to travel. It was a journey, indeed. When I came back home, I read the poem The Journey by Mary Oliver. I have been a fan of hers for some time, but until today, reading this poem, I had never felt so connected. I knew poetry always connects to the soul, a gift for the human spiritual world. But today I deeply feel poetry is a way of living, the life we live for words.
-
Reactjs component - multi-select
Reactjs is truly FP-style. To accommodate the new business logic I had to redesign the codebase, making it more modular, with the new changes placing logic at the component level. During the weekend I wanted to implement the TF-IDF algorithm in the search component; on the backend I will use the Haystack library, while the same result can be achieved on the client via the react-select library. I will still use DRF to build the communication bridge between the client and server. I will illustrate this with high-level code, abstractly, without details.
-
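The TF-IDF scoring mentioned in the entry above can be sketched in plain Python. This is only an illustrative toy, not Haystack's actual implementation: the tiny corpus and the smoothed idf formula are my own assumptions.

```python
import math

def tf_idf(term, doc, corpus):
    # term frequency: how often the term appears in this document
    tf = doc.count(term) / len(doc)
    # inverse document frequency: down-weight terms common across the corpus
    # (the +1 smoothing terms are one common variant, chosen for the sketch)
    n_docs_with_term = sum(1 for d in corpus if term in d)
    idf = math.log(len(corpus) / (1 + n_docs_with_term)) + 1
    return tf * idf

# Hypothetical mini-corpus of tokenized documents
corpus = [
    ["kafka", "streaming", "etl"],
    ["react", "select", "component"],
    ["react", "django", "project"],
]

# "select" appears in only one document, so it should outscore
# "react", which appears in two.
score_select = tf_idf("select", corpus[1], corpus)
score_react = tf_idf("react", corpus[1], corpus)
```

A real search component would rank documents by summing these scores over the query terms.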
Reactjs in Django project
I haven't really written anything for more than a year, but I learnt a life lesson: Singapore taught me a lot in the subject of life, a course I had missed. Moving on to this post, which is about the Reactjs project I'm working on as part of my Django project. Combining these two tech stacks in one full-stack project has been an interesting experience, and I felt joy here and learnt much along the way. Who said life is not like that?
-
Syntax and Semantics
As for the reason and the why behind writing this one, I feel I have a bit of an idea but don't yet know what it is. Maybe the key point I want to express is that writing code should manage components well, putting the right components in the right place. I started picking up syntax and semantics recently, and honestly the topic attracts me, for it relates to logic, mathematics, computer science, philosophy, linguistics, and perhaps other fields. Like art meeting science, the beauty lives in the relation between them.
-
_Resolution
Time flies; if I don't pay attention, I miss it. 2020 was an interesting year, an adventure. I taught in a coding community for two months over the summer and solo-biked 400 KM to Mogan Mountain. If you ask what I accomplished on my 2020 resolution list, these two are the only items out of ten.
-
Apache Kafka Developers training - module 2
The second module of the Kafka Developer Training is about Kafka architecture, focusing on the configuration and strategy of partitioning and replication. Partitioning data is good for performance but not for reliability; in a distributed system you should apply both partitions and replicas, for scalability and reliability together.
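The per-key routing idea behind partitioning can be sketched in a few lines of Python. This is a simplified stand-in: Kafka's default partitioner actually murmur2-hashes the key bytes, while the sketch below uses md5 only so the example is deterministic and dependency-free.

```python
import hashlib

def assign_partition(key: str, num_partitions: int) -> int:
    # Simplified stand-in for Kafka's default partitioner: hash the
    # key bytes, then modulo the partition count. The same key always
    # lands on the same partition, preserving per-key ordering.
    digest = hashlib.md5(key.encode("utf-8")).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions

# Two messages with the same key go to the same partition,
# so a consumer reads them in order.
p1 = assign_partition("device-42", 6)
p2 = assign_partition("device-42", 6)
```

Replication then copies each partition to other brokers, which is where the reliability comes from.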
-
Apache Kafka Developers training - module 1
I'm preparing for the Kafka Developer Certification, so I'm taking the Confluent training Confluent Developer Skills for Building Apache Kafka. I write notes down because they're a useful resource: whenever I want to refresh my memory on a Kafka topic I learnt, I can come back here to review. In the meantime, it gives me motivation to record this developer training.
-
Growth in _fear
It's been a bit more than a week since my bike ride, 400 KM in total from Shanghai to Mogan Mountain. It was the kind of journey I always look for: ride far away, then bring back a filtered me. I give this activity a verb, purify, and I know I need it to fill my human heart. At the end of each day I am not the machine I work on; I am human, broken and flawed, emotional… but all that happened is simply true, and therefore beautiful.
-
Back to learning
Every time I reach a stale stage, without a clear idea to motivate myself into action, I come back to watch the documentary The Story of Aaron Swartz, for at the beginning of teaching myself programming, Swartz's story was the source of spirit that motivated me. His words can always enlighten me from different sides and depths. This time, back to the simple function call.
-
Monads, the typeclass
Monads attract me for their abstraction and logical reasoning, so I started to dig in by studying type and category theory, reading the book Category Theory for Programmers (not finished yet). Now, listening to the Monads song from Dylan Beattie, I think it may be high time to review monads and see what new things have mixed in my brain from that rabbit hole.
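As a quick refresher, the core of the typeclass, unit to wrap a value and bind to chain computations that may fail, can be sketched as a toy Maybe in Python. The names and structure below are illustrative, not from a real library:

```python
class Maybe:
    """A toy Maybe monad: wraps a value that may be absent (None)."""

    def __init__(self, value):
        self.value = value

    @staticmethod
    def unit(value):
        # aka pure / return: lift a plain value into the monad
        return Maybe(value)

    def bind(self, f):
        # aka flatMap / >>=: apply f if a value is present,
        # otherwise short-circuit and propagate the absence
        if self.value is None:
            return self
        return f(self.value)

def safe_div(x):
    # A computation that can fail: division by zero yields "nothing"
    return Maybe(10 / x) if x != 0 else Maybe(None)

result = Maybe.unit(5).bind(safe_div)                  # holds 2.0
nothing = Maybe.unit(0).bind(safe_div).bind(safe_div)  # stays empty
```

The point of the abstraction is that the second chain never raises: once a step fails, every later bind just passes the empty value along.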
-
Short thoughts for Scala!
I read a new article posted today, The Death of Hype: What's Next for Scala, from the active Scala developer Haoyi, who is generally good at Python and Java too. In the article he mentions that Scala's hype is gone; on the other hand, as he says, maybe that is a sign of a language becoming mature. To me Scala is beautiful and terse, for its sugared syntax, its mathematical logic, and its use in the big data system Spark. I have only used Spark with Python, through its Python API; using it with Scala would be cooler and, in general, better for understanding distributed systems.
-
Scala pattern - Monad is everywhere!
Since I fell in love with Scala, for its pure functionality and the way it expresses both FP and OOP, I can't stop looking further into this language. That further look leads me a tiny step into the space of FP, not in the way of lambdas, closures, and callbacks as in Python or JavaScript, but mathematically, where it shines with logic, imagination, and the universal language we are speaking.
-
Scala Pattern (aka. practice) - Part 1
For learning, logic is very important; it needs thinking, not memorization. So if I can't understand something, I drop it and redo it, but I do not memorize it. Therefore I am writing this blog about Scala based on what I understand so far, especially after watching 12 steps to better Scala. I plan to explain a few tips for writing better Scala through the toy code below. My goal for Scala is to experiment with derivative learning: to practice things in Scala based on similar things I have used in Python. I hope this can bootstrap the learning and help structure my knowledge. (One reason: I strongly feel programming is a domain full of rules, aka patterns; only once you know enough of them, or make peace with the strict rules, is there free space to be creative. Yes, freedom comes with discipline, a hard lesson learnt with pain.)
-
Polymorphism in Scala
Polymorphism is a key part of OOP when we use classes; as we all know, a class serves two purposes in software programming: 1) constructing values (instances), and 2) defining a data type. One good explanation from SO summarizes polymorphism as the ability (in programming) to present the same interface for differing underlying forms (data types). In OOP terms, we lift the “verb”, the shared behavior of a group of subjects, be it class A, B, C, or D, into one abstract object (interface), then implement it concretely in each building block, so the variables or functions applied can take many different forms. It is one way to follow the DRY principle and reuse code, which is high-level abstraction.
-
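The "same interface, many forms" idea from the entry above can be sketched in Python with an abstract base class; the shape classes below are the usual textbook illustration, not from any particular project:

```python
from abc import ABC, abstractmethod

class Shape(ABC):
    """The abstract interface: every shape presents the same 'verb'."""

    @abstractmethod
    def area(self) -> float: ...

class Circle(Shape):
    def __init__(self, r):
        self.r = r

    def area(self):
        return 3.14159 * self.r ** 2

class Square(Shape):
    def __init__(self, side):
        self.side = side

    def area(self):
        return self.side ** 2

def total_area(shapes):
    # One function handles many forms: the caller never checks types,
    # it just calls the shared interface on each object.
    return sum(s.area() for s in shapes)

areas = total_area([Circle(1), Square(2)])
```

This is the DRY payoff: `total_area` is written once and works for any future `Shape` subclass.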
Python lru_cache
I just read and was inspired by this Medium article, Every Python Programmer Should Know Lru_cache From the Standard Library, which uses a caching function to speed up Python code. The concept is similar to the Redis case: save the first query result to a dictionary as key/value pairs (or pre-define the keys), store it on a Redis database as a cache layer in memory, and set a TTL and other parameters. The purpose is to save execution time when the same result is needed more than once; on the second call the result is looked up in the cache before hitting the main database, which saves run time on high-volume datasets and reduces both time and money (make the code cheap, not expensive).
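A minimal example of the decorator with the classic recursive Fibonacci function, which is slow without memoization and fast with it:

```python
from functools import lru_cache

@lru_cache(maxsize=None)  # unbounded cache; use a maxsize to cap memory
def fib(n):
    # Without the cache this recursion is exponential; with it,
    # each n is computed once and later calls are dictionary lookups.
    return n if n < 2 else fib(n - 1) + fib(n - 2)

value = fib(30)
hits = fib.cache_info().hits  # how many calls were served from cache
```

`cache_info()` is handy for checking that the cache is actually being used, much like watching hit rates on a Redis instance.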
-
Avro and Kafka REST for ETL - Part 2
Applying the REST Proxy and the Avro Schema Registry is good for anything front-facing, be it an app or any API rendering data. So we use Apache Avro as the schema writer, to encode and decode messages between the Kafka producer and consumer(s). Avro differs from JSON: it carries a schema and is binary-encoded, yet flexible and easy to use. At the least, it is the schema format Confluent highly recommends. The key and value schemas that can be used to register the schema:
- topic-key: a unique integer.
- topic-value: the streamed message result, part of the payload structure.
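As a sketch, the key/value schema pair above might look like this in Python; the Purchase record and its field names are hypothetical, chosen only to show the shape of an Avro schema:

```python
import json

# topic-key: a unique integer, so the key schema is just a primitive
key_schema = {"type": "int"}

# topic-value: a record describing the streamed message payload
# (record name and fields are illustrative, not from a real project)
value_schema = {
    "type": "record",
    "name": "Purchase",
    "fields": [
        {"name": "id", "type": "int"},
        {"name": "item", "type": "string"},
        {"name": "amount", "type": "double"},
    ],
}

# The schema registry stores schemas as JSON strings
payload = json.dumps(value_schema)
```

An Avro serializer would use these schemas to binary-encode each message, which is what makes the format compact compared to raw JSON.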
-
Avro schema registry with Kafka REST - Part 1
I plan to make a separate article on Avro and its usage with Kafka REST, which I didn't know before. I will set up and run a simplified exploration so that I can understand it. Confluent's schema-registry provides a RESTful interface for storing and retrieving Apache Avro schemas, while the Kafka REST Proxy is an HTTP-based proxy for your Kafka cluster. Goal: use the schema registry to store all the schemas, to send and receive messages in Python using Avro.
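A sketch of what registering a schema over the registry's RESTful interface looks like. The subject name and URL below are illustrative, and the actual HTTP call is left as a comment since it needs a running registry:

```python
import json

# Schema Registry registers schemas under a "subject", conventionally
# "<topic>-value" or "<topic>-key"; this subject name is hypothetical.
subject = "purchases-value"
url = f"http://localhost:8081/subjects/{subject}/versions"

schema = {
    "type": "record",
    "name": "Purchase",
    "fields": [{"name": "item", "type": "string"}],
}

# The registry expects the Avro schema embedded as a JSON string
# inside a JSON body.
body = json.dumps({"schema": json.dumps(schema)})
headers = {"Content-Type": "application/vnd.schemaregistry.v1+json"}

# With a live registry, a POST would register it, e.g.:
# requests.post(url, data=body, headers=headers)
```

Once registered, producers and consumers fetch the schema by id instead of shipping it with every message.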
-
a letter to programming souls
I have been more active this year, for I want to get involved in the community, the place I came from. I came to programming the way many mates call 'self-taught'; yes, I was, I still am, and I will be.
-
life is a journey
Life is not about happiness, but about something above it: exploring human capacity and feeling the full range of emotions with the bodies and functions we are all gifted with.
Walking far from somewhere with dirty feet brings back a wonderful story; getting your hands dirty to build a craft is truly the bliss behind the surface of convenient happiness. I walked into nature after solo biking 200 KM this week. Nature itself is the perfect algorithm, but an individual life passes by in a short span. If we have been given so much by the universe, maybe our life in this world is for giving. I believe it. Giving is not the comfort zone, for we must create value that can be given back to the universe. It will be a hard and uncomfortable path, but a worthy, exploring journey, which is true happiness indeed.
-
How to use Kafka for ETL
I chose Apache Kafka partly because it provides a user-friendly Python API, integrates easily with many other tools via Kafka Connect, and is sophisticated enough for big data integration. I was confused when I tested the database connection both through Python code and through Kafka Connect. Kafka Connect is hard to set up the first time and requires a strict schema registry; otherwise it breaks the 'contract' between the schema reads and writes that producers and consumers request. Though both the Python DB connection and Kafka Connect work, a friend who taught me a few big data cases last year suggested that Kafka Connect is the industrial approach and can make life easier. So I will demonstrate both approaches for streaming integration.
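The produce, transform, consume shape of the streaming ETL can be sketched with a stdlib queue standing in for a Kafka topic. This is only the shape of the flow, not real Kafka code: no broker, partitions, or offsets are involved.

```python
import queue

# queue.Queue plays the role of a Kafka topic in this sketch,
# so the pipeline shape is visible without a running broker.
topic = queue.Queue()

def produce(records):
    # Extract: push raw records onto the "topic"
    for r in records:
        topic.put(r)

def consume_and_transform():
    # Transform + Load: drain the "topic", reshaping each record
    out = []
    while not topic.empty():
        raw = topic.get()
        out.append({"name": raw["name"].upper()})  # the "T" in ETL
    return out

produce([{"name": "ada"}, {"name": "grace"}])
loaded = consume_and_transform()
```

With real Kafka, the same roles are played by a producer client, a consumer (or a Kafka Connect sink), and the broker in between.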
-
_2020 Resolutions
When I was a kid, I liked making dreams and resolutions far off in time, like wanting to become a fashion designer at twenty-ish. Hmm, that is far away, because as life goes on, a lot of real dreams, or things that suit me better, are actually found along the way, after I explore and test. So I no longer plan far ahead; I just keep one big goal I know I will devote my whole lifetime to, and then keep doing and practicing.