The advent of transformer-based models has revolutionized the field of artificial intelligence, ushering in a new era of powerful and efficient AI systems. Transformers, introduced in the 2017 paper "Attention Is All You Need", have become the backbone of many modern AI applications, including natural language processing (NLP) and computer vision. These models leverage self-attention mechanisms to process data more effectively, enabling them to handle complex tasks such as language translation, text summarization, and image recognition with remarkable accuracy.
There’s no escaping the buzz around AI (artificial intelligence) these days. And yet, barely a couple of years before the pandemic, Big Tech was relying heavily on human input to fine-tune its models, a challenge often called the last-mile problem of AI. Amazon, for example, was using its infamous Mechanical Turk to crowdsource cheap human labor for tasks like verifying AI outputs or labeling data.
Quick commerce has taken the world by storm! Move over, traditional retailers: welcome the new kids of quick commerce and fast fashion! The relatively stable global e-commerce landscape is being disrupted by Chinese quick-commerce companies like Shein, AliExpress, Temu, and TikTok, all of which are leveraging agile and lean methodologies to deliver their products with unprecedented speed and efficiency.
Let’s look at how a lean, agile, product-management mindset is driving this digital transformation in retail.
With the rapid pace of change in the technology world, it sometimes makes sense to rethink our objectives and outline a roadmap instead of just chasing the new and shiny. Data Mesh might seem like a relatively new concept; however, what its creator, Zhamak Dehghani, has done is spell out in words the approach required to reach the strategic goal of becoming a data-driven organization.
Figure: Data Mesh domain product notation, adapted from Zhamak Dehghani
The technology world often sees upheavals when disparate concepts are combined to achieve new objectives, creating something much greater than the sum of its parts. Delta Lake is one such concept: it melds the transaction log with ACID guarantees, bringing transactions and atomicity into the ETL/analytics/big-data field and creating a revolution of sorts.
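To make the log-plus-ACID idea concrete, here is a minimal conceptual sketch (not Delta Lake's actual implementation) of a transaction log: data files are immutable, each commit is a numbered JSON file listing the files added or removed, and the commit file itself is written atomically via write-then-rename. The `commit` and `live_files` names and the JSON layout are invented for illustration.

```python
# Minimal transaction-log sketch. Each commit is a numbered JSON file in
# log_dir; because the commit file appears atomically (temp file + rename),
# a multi-file change is visible to readers all at once or not at all.
import json
import os
import tempfile

def commit(log_dir, version, actions):
    """Atomically record a commit; actions look like {"add": "part-1.parquet"}."""
    os.makedirs(log_dir, exist_ok=True)
    final = os.path.join(log_dir, f"{version:020d}.json")
    fd, tmp = tempfile.mkstemp(dir=log_dir)
    with os.fdopen(fd, "w") as f:
        json.dump(actions, f)
    os.rename(tmp, final)  # atomic on POSIX: the commit lands whole or not at all

def live_files(log_dir):
    """Replay commits in version order to find the current set of data files."""
    files = set()
    for name in sorted(os.listdir(log_dir)):
        if not name.endswith(".json"):
            continue
        with open(os.path.join(log_dir, name)) as f:
            for action in json.load(f):
                if "add" in action:
                    files.add(action["add"])
                if "remove" in action:
                    files.discard(action["remove"])
    return files
```

Because readers only ever consult the log, a writer can stage as many new data files as it likes; nothing becomes visible until the single rename that publishes the commit.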
The problem(s): Since the days of traditional data warehousing, the design and modeling of analytics systems have relied on denormalized tables, because analytics systems were considered separate from transactional systems. This started to change with the move to the cloud and the availability of more real-time data. With the advent of big-data technology like HDFS/Hadoop, updating and storing relational datasets became even more constrained because of the performance cost. The difficulty was particularly acute for cloud customers, who faced additional latency compared to on-premises HDFS/Hadoop users.
GDPR compliance meant that deleting or correcting a few customer records required massive table-wide rewrites, with an increased probability of data corruption and consistency issues if an update crashed partway through.
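The copy-on-write pattern is what makes such deletes safe: rewrite only the affected data file without the targeted rows, then swap the old file for the new one in a single commit. The sketch below is hypothetical (JSON-lines data files and an invented `delete_customer` helper, not Delta Lake's API), but it shows the key property: if the job crashes before the commit file is written, readers still see the old file, so the table is never half-updated.

```python
# Hypothetical copy-on-write delete. Rows live in immutable JSON-lines
# files; erasing one customer writes a fresh file with the surviving rows
# and records an add/remove pair in a single numbered commit file.
import json
import os

def delete_customer(data_dir, log_dir, old_file, customer_id, version):
    # Read all rows from the file that contains the customer's records.
    with open(os.path.join(data_dir, old_file)) as f:
        rows = [json.loads(line) for line in f]
    kept = [r for r in rows if r["customer_id"] != customer_id]
    # Copy-on-write: surviving rows go into a brand-new file; the old
    # file is never modified in place.
    new_file = f"part-{version}.json"
    with open(os.path.join(data_dir, new_file), "w") as f:
        for r in kept:
            f.write(json.dumps(r) + "\n")
    # One commit swaps the files. Until it lands, the old file is the
    # live one, so a crash here leaves the table consistent.
    with open(os.path.join(log_dir, f"{version:020d}.json"), "w") as f:
        json.dump([{"add": new_file}, {"remove": old_file}], f)
    return new_file
```

The old file can then be garbage-collected later (after a retention window), which is also when the customer's bytes physically disappear from storage.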