News Aggregator

Semi-Supervised Learning: How To Overcome the Lack of Labels
Aggregated on: 2024-08-07 21:07:53

All successfully implemented machine learning models are backed by at least two strong components: data and model. In my discussions with ML engineers, I heard many times that, instead of spending a significant amount of time on data preparation, including labeling for supervised learning, they would rather spend their time on model development. For most problems, labeling huge amounts of data is far more difficult than obtaining it in the first place. Unlabeled data fails to provide the desired accuracy during training, and labeling huge datasets for supervised learning can be time-consuming and expensive. What if the data labeling budget were limited? What data should be labeled first? These are just some of the daunting questions facing ML engineers who would rather be doing productive work instead.

View more...

10 Kubernetes Cost Optimization Techniques
Aggregated on: 2024-08-07 20:07:53

These are 10 strategies for reducing Kubernetes costs. We’ve split them into pre-deployment, post-deployment, and ongoing cost optimization techniques to help people at the beginning and middle of their cloud journeys, as well as those who have fully adopted the cloud and are just looking for a few extra pointers. So, let’s get started.

View more...

Docker vs. Podman: Exploring Container Technologies for Modern Web Development
Aggregated on: 2024-08-07 19:07:52

Docker and Podman are among the most widely used containerization technologies in software development. Examining their use cases, benefits, and limitations, this article offers a thorough comparison of Docker and Podman. We will also go over practical cases of deploying web apps with both technologies, highlighting important commands and considerations for producing container images.

Introduction
Containerization has become an essential technique for creating, transporting, and executing applications with unmatched uniformity across various computing environments. Docker, a pioneer in this field, has transformed software development practices by introducing developers to the capabilities and adaptability of containers. This technology packages an application and all its necessary components into a self-contained unit, providing consistent functionality regardless of variations in development, staging, and production environments.

View more...

How To Check and Update Newer Versions for Dependencies in Maven Projects
Aggregated on: 2024-08-07 18:07:52

Over time, new versions of dependencies are released. We need to update the respective dependency versions in the project, as these versions bring new changes and fixes for security vulnerabilities. It is better to update the dependencies in the project frequently. Now the question arises:

View more...

How You Can Avoid a CrowdStrike Fiasco
Aggregated on: 2024-08-07 17:07:52

By now we've all heard about — or been affected by — the CrowdStrike fiasco. If you haven't, here's a quick recap. An update to the CrowdStrike Falcon platform, pushed on a Friday afternoon, caused computers to crash and be unbootable. The update was pushed to all customers at once, and the only way to recover was to boot into "Safe Mode" and uninstall the update. That often required direct physical access to the affected computer, making recovery times even longer.
View more...

Introduction to Salesforce Batch Apex [Video]
Aggregated on: 2024-08-07 16:07:52

Salesforce Batch Apex is a powerful tool for handling large data volumes and complex data processing tasks asynchronously. This tutorial will walk you through the core concepts and practical applications of Batch Apex in Salesforce, including the structure of an Apex batch class, writing unit tests for batch classes, scheduling batch classes, running batch classes ad hoc for testing, and understanding Batch Apex limits.

Structure of an Apex Batch Class
An Apex Batch class in Salesforce must implement the Database.Batchable interface. This interface requires the implementation of three methods:

View more...

SQL Interview Preparation Series: Mastering Questions and Answers Quickly
Aggregated on: 2024-08-07 15:07:52

Welcome to this lesson of our "SQL Interview Preparation Series: Mastering Questions and Answers Quickly!" Throughout this series, we aim to help you get ready for SQL interviews by delving into different topics. Today we're exploring the core differences between SQL and NoSQL databases, a subject that comes up in almost any data-focused job interview.

Understanding SQL and NoSQL
Relational databases, commonly referred to as SQL databases, are designed to handle structured data. They adhere to a predefined schema, which makes them well-suited for situations where data integrity and consistency are crucial. On the other side, NoSQL databases offer flexibility and scalability by managing unstructured or semi-structured data and adapting to dynamic, rapidly changing information. They are widely used in web applications and social media platforms.

View more...

Enhancing Java Application Logging: A Comprehensive Guide
Aggregated on: 2024-08-07 14:52:53

Logging is crucial for monitoring and debugging Java applications, particularly in production environments. It provides valuable insights into application behavior, aids in issue diagnosis, and ensures smooth operation. This article will walk you through creating effective logs in Java applications, emphasizing three key aspects:

- Logging Good Information
- Creating Trackable Logs
- Ensuring Security and Avoiding Data Breaches

We will use the java.util.logging package for demonstration, but these principles apply to any logging framework.

View more...
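To make those three aspects concrete, here is a minimal, hedged sketch using java.util.logging; the OrderService class, order ID, and masking helper are illustrative placeholders rather than code from the article itself.

```java
import java.util.logging.Level;
import java.util.logging.Logger;

public class OrderService {
    private static final Logger LOGGER = Logger.getLogger(OrderService.class.getName());

    public void placeOrder(String orderId, String cardNumber) {
        // Good information: log the business event with an identifier that makes it trackable.
        LOGGER.log(Level.INFO, "Placing order {0}", orderId);
        try {
            charge(cardNumber);
        } catch (RuntimeException e) {
            // Trackable: reuse the same orderId so the failure can be correlated with the request.
            // Secure: never log the raw card number; log a masked form instead.
            LOGGER.log(Level.SEVERE, "Payment failed for order " + orderId
                    + " (card " + mask(cardNumber) + ")", e);
            throw e;
        }
    }

    private static String mask(String cardNumber) {
        // Keep only the last four digits in the log output.
        return "****" + cardNumber.substring(Math.max(0, cardNumber.length() - 4));
    }

    private void charge(String cardNumber) {
        // Placeholder for a real payment call.
    }
}
```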
Enhancing Agile Product Development With AI and LLMs
Aggregated on: 2024-08-07 14:07:52

During my 10+ years of experience in Agile product development, I have seen the difficulties of meeting the rapid requirements of the digital market. Manual procedures can slow down highly flexible software engineering and delivery teams, resulting in missed chances and postponed launches. With AI and Large Language Models (LLMs) becoming more prevalent, we are on the verge of a major change. Gartner points out a 25% increase in project success rates for those using predictive analytics (Gartner, 2021). These technologies are changing the way agile product development is optimized - by automating tasks, improving decision-making, and forecasting future trends. As stated in a report from McKinsey, companies using AI experience a 20% decrease in project costs (McKinsey & Company, 2023).

View more...

How To Create and Run A Job In Jenkins Using Jenkins Freestyle Project
Aggregated on: 2024-08-07 13:52:52

As per the official Jenkins wiki, a Jenkins freestyle project is a typical build job or task. This may be as simple as building or packaging an application, running tests, building or sending a report, or even merely running a few commands. Collating data for tests can also be done by Jenkins. For instance, a real-world scenario could involve Jenkins submitting reports to a log management system at any specified stage, which may include details about artifacts, or shipping application logs. In this Jenkins tutorial, we will dive deeper into how to create a job in Jenkins and, eventually, a Jenkins freestyle project. Let’s find out more about the Jenkins Build Job before we begin creating a Freestyle Project.

View more...

extended Berkeley Packet Filter (eBPF) for Cloud Computing
Aggregated on: 2024-08-07 13:07:52

eBPF, or extended Berkeley Packet Filter, is a revolutionary technology with origins in the Linux kernel that can run sandboxed programs in a privileged context such as the operating system kernel. eBPF is increasingly being integrated into Kubernetes for various purposes, including network observability, security, and performance monitoring.

View more...

JMeter Plugin HTTP Simple Table Server (STS) In-Depth
Aggregated on: 2024-08-06 23:07:52

The Need for the Creation of the STS Plugin From a Web Application in Tomcat
The idea of having a server to manage the dataset was born during the performance tests of the income tax declaration application of the French Ministry of Public Finance in 2012. The dataset consisted of millions of lines to simulate tens of thousands of people filling out their income tax return form per hour, and there were a dozen injectors to distribute the injection load of a performance test run. The dataset was consumed: that is to say, once the line with a person's information was read, we could no longer take that person's information again. The management of the dataset in a centralized way had been implemented with a Java web application (WAR) running in Tomcat, with the injectors requesting rows of the dataset from the web application.

View more...

Why You Should Use Buildpacks Over Docker
Aggregated on: 2024-08-06 22:07:52

Docker is the obvious choice for building containers, but there is a catch: writing optimized and secure Dockerfiles and managing a library of them at scale can be a real challenge. In this article, I will explain why you may want to use Cloud Native Buildpacks instead of Docker.

Common Issue Using Docker
When a company begins using Docker, it typically starts with a simple Dockerfile. However, as more projects require Dockerfiles, the following problematic situation often comes up:

View more...

Building an LLM-Powered Product To Learn the AI Stack: Part 1
Aggregated on: 2024-08-06 21:07:52

Forget what you think you know about AI. It's not just for tech giants and universities with deep pockets and armies of engineers and grad students. The power to build useful intelligent systems is within your reach. Thanks to incredible advancements in Large Language Models (LLMs) – like the ones powering Gemini and ChatGPT – you can create AI-driven products that used to require a team of engineers. In this series, we'll demystify the process of building LLM-powered applications, starting with a delicious use case: creating a personalized AI meal planner.

Our Use Case
As an example use case for our journey, we're going to be building a meal-planning app. There’s no shortage of meal plans available online, including those customized for different needs (varying goals, underlying health conditions, etc.). The problem is that it’s often difficult (sometimes impossible) to find guidance tailored specifically for you without hiring a health professional.
View more...

Handling Schema Versioning and Updates in Event Streaming Platforms Without Schema Registries
Aggregated on: 2024-08-06 20:07:52

In real life, change is constant. As businesses evolve, the technology systems supporting them must also evolve. In many event-driven systems today, event streaming platforms like Kafka, Kinesis, and Event Hubs are crucial components facilitating communication between different technology systems and services. As these systems and services change, the schema of event streaming platform messages needs to be updated. The most common way to address this problem is by using schema registries like Confluent Schema Registry, AWS Glue Schema Registry, and Azure Schema Registry. However, in this article, I am going to discuss a simple solution that does not use any of these schema registries. Although I will use Kafka as an example in this article, this strategy can be applied to any other event streaming platform or messaging queue.

View more...

Not All MFA Is Equal: Lessons From MFA Bypass Attacks
Aggregated on: 2024-08-06 19:07:52

One-time passwords are one of the most relied-on forms of multi-factor authentication (MFA). They’re also failing miserably at keeping simple attacks at bay. Any shared secret a user can unknowingly hand over is a target for cybercriminals, even short-lived TOTPs. Consider this: What if the multi-factor authentication your users rely on couldn’t save your organization from a large-scale account takeover? That’s what happened to an organization using SMS one-time passwords to secure customer accounts. We’ll call the affected organization “Example Company,” or EC for short.

View more...

How To Solve OutOfMemoryError: Java Heap Space
Aggregated on: 2024-08-06 18:07:52

There are 9 types of java.lang.OutOfMemoryError, each signaling a unique memory-related issue within Java applications. Among these, java.lang.OutOfMemoryError: Java heap space stands out as one of the most prevalent and challenging errors developers encounter. In this post, we’ll delve into the root causes behind this error, explore potential solutions, and discuss effective diagnostic methods to troubleshoot this problem. Let’s equip ourselves with the knowledge and tools to conquer this common adversary.

JVM Memory Regions
To better understand OutOfMemoryError, we first need to understand different JVM memory regions (see this video clip that gives a good introduction to different JVM memory regions). But in a nutshell, JVM has the following memory regions:

View more...
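As a minimal, hedged illustration of the error this article tackles (not code from the article itself), the following snippet keeps allocating until the heap is exhausted; running it with a small heap such as java -Xmx64m, optionally with -XX:+HeapDumpOnOutOfMemoryError, reproduces the failure and captures a dump for later analysis.

```java
import java.util.ArrayList;
import java.util.List;

public class HeapSpaceDemo {
    public static void main(String[] args) {
        List<long[]> retained = new ArrayList<>();
        int chunks = 0;
        try {
            while (true) {
                retained.add(new long[1_000_000]); // roughly 8 MB kept reachable per iteration
                chunks++;
            }
        } catch (OutOfMemoryError e) {
            // Typically reported as "java.lang.OutOfMemoryError: Java heap space"
            System.err.println("Heap exhausted after " + chunks + " chunks: " + e.getMessage());
            throw e;
        }
    }
}
```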
What Is "Progressive Disclosure" and How Does It Impact Developer Portals?
Aggregated on: 2024-08-06 17:07:52

Progressive disclosure is a UX design pattern that reduces cognitive load by gradually revealing more complex information or features as the user progresses through the UI of a digital product (such as a portal).

Why Should Platform Engineers Care?
I've encountered the term a few times, but its value really hit home in July during two panels I participated in. First, in the LeadDev panel "How to implement platform engineering at scale," Smruti Patel mentioned this. Then, in a panel that I hosted with Abby Bangser, "When Terraform Met Backstage," our guest, Seve Kim, also mentioned progressive disclosure.

View more...

Harnessing DevOps Potential: Why Backup Is a Missing Piece
Aggregated on: 2024-08-06 16:07:52

We often hear about the importance of developers and the role they play in the success of a business. After all, they are the craftsmen who create the software and apps that make businesses run smoothly. However, there is one key element of development that is still overlooked – backup. Why? DevOps is constantly focused on delivering the best user experience and making sure the apps they build are bug-free. Yet what if something goes wrong one day or another? Let’s move on step-by-step.

View more...

Oracle: Migrate PDB to Another Database
Aggregated on: 2024-08-06 15:52:52

If you want to migrate or relocate a PDB from one database to another, there are multiple options available in Oracle. Here, we will discuss a few of them. The source and target can be a standalone database, RAC, cloud, or autonomous database. After verifying the PDB is on the target, open it for customer access and remove it from the source database based on company policy.

Prerequisites
- The target database version should be the same as or higher than the source database.
- The source database should be accessible from the target.
- The degree of parallelism should be calculated properly.
- Know a DBA-privileged username/password on the source to create the DB link.
- The encryption key is different from the user password. You must have access to the encryption key - it may be at the database, tablespace, or table level.
- The user in the remote database that the database link connects to must have the CREATE PLUGGABLE DATABASE privilege.
- The character set on source and target should be compatible.

Known Issues
- The tablespace may be a bigfile tablespace.
- The tablespace may be encrypted.
- Using a database link, the target database should be able to access the source database. Create an Access Control List (ACL) or whitelist the IP address and port if required.
- To access the DB link, either enter the source database information in tnsnames.ora or give a full connection string.
- Stable network connectivity is needed between source and target.
- RMAN jobs may interfere with refreshable cloning.
- The port from source to target should be opened to copy the files or to access the DB link.
- The remote CDB should use local undo mode; otherwise, the remote PDB may be opened in read-only mode.
- Copy/cloning/synchronization time between source and target may vary with network traffic and speed.

A few of the approaches are as follows:

View more...

Quick Scrum Gains
Aggregated on: 2024-08-06 15:07:52

TL;DR: Quick Scrum Gains
Suppose you are a Scrum Master or Agile Coach. Have you recently been asked to explain your contribution to the organization’s value creation? In other words, does management want to know whether you are pulling your weight or if your salary is an expendable expenditure? This article points to ten quick Scrum gains you can pull off without asking for permission or budget to prove your contribution to your organization’s survival in these challenging times.

Ten Quick Scrum Gains You Can Start Tomorrow
A few years ago, when money was cheap, valuations high, and profits more than decent, no one questioned the necessity of a Scrum Master or Agile Coach.

View more...

How To Become a Software Engineer Without a CS Degree: Essential Strategies for Success
Aggregated on: 2024-08-06 14:52:52

Here is how I became a software engineer without a computer science degree. Let me be real with you: coding was hard. I wasted so much time fixing missing semicolons, mismatched brackets, and misspelled variables. Even when the code compiled, it would not work as expected, and I would spend hours staring at the screen and questioning my life choices.
But over time, I picked up some strategies that made coding click for me, and I'm going to share these strategies with you today.

Don’t Try To Know Everything
The first thing I learned was that as a programmer, you don't need to know everything. When I began my first programming job, I was unfamiliar with Linux commands. When I joined Amazon, I did not fully understand G. At Amazon, my first project was in Python, and I had never written a single line of code in Python. Later, when I joined Google, I could not program in C++, but most of my work was in C++. The point I'm trying to make is that you don't need to know everything; you just need to know where to find it when you need it. But when I was a beginner, I would try to do these 30-40 hour boot camps to learn a programming language, thinking that I was going to learn everything. In reality, you cannot learn everything there is to learn. So, do not wait until you have the right skills to start your project; your project will teach you the skills. Do not wait until you have the confidence to do what you want; the confidence will come when you start doing it.

View more...

Develop With OCI Real-Time Speech Transcription and Oracle Database NL2SQL/Select AI To Speak With Your Data
Aggregated on: 2024-08-06 14:07:52

Speak in your natural language, ask questions about your data, and have the answers returned to you in your natural language as well: that's the objective, and what I'll show in this quick blog and, as always, provide the full source repos for as well. I'll leave the use cases up to you from there. You can learn more about these Oracle Database features here for the free cloud version and here for the free container/image version. Also, you can check out the Develop with Oracle AI and Database Services: Gen, Vision, Speech, Language, and OML workshop, which explains how to create this application and numerous other examples, as well as the GitHub repos that contain all the source code.

View more...

Idempotency in Data Pipelines: Overview
Aggregated on: 2024-08-06 13:52:52

Idempotency is an important concept in data engineering, particularly when working with distributed systems or databases. In simple terms, an operation is said to be idempotent if running it multiple times has the same effect as running it once. This can be incredibly useful when dealing with unpredictable network conditions, errors, or other types of unexpected behavior, as it ensures that even if something goes wrong, the system can be brought back to a consistent state by simply running the operation again. In this blog post, we will take a look at some examples of how idempotency can be achieved in data engineering using Python.

View more...

Creating a Command Line Tool With JBang and PicoCLI To Generate Release Notes
Aggregated on: 2024-08-06 13:07:52

Lately, I have been playing with JBang and PicoCLI, and I am pretty amazed at what we can do with these tools. I needed to create a script that would go to a specified repository on GitHub, check a commit range, and verify whether any tickets were associated with those commits. Additionally, I wanted to check whether each ticket was accepted and whether the commit was approved. The idea was to integrate this script into the CI/CD pipeline. While the traditional approach might involve using bash scripts or Python, as a Java developer, I feel more at home doing this in Java. This is where JBang comes into the picture. And since I want this to be a command-line tool, PicoCLI comes in handy.

View more...
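For readers who have not tried the combination, here is a rough, hedged sketch of what such a JBang script with PicoCLI options could look like; the option names, picocli version, and placeholder logic are illustrative assumptions, not the author's actual implementation.

```java
///usr/bin/env jbang "$0" "$@" ; exit $?
//DEPS info.picocli:picocli:4.7.5

import picocli.CommandLine;
import picocli.CommandLine.Command;
import picocli.CommandLine.Option;

import java.util.concurrent.Callable;

@Command(name = "release-notes", mixinStandardHelpOptions = true,
         description = "Summarizes commits between two refs of a GitHub repository.")
public class ReleaseNotes implements Callable<Integer> {

    @Option(names = {"-r", "--repo"}, required = true, description = "GitHub repo, e.g. owner/name")
    String repo;

    @Option(names = {"--from"}, required = true, description = "Start ref (tag or SHA)")
    String from;

    @Option(names = {"--to"}, defaultValue = "HEAD", description = "End ref (defaults to HEAD)")
    String to;

    @Override
    public Integer call() {
        // Placeholder: a real version would call the GitHub API for the commit range
        // and cross-check each commit against its ticket status.
        System.out.printf("Generating release notes for %s (%s..%s)%n", repo, from, to);
        return 0;
    }

    public static void main(String[] args) {
        System.exit(new CommandLine(new ReleaseNotes()).execute(args));
    }
}
```

Saved as ReleaseNotes.java, a script like this could be run directly with jbang ReleaseNotes.java --repo owner/name --from v1.0.0, which is what makes the pairing convenient for CI/CD use.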
Buh-Bye, Webpack and Node.js; Hello, Rails and Import Maps
Aggregated on: 2024-08-05 23:07:52

I enjoy spending time learning new technologies. However, often the biggest drawback of working with new technologies is the inevitable pain points that come with early adoption. I saw this quite a bit when I was getting up to speed with Web3 in “Moving From Full-Stack Developer To Web3 Pioneer.” As software engineers, we’re accustomed to accepting these early-adopter challenges when giving new tech a test drive. What works best for me is to keep a running list of notes and commands I’ve executed, since seemingly illogical steps don’t remain in my memory.

View more...

Go Serverless: Unleash Next-Gen Computing
Aggregated on: 2024-08-05 22:07:52

In the digital revolution, where bytes fly faster than thoughts, one concept is bringing a paradigm shift in the tech cosmos: serverless computing. The thought of dealing with servers often makes us freak out. Server maintenance, scalability issues, and huge infrastructure costs can all be part of our nightmares. This is where serverless computing can be a game-changer. It takes away much of that modern-day infrastructure trouble so we can just focus on coding. “Serverless” doesn't literally mean the servers completely vanish. Instead, they are hidden behind the curtains until summoned. Think of it like a magic genie that is always at your beck and call to grant your computing wishes without the hassles of hardware management.

View more...

Scaling Prometheus With Thanos
Aggregated on: 2024-08-05 21:07:52

Observability is a crucial pillar of any application, and monitoring is an essential component of it. Having a well-suited, robust monitoring system is crucial. It can help you detect issues in your application and provide insights once it is deployed. It aids in performance, resource management, and observability. Most importantly, it can help you save costs by identifying issues in your infrastructure. One of the most popular tools in monitoring is Prometheus. It sets a de facto standard with its straightforward and powerful query language PromQL, but it has limitations that make it unsuitable for long-term monitoring. Querying historical metrics in Prometheus is challenging because it is not designed for this purpose. Obtaining a global metrics view in Prometheus can be complex. While Prometheus can scale horizontally with ease on a small scale, it faces challenges when dealing with hundreds of clusters. In such scenarios, Prometheus requires significant disk space to store metrics, typically retaining data for around 15 days. For instance, generating 1TB of metrics per week can lead to increased costs when scaling horizontally, especially with the Horizontal Pod Autoscaler (HPA). Additionally, querying data beyond 15 days without downsampling further escalates these costs.

View more...

Reimagining AI: Ensuring Trust, Security, and Ethical Use
Aggregated on: 2024-08-05 20:07:52

The birth of AI dates back to the 1950s when Alan Turing asked, "Can machines think?" Since then, 73 years have passed, and technological advancements have led to the development of unfathomably intelligent systems that can recreate everything from images and voices to emotions (deep fake). These innovations have greatly benefited professionals in countless fields, be they data engineers, healthcare professionals, or finance personnel.
However, this increased convergence of AI within our daily operations has also posed certain challenges and risks, and the assurance of reliable AI systems has become a growing concern nowadays.

View more...

Building an IoT-based Waste Management System: A Software Architect's Guide
Aggregated on: 2024-08-05 19:07:51

The Internet of Things is a network of physical devices. These devices can be anything, like smart bins or home appliances. They have sensors that collect information. They also have software that processes this information. These devices are connected to the internet. This allows them to share the data they collect. For example, a smart bin can tell how full it is and send this information to a cloud platform. We can use IoT to manage waste better. Sensors can gather data about waste levels. This helps in organizing waste collection more efficiently.

View more...

How To Setup OAuth JWT in the Salesforce Connector
Aggregated on: 2024-08-05 18:07:51

In this post, we'll explain all the steps required to connect a Mule application to Salesforce using the Salesforce connector with the OAuth JWT flow. You can also create your own certificate for the OAuth JWT flow with Salesforce or with OpenSSL (signed by a CA or self-signed). Both options are very well explained in the video at the conclusion of the article from Stefano Bernardini, MuleSoft Ambassador. In this post, we’ll be using a self-signed certificate created by Salesforce but, keep in mind, that for production environments, a certificate issued by a Trusted Certificate Authority is always recommended.

View more...

Streaming Data Joins: A Deep Dive Into Real-Time Data Enrichment
Aggregated on: 2024-08-05 17:07:51

Introduction to Data Joins
In the world of data, a "join" is like merging information from different sources into a unified result. To do this, it needs a condition – typically a shared column – to link the sources together. Think of it as finding common ground between different datasets. In SQL, these sources are referred to as "tables," and the result of using a JOIN clause is a new table. Fundamentally, traditional (batch) SQL joins operate on static datasets, where you have prior knowledge of the number of rows and the content within the source tables before executing the join. These join operations are typically simple to implement and computationally efficient. However, the dynamic and unbounded nature of streaming data presents unique challenges for performing joins in near-real-time scenarios.

View more...

Building a To-Do List With MongoDB and Golang
Aggregated on: 2024-08-05 16:07:51

Hi there! Many people have wondered how a simple task list, or an application that provides similar functionality, actually works. In this article, I invite you to consider how you can write your own small service in Go in a couple of hours and put everything in a database. Let's start our journey with Golang and MongoDB.

View more...

Free Tier API With Apache APISIX
Aggregated on: 2024-08-05 15:52:52

Lots of service providers offer a free tier of their service. The idea is to let you kick their service's tires freely. If you need to go above the free tier at any point, you'll likely stay on the service and pay. In this day and age, most services are online and accessible via an API. Today, we will implement a free tier with Apache APISIX.

A Naive Approach
I implemented a free tier in my post, "Evolving Your RESTful APIs: A Step-by-Step Approach," albeit in a very naive way. I copy-pasted the limit-count plugin and added my required logic.

View more...
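APISIX's limit-count plugin is configured in the gateway rather than written in application code, but as a hedged sketch of the underlying idea, here is a minimal fixed-window counter in Java that grants each consumer a fixed quota of calls per window; the class and quota values are illustrative, not the article's implementation.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicInteger;

// Fixed-window counting: each consumer gets `limit` calls per `windowMillis`,
// which is roughly the behavior a limit-count style plugin enforces at the gateway.
public class FreeTierLimiter {

    private static final class Window {
        final long startMillis;
        final AtomicInteger count = new AtomicInteger();
        Window(long startMillis) { this.startMillis = startMillis; }
    }

    private final int limit;
    private final long windowMillis;
    private final Map<String, Window> windows = new ConcurrentHashMap<>();

    public FreeTierLimiter(int limit, long windowMillis) {
        this.limit = limit;
        this.windowMillis = windowMillis;
    }

    /** Returns true if the consumer is still within its free-tier quota. */
    public boolean allow(String consumerKey) {
        long now = System.currentTimeMillis();
        Window current = windows.compute(consumerKey, (key, existing) ->
                existing == null || now - existing.startMillis >= windowMillis
                        ? new Window(now)
                        : existing);
        return current.count.incrementAndGet() <= limit;
    }

    public static void main(String[] args) {
        FreeTierLimiter limiter = new FreeTierLimiter(3, 60_000); // 3 calls per minute
        for (int i = 1; i <= 5; i++) {
            System.out.println("call " + i + " allowed? " + limiter.allow("consumer-42"));
        }
    }
}
```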
Finding Your Voice: Navigating Tech as a Solo Female Engineer on Your Team
Aggregated on: 2024-08-05 15:07:51

For most of my career, I have been the only female engineer on my team. You may wonder, what’s so significant about that? As I navigated the tech industry as the only female engineer on my team, I often felt isolated and lonely. The lack of community and a sense of belonging led to a growing imposter syndrome, and unfortunately, many women in tech resonate with this feeling. Throughout my 5+ years of experience as a software engineer, I have realized the importance of having a strategy and a supportive network to navigate this landscape. Here are some of my tips to tackle this head-on:

View more...

Harnessing the Power of AWS Aurora for Scalable and Reliable Databases
Aggregated on: 2024-08-05 14:52:52

In the era of digital transformation, businesses require database solutions that provide scalability and reliability. AWS Aurora, a relational database that supports MySQL and PostgreSQL, has become a popular choice for companies looking for high performance, durability, and cost efficiency. This article delves into the benefits of AWS Aurora and presents a real-life example of how it is used in an online social media platform.

Comparison of AWS Aurora: Benefits vs. Challenges

Key Benefits | Description | Challenges | Description
High Performance and Scalability | Aurora's design segregates storage and computing functions, delivering throughput that is five times greater than MySQL and twice that of PostgreSQL. It guarantees consistent performance even during peak traffic periods by utilizing auto-scaling capabilities.

View more...

Overview of Classical Time Series Analysis: Techniques, Applications, and Models
Aggregated on: 2024-08-05 14:07:51

Time series data represents a sequence of data points collected over time. Unlike other data types, time series data has a temporal aspect, where the order and timing of the data points matter. This makes time series analysis unique and requires specialized techniques and models to understand and predict future patterns or trends.

Applications of Time Series Modeling
Time series modeling has a wide range of applications across various fields, including:

View more...

Upgrading Spark Pipelines Code: A Comprehensive Guide
Aggregated on: 2024-08-05 13:52:51

In today's data-driven world, keeping your data processing pipelines up-to-date is crucial for maintaining efficiency and leveraging new features. Upgrading Spark versions can be a daunting task, but with the right tools and strategies, it can be streamlined and automated. Upgrading Spark pipelines is essential for leveraging the latest features and improvements. This upgrade process not only ensures compatibility with newer versions but also aligns with the principles of modern data architectures like the Open Data Lakehouse (Apache Iceberg). In this guide, we will discuss the strategic importance of Spark code upgrades and introduce a powerful toolkit designed to streamline this process.

View more...

Automation Resilience: The Hidden Lesson of the CrowdStrike Debacle
Aggregated on: 2024-08-05 13:07:51

The recent CrowdStrike debacle was a wake-up call of epic proportions. A simple null pointer error in a routine software update brought airlines, media companies, first responder networks, and many other enterprises to their knees. There is plenty of blame to spread around. The hapless developer who coded the bug, of course.
But also the quality assurance team at CrowdStrike, CrowdStrike itself and its CEO, and Microsoft, whose systems were only too happy to roll over and blue screen.

View more...

Optimizing Software Quality: Unit Testing and Automation
Aggregated on: 2024-08-04 13:07:51

Unit testing is the first line of defense against bugs. This level of protection is essential as it lays the foundation for the following testing processes: integration tests, acceptance testing, and finally manual testing, including exploratory testing. In this article, I will shed some light on what differentiates unit testing from other methods and will bring examples of when we can or cannot do without unit testing. We'll also touch upon automation testing, which plays an important role in ensuring code reliability and quality.

View more...
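As a small, hedged illustration of that first line of defense (assuming JUnit 5 and an invented DiscountCalculator class, not anything from the article), a unit test pins one behavior down in isolation so a regression surfaces immediately:

```java
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertThrows;

import org.junit.jupiter.api.Test;

class DiscountCalculatorTest {

    // Hypothetical production class under test, inlined to keep the example self-contained.
    static class DiscountCalculator {
        double discounted(double price, double rate) {
            if (price < 0 || rate < 0 || rate > 1) {
                throw new IllegalArgumentException("invalid price or rate");
            }
            return price * (1 - rate);
        }
    }

    @Test
    void appliesTenPercentDiscount() {
        assertEquals(90.0, new DiscountCalculator().discounted(100.0, 0.10), 0.0001);
    }

    @Test
    void rejectsNegativePrice() {
        assertThrows(IllegalArgumentException.class,
                () -> new DiscountCalculator().discounted(-1.0, 0.10));
    }
}
```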
Compound Interest in Software
Aggregated on: 2024-08-03 13:22:50

Many articles and posts online show how regular, small, incremental changes result in much larger gains over time. This article applies compound interest calculations to show that a 1% daily increase provides a 37.8-times improvement/growth/whatever over the year — the power of compound interest! Frustrating at first, but the gains grew each day — in the end very pleasing. Software engineers make decisions daily about the design and implementation of the solution being worked on: functionality, code structure, data validation, flow, error handling, security, performance, reliability, external dependencies, configurability, and more. Expected functionality and time-to-market often prioritize what is included and what is deferred/ignored: certain failures/errors are accepted, edge cases require manual implementation, etc. It's feature/implementation/tech debt that likely needs future remediation.

View more...

Industrial IoT: Exploring Microcontrollers for Robust Applications
Aggregated on: 2024-08-02 23:07:50

When we think of the Internet of Things (IoT), our imaginations are often limited to consumer applications: smart homes, smart transportation, smart offices, etc. However, these account for just the tip of the iceberg when it comes to IoT. Just beneath the surface lies a sprawling world of industrial IoT and industrial microcontrollers (MCUs). But just as an industrial solvent requires a special formula, distinct from your ordinary kitchen hand soap, the technical components of industrial IoT devices are starkly different from those that make your smart home HVAC system run smoothly. Nowhere is this distinction sharper than when it comes to MCUs.

View more...

Leveraging Open-Source Contributions To Boost Your Freelancing Profile
Aggregated on: 2024-08-02 20:52:50

Building your reputation as a freelance developer means diligently improving your skills and reputation through real-world efforts. One of the best ways to do that is to contribute to open-source projects. How can you maximize that strategy for the best results?

Align Projects With Skills and Professional Goals
Developer communities and the internet at large are full of open-source projects that need developers’ expertise. Narrow the abundant possibilities by focusing on those that match your skills and career goals. Then, you will stay more motivated because the work gets you closer to long-term aspirations while increasing your marketability.

View more...

Criticality in Data Stream Processing and a Few Effective Approaches
Aggregated on: 2024-08-02 18:22:50

In the current fast-paced digital age, many data sources generate an unending flow of information: a never-ending torrent of facts and figures that, while perplexing when examined separately, provide profound insights when examined together. Stream processing can be useful in this situation. It fills the void between real-time data collection and actionable insights. It's a data processing practice that handles continuous data streams from an array of sources.

About Stream Processing
In contrast to traditional batch data processing techniques, stream processing works on data as it is produced, in real time. In simple words, it processes data to extract actionable insights while the data is still in motion, before it comes to rest in a repository. Data stream processing is a continuous method of ingesting, processing, and, eventually, analyzing data as it is generated from various sources.

View more...

Software Testing Errors To Look Out For (With Examples)
Aggregated on: 2024-08-02 16:22:50

Software will never be bug-free. However, it’s important to minimize the number of bugs so that the impact on the functionality and user experience of an application is minimized. Bugs can come up for different reasons. In this article, we will discuss them from the perspective of software errors. These are the errors that also need attention during the testing phase. We can divide the errors during software development into two sections for easy understanding:

View more...

The C̶a̶k̶e̶ User Location Is a Lie!!!
Aggregated on: 2024-08-02 13:22:49

I recently sat in on a discussion about programming based on user location. Folks that are way smarter than me covered technical limitations, legal concerns, and privacy rights. It was nuanced, to say the least. So, I thought I’d share some details.

View more...

The One-Pixel Threat: How Minuscule Changes Can Fool Deep Learning Systems
Aggregated on: 2024-08-01 23:07:49

AI vulnerabilities: From medical diagnostics to autonomous vehicles, discover how changing a single pixel can compromise advanced deep learning models, and explore the critical challenges of securing our AI-powered future.

Introduction
Deep learning (DL) is a fundamental component of Artificial Intelligence (AI). It aims to enable machines to perform tasks that require decision-making mechanisms that tend to approximate those of human reasoning. DL models are at the heart of many advanced applications, such as medical diagnostics and autonomous vehicle driving.

View more...

My Journey to Master SQL Data Analysis
Aggregated on: 2024-08-01 21:07:49

Dealing with sales data can feel like quite a task, especially when you're trying to extract insights for important business decisions. In this article, we'll delve into the experience of a data analyst tackling these challenges and explore why mastering SQL is key to achieving success. By the end, you'll grasp the role that data analysts play and how they can transform data into actionable information.

Understanding the Role of a Data Analyst
Data analysts often go unnoticed in organizations, yet they play a crucial role. Their job involves sorting through volumes of data to identify patterns, trends, and insights that drive decision-making processes. Whether it's analyzing sales figures, customer feedback, or market research findings, data analysts are responsible for making sense of this information. However, this isn't a simple task.
This demands a specific skill set that includes a solid grasp of SQL (Structured Query Language).

View more...

3 Reasons Why VPs of Engineering Are Choosing Low-Code
Aggregated on: 2024-08-01 19:22:49

I often hear the phrases "Low code is not flexible" and "Scaling is the major issue with low code" when talking to developers who have used it to build complex use cases like food ordering apps, e-commerce apps, and similar tools. For any type of application building, it's important to understand who you are developing it for, what the ROI is, and what users are going to achieve with it. Low code is best used when you want to build apps for internal use, rich data dashboards, CMS apps, or any app for operational needs within your organization. Developers, the primary creators, benefit hugely from pre-built UI components, ready databases, and API integrations. Important stat: 70% of new applications created by enterprises are expected to utilize low-code or no-code technologies, according to Gartner findings.

View more...

Navigating BNPL Integration: Key Steps and Best Practices for Developers
Aggregated on: 2024-08-01 17:22:49

Buy Now, Pay Later (BNPL) solutions are altering the way in which clients connect to payment systems by offering financial flexibility and enhancing the checkout experience. This guide offers developers and architects comprehensive knowledge on integrating your checkout process with BNPL, as well as technical details, best practices, and common challenges.

Importance of BNPL for Developers
Enhanced Conversion Rates
BNPL solutions reduce cart abandonment by providing flexible payment options, making it easier for users to complete their transactions. Users will not easily abandon their carts due to high upfront costs when payments can be broken down into small, manageable amounts.

View more...
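To make the "small, manageable amounts" idea concrete, here is a hedged, illustrative Java helper (not tied to any particular BNPL provider's API) that splits an order total into equal installments while keeping the parts summing exactly to the total:

```java
import java.math.BigDecimal;
import java.math.RoundingMode;
import java.util.ArrayList;
import java.util.List;

// Hypothetical helper: split a total into equal installments, putting any leftover
// cents on the first payment so the schedule sums exactly to the original amount.
public class InstallmentSplitter {

    public static List<BigDecimal> split(BigDecimal total, int parts) {
        BigDecimal base = total.divide(BigDecimal.valueOf(parts), 2, RoundingMode.DOWN);
        BigDecimal remainder = total.subtract(base.multiply(BigDecimal.valueOf(parts)));
        List<BigDecimal> schedule = new ArrayList<>();
        for (int i = 0; i < parts; i++) {
            schedule.add(i == 0 ? base.add(remainder) : base);
        }
        return schedule;
    }

    public static void main(String[] args) {
        // Four installments of a 99.99 order: [25.02, 24.99, 24.99, 24.99]
        System.out.println(split(new BigDecimal("99.99"), 4));
    }
}
```

Using BigDecimal with an explicit rounding mode avoids the floating-point drift that would otherwise let the displayed installments disagree with the amount actually charged.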