Posts

Showing posts from November, 2025

Week 6 - BALT 4363 - Replit

The plan for this week's blog was to try out Replit. I have never used Replit before, so I went to ChatGPT and asked for suggestions on what to do on the website. I was pleasantly surprised by ChatGPT's response, and I thought it gave me some fun but simple ideas to try out. Its suggestions were to make a calculator app, a quiz game, or a Mad Lib generator. I ended up choosing the Mad Lib idea, so I took it to Replit. I put in my name and email, noted that I was a student, chose the free version of the membership, and entered the prompt "I would like to make a Thanksgiving Madlib." Not even ten minutes later, it had created a working interactive app that could generate multiple stories. Okay, maybe it was not completely finished after ten minutes. I started inputting different words, and in the middle of doing this the website refreshed, and Replit replied with this explanation: "Oops, it looks like the Mad Libs form is still a bit shy! When you ...
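I don't have access to the code Replit generated, but the core of any Mad Lib app is just a story template with blanks filled in by the user's words. Here is a minimal sketch of that idea; the template, function name, and word list are all my own invention, not Replit's output:

```python
# A hypothetical Thanksgiving Mad Lib: a template string with named
# blanks, filled in with whatever words the user supplies.
TEMPLATE = (
    "On Thanksgiving, my {adjective} family gathered around the "
    "{noun} and ate {number} servings of {food}."
)

def make_mad_lib(words):
    """Fill the template's blanks with the user's words."""
    return TEMPLATE.format(**words)

story = make_mad_lib({
    "adjective": "hungry",
    "noun": "table",
    "number": "seven",
    "food": "stuffing",
})
print(story)
```

Swapping in a different dictionary of words generates a different story each time, which is essentially what the interactive app was doing with my inputs.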

Week 5 - BALT 4364 - CH 5

I have always wondered how our phones can guess the next word or phrase we are going to type. Natural Language Processing (NLP) seems to be the answer, as it is the reason computers can actually understand and work with human language. I am still impressed with the process, because it is either accurate or very close when predicting the rest of our sentences. I believe iPhones even adapt to the person using the phone and begin to recognize their owner's way of talking and lingo. Learning about text classification, sentiment analysis, and language modeling helped me see how AI can make sense of huge amounts of text data. Using NLP to sort emails or analyze social media posts is a good example of how these models work and are used in real life. I also found it helpful to learn about important steps like tokenization, padding, and embedding, which prepare language data for machine learning. It makes sense that models like ChatGPT are built using these same techniques...
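To make the tokenization and padding steps concrete, here is a hand-rolled sketch of what they do. Libraries normally handle this for you; the sentences and the vocabulary numbering below are made up for illustration:

```python
# Two toy sentences to prepare for a machine learning model.
sentences = ["I love pumpkin pie", "I love turkey"]

# Tokenization: map each unique word to an integer id (0 reserved for padding).
vocab = {}
sequences = []
for s in sentences:
    seq = []
    for word in s.lower().split():
        if word not in vocab:
            vocab[word] = len(vocab) + 1
        seq.append(vocab[word])
    sequences.append(seq)

# Padding: append 0s so every sequence has the same length,
# since models expect fixed-size inputs.
max_len = max(len(seq) for seq in sequences)
padded = [seq + [0] * (max_len - len(seq)) for seq in sequences]
print(padded)  # [[1, 2, 3, 4], [1, 2, 5, 0]]
```

An embedding layer would then turn each of these integer ids into a dense vector of numbers, which is the form the neural network actually learns from.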

Week 5 - BALT 4363 - CH 5

Chapter five introduces two more forms of data that can be simplified with Python. I believe the hands-on exercises in this chapter are among the best the book has to offer. The examples, such as calculating the mean and standard deviation of heights or analyzing real datasets like the Titanic and Iris data, make the concepts much easier to grasp. Rather than just reading definitions, seeing the code in action helps connect theory to practice. These exercises show how data scientists use simple tools like NumPy and Pandas to quickly summarize and interpret real-world data, a skill that is sought after in the job market. Equally important is the ability to transform raw numbers into visual representations, such as histograms and plots. Visuals make data more understandable, allowing patterns, trends, and outliers to stand out clearly. It is important to remember that data, just by itself, is not useful. There needs to be a purpose and a plan for what to do with...
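In the spirit of the chapter's height example, here is a small NumPy sketch of computing a mean and standard deviation; the numbers are invented, not the book's dataset:

```python
import numpy as np

# A made-up sample of heights in centimeters.
heights = np.array([160.0, 165.0, 170.0, 175.0, 180.0])

mean = heights.mean()
std = heights.std()  # population std; pass ddof=1 for the sample std

print(mean)              # 170.0
print(round(std, 2))     # 7.07
```

From here, a single call like `plt.hist(heights)` with Matplotlib would turn the same array into the kind of histogram the chapter uses to make patterns and outliers visible.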

Week 4 - BALT 4364 - CH 4

Chapter four covers a topic that I think is hard for most people to understand, myself included. That topic is deep learning, and I get caught up in trying to figure out how something that is not "alive" is able to learn. Lucky for me, this chapter was able to answer some of those questions. Computers use neural networks that mimic the structure and functions of the human brain. What separates deep learning from other types of machine learning is that it can learn from the data on its own, without us telling it what to look for. In my opinion, that is amazing and scary at the same time. So, what are some examples of deep learning that we interact with? Our phones have a couple of good ones: Siri uses deep learning for speech recognition, and face recognition relies on the same technology. The end of the chapter provides some ChatGPT prompts that I thought would help me make sense of all this, but I decided to add a prompt of my own. I asked...
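The "learning on its own" part is less mysterious when you see it at a tiny scale. Below is a toy single "neuron" that learns the OR pattern purely from examples, using gradient descent; the task, learning rate, and iteration count are my own choices, and a real deep network is just many layers of this idea:

```python
import numpy as np

# Four input examples and their targets: the OR of the two inputs.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 1.0])

rng = np.random.default_rng(0)
w = rng.normal(size=2)  # weights start random -- no rule is programmed in
b = 0.0

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(5000):
    pred = sigmoid(X @ w + b)          # forward pass: current guesses
    error = pred - y                   # how wrong each guess is
    w -= (X.T @ error) / len(y)        # nudge weights to reduce the error
    b -= error.mean()

print(np.round(sigmoid(X @ w + b)))   # [0. 1. 1. 1.]
```

Nobody told the neuron what OR means; it adjusted its weights until its outputs matched the examples, which is the core of how a network "learns" without being alive.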

Week 4 - BALT 4363 - CH 4

Chapter four of Data Toolkit: Python + Hands-On Math teaches us what allows machine learning and AI to sort through large datasets as efficiently as possible: linear algebra. The hardest class I have taken in college was a finite mathematics class, so when I saw that a whole section of this chapter was dedicated to matrices, I began to sweat. While I am mostly kidding, it took a lot of studying to understand matrices, so I am glad all those hours went toward something with practical applications. If there is one thing I will take away from these data and AI classes, it is that people who can analyze and use data to their advantage will always be in demand. On top of that, tools make this data work much easier, and the people who can program the tools are in even more demand. Learning this would be useful in practical situations, especially in my field, accounting. For example, I am in the middle of audit fieldwork working with large sets of financial data, and matrix operations...
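To show what matrix operations on financial-style data might look like, here is a small NumPy sketch; the departments, quarters, and figures are all invented for illustration, not real audit data:

```python
import numpy as np

# Rows are departments, columns are quarters (Q1-Q4), all figures made up.
expenses = np.array([
    [100.0, 120.0, 90.0, 110.0],   # dept A
    [200.0, 180.0, 210.0, 190.0],  # dept B
])

# A matrix-vector product totals every department's year in one operation.
yearly_totals = expenses @ np.ones(4)
print(yearly_totals)           # [420. 780.]

# Summing down the columns gives company-wide spend per quarter.
print(expenses.sum(axis=0))    # [300. 300. 300. 300.]
```

The same two lines work whether the matrix has two rows or two hundred thousand, which is exactly why linear algebra is the engine behind efficient data work.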

Week 3 - BALT 4364 - CH 3

Because I am still unfamiliar with coding, I had to put the code into ChatGPT to understand how it produced a linear regression model. ChatGPT broke it down into six steps that walked me through each part of the code and its purpose. The first section of code imports the libraries, in this case NumPy and Matplotlib. The next part generates synthetic data, creating the values and adding randomness to the outputs. Step three splits the data into training and testing sets; this is the part of the code that has X_train, X_test, etc. The section that includes model = LinearRegression() and model.fit creates the model using the training data. Y_pred = model.predict(X_test) uses the model to predict the y values for the testing set. The plt. section brings the model to life with labels, titles, and different colors for the dots. It was helpful to get the explanation from ChatGPT. Typing out the steps took my understanding of this code a step further...
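Since I can't paste the book's exact script here, the six steps can be sketched with NumPy alone; a manual split and `np.polyfit` stand in for scikit-learn's `train_test_split` and `LinearRegression`, and the data-generating line and noise level are my own choices:

```python
import numpy as np

# Step 1: imports (the book's version also imports matplotlib for step 6).

# Step 2: synthetic data -- a known line y = 3x + 5 plus random noise.
rng = np.random.default_rng(42)
X = rng.uniform(0, 10, size=100)
y = 3.0 * X + 5.0 + rng.normal(0, 1.0, size=100)

# Step 3: split into training and testing sets (80/20).
X_train, X_test = X[:80], X[80:]
y_train, y_test = y[:80], y[80:]

# Step 4: fit the model -- a least-squares line -- on the training data only.
slope, intercept = np.polyfit(X_train, y_train, deg=1)

# Step 5: use the fitted model to predict y for the testing set.
y_pred = slope * X_test + intercept

# Step 6: the book's plt. section would scatter X_test vs y_test and draw
# the prediction line with labels, a title, and colors (omitted here).
print(round(slope, 2), round(intercept, 2))
```

Because the data was generated from y = 3x + 5, the fitted slope and intercept come out close to 3 and 5, which is a nice sanity check that the model actually learned the underlying relationship.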

Week 3 - BALT 4363 - CH 3

Learning how to handle and clean data with Python libraries like Pandas and NumPy is a valuable skill, especially in the data-driven world we live in today. These tools make it possible to organize, analyze, and interpret large amounts of information that would be too time-consuming to handle in spreadsheets. Knowing how to clean messy data, identify errors, and automate repetitive processes not only saves time but also ensures that the information used for decision-making is accurate and reliable. It’s a skill that combines efficiency with precision, two qualities that are essential in any analytical or business environment. In the accounting field, the field I am currently in, these skills can be applied in many practical ways. For example, I could use Python to automate the process of reconciling financial data, analyzing expense trends, or reviewing large transaction datasets for irregularities. I have had to do these tasks numerous times already, and Python would have saved...
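As a sketch of the kind of cleanup I have in mind, here is a tiny Pandas example; the column names, figures, and the outlier threshold are all invented, not from a real ledger:

```python
import pandas as pd

# A made-up transaction ledger with a duplicate row, a missing amount,
# and one suspiciously large entry.
txns = pd.DataFrame({
    "account": ["5010", "5010", "5020", "5020", "5020"],
    "amount": [250.0, 250.0, 99.0, None, 480000.0],
})

# Drop exact duplicate rows (e.g. a double-posted entry).
txns = txns.drop_duplicates()

# Flag missing amounts for follow-up instead of silently dropping them.
missing = txns[txns["amount"].isna()]

# Flag unusually large amounts for review (threshold chosen arbitrarily).
outliers = txns[txns["amount"] > 100000]

print(len(txns), len(missing), len(outliers))  # 4 1 1
```

A few lines like these, pointed at a real export of thousands of transactions, would replace hours of scanning a spreadsheet by eye.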