
Report from a Building Company Internship


Abstract

This report describes the projects undertaken and the experience gained during an internship at Fletcher Building. Multiple projects revolving around designing business processes, master data management and analyzing cost savings were conducted to support the Concrete division of the company, helping it lay out a financial plan for the year 2020. Detailed research was conducted on existing technologies that could support the division's business units. Master data was cleaned and a business process was designed, which helps the company provide effective customer service. A cloud-based solution was designed to support the three business units of Concrete, supporting the company's growth by keeping it ahead in the market. Cost analysis of hardware devices was carried out and a cost-saving process was implemented, giving the company a way to achieve a 2% cost reduction across the business units. Implementing the same business process across the company's other divisions will be taken up in the future. A reflection on the knowledge gained is also presented in the report.

Index Terms - Data Analytics, Machine learning, Visualization, Concrete, Cloud Computing, Master data

I. INTRODUCTION

The report summarizes a 10-week internship conducted at Fletcher Building, New Zealand. The company has six main divisions and 36 business units under them, which means managing a large amount of customer data and providing fast, efficient services to customers on a daily basis. My internship was carried out in the Concrete division of the company, which has three business units under it: Firth, Golden Bay Cement, and Winstone Aggregates. These business units face discrepancies in master data, which lead to a chain of subsequent problems for the company in the course of business in terms of billing, delays in support, and delays in processing orders. Fletcher Building, being the largest construction company in New Zealand, relies heavily on providing the best customer service in a fast and efficient way. Every time a customer comes on board, a sales representative is assigned to the customer based on the customer's location. The sales representative is then responsible for recording the correct details from the customer in terms of billing, product and product drop-off location. This creates a problem when the sales representative leaves the company: site managers become unaware of the correct location for the product, the wrong product is loaded, the wrong customer is charged for the wrong product, and long queues form at the quarry to pick up the product. The company has no provision for real-time data to correct discrepancies in customer data or to add or modify customer details. The delay in resolving such issues costs the company a great deal in wasted time and poor customer service. Besides that, the company wishes to retire obsolete technologies in its business processes and bring in new technologies to serve customers more efficiently. Apart from that, the three business units under the Concrete division want to achieve cost savings around 470 printers spread across New Zealand and around landline/mobile billings.

The Concrete division needs a) cost savings, b) a business process to manage master data and c) new technologies that support efficient customer service. However, keeping in mind the 10-week time limit of the internship, realistic goals were laid out across the different projects:

  1. Establish criteria to identify data that is outside of expectation or norm
  2. Put in place data quality guidelines
  3. Put in place a mechanism for following up discrepancies both with the customer and within the business
  4. Designing a Cloud-Based System for efficient Product Delivery
  5. Analyze the data on usage of printers and phones and achieve at least a 2% cost reduction in energy and services usage purely by analyzing and understanding the data and chasing discrepancies.

The rest of the report reviews existing technologies related to the problem, followed by the design and implementation of the business processes built to meet these goals. Lastly, the lessons learned during the projects and the future scope are discussed.

II. LITERATURE REVIEW

A. Review Stage

The project goals laid out for my internship can be divided into three broad categories

1) Data Analysing

This is where cost savings and reductions around printers and phone billings will be discussed.

2) Master Data Management

This is where discrepancies in the data around a sales representative and customers will be discussed.

3) Cloud-based Technology

This is where quarry management will be discussed: removing obsolete technologies and bringing in new technologies for efficient customer service.

Work needs to be done in these three areas to provide a complete solution to the business units of Concrete. To meet the project goals, a detailed exploration of the technologies and business processes that could cater to these needs was carried out under the categories mentioned above.

1. Data Analysing

To understand the existing process and to achieve cost savings and reductions, contextual data needs to be collected and analyzed. For this, data mining and machine learning play a crucial role. Research was conducted on machine learning to decide which algorithm would be best suited for exploratory analysis of the data.

A) Machine Learning

Machine learning is a branch of artificial intelligence. "It is useful to characterize learning problems according to the type of data they use" [1]. The data is partitioned into training and testing sets, and the algorithm runs a set of procedures over the training data to perform a series of tasks. "High computing now holds the power to be trained by providing data set to it and the knowledge can be extracted" [2]. The knowledge extracted from the training set further enhances the trained model, which keeps developing its skills. Machine learning generally trains the model and gains explicit knowledge to perform and solve the task. Tan describes machine learning as the study of the design and development of algorithms that enable computers to evolve behaviour based on empirical data [3].

Machine Learning systems are very strong at learning empirical associations in data but are less effective when the task requires long chains of reasoning or complex planning that rely on common sense or background knowledge unknown to the computer.

Historically, the large chunks of stored data were of little use to researchers. Now, thanks to high computing power, machine learning has become an asset for analyzing gigantic data sets and reaching greater insight. Machine learning makes use of data mining algorithms that have been designed and implemented in several research areas such as statistics, machine learning, mathematics, artificial intelligence, pattern recognition, and databases, each of which uses specialized techniques from the respective application field [4].

A vast variety of tasks are handled by machine learning. Whether it is image analysis, video streaming, or processing large chunks of data, machine learning combined with high computing power generates high-quality output for the industry in question.

Since the company's database provided the last four months of statistical data, exploratory analysis can be performed on it. Exploratory data analysis is well known in statistics and the sciences as an operative approach to data analysis aimed at improving the understanding and accessibility of results [5].

Machine learning can be broken down into two categories: supervised and unsupervised learning. Supervised learning is generally used when the model can be applied to the same type of data to produce output similar to the output it was trained on. A training set is provided to the model and the algorithm is trained on the desired output. Therefore, when the system encounters a similar problem, analyzing the data becomes easy and the system learns along the way, which is what makes machine learning an intelligent system. Models that fall under supervised learning include classification, regression, and neural networks.


A classification model builds a top-down structure in which one rule is assigned at each level regarding one of the features of the data set; the data passes through each level until it reaches the exact category the user wants the system to generate. To build the model, the algorithm requires a set of available items together with their correct class labels [4]. A classification model can be generated by various algorithms such as decision trees, k-trees or statistical trees.

Decision tree inducers are algorithms that automatically construct a decision tree from a given dataset [6]. A decision tree is best suited when you have nominal (yes/no) and numeric data attributes.

Therefore, since the attributes of our data set are nominal and numeric, we used a decision tree to analyze it for cost savings.
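As an illustration only, and not the exact model or data used at the company, a minimal decision tree can be trained on billing-style records with mixed nominal and numeric attributes; the column names, values and labels below are hypothetical.

```python
# Minimal sketch of a decision tree on billing-style data (hypothetical columns/labels).
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

# Hypothetical data set: one row per phone line, nominal and numeric attributes.
df = pd.DataFrame({
    "has_broadband": [1, 0, 1, 0, 1, 0, 1, 0],                           # nominal (yes/no)
    "monthly_rental": [45.0, 30.0, 60.0, 25.0, 55.0, 20.0, 65.0, 22.0],  # numeric
    "calls_last_4_months": [0, 120, 15, 0, 300, 0, 40, 5],               # numeric
    "keep_or_cancel": ["cancel", "keep", "keep", "cancel", "keep", "cancel", "keep", "cancel"],
})

X = df.drop(columns="keep_or_cancel")
y = df["keep_or_cancel"]

# Partition into training and testing data, then fit the tree.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)

print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))
```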

Unsupervised learning applies when the data set has no desired output known to either the user or the system. The system builds a representation of the data and finds patterns that help predict future outcomes or generate new insights one might be completely unaware of.

It may seem somewhat mysterious to imagine what the machine could learn given that it receives no feedback from its environment. However, it is possible to develop a formal framework for unsupervised learning based on the notion that the machine's goal is to build representations of the input that can be used for decision making, predicting future inputs, efficiently communicating the inputs to another machine, and so on [7]. Since we knew the outcome we wished to achieve from our data set, we decided to go ahead with supervised learning and the decision tree algorithm, as it was best suited for the analysis.

B) Visualization Tools

Using machine learning to analyze the data set helps, but a proper dashboard that displays the results is essential in today's enterprise world.

Having the right visualization tools helps the business make quick decisions. The enormous data set lying in the company's database makes no sense to the business heads until a visualization tool is put in place. Data visualization helps the company find the loopholes that exist in the current business process and provides insight into meaningful data.

By using data visualization, the company can control and analyze the exact value of big data by accelerating understanding of the data, gaining deep insights and enabling executives to make quick, well-informed decisions on advantageous business opportunities [8]. Data visualization tools can be broadly classified into two main categories: streaming data and static data. Streaming data tools can be used on real-time data sets to perform analytics and generate real-time graphical results. Analytics on real-time data is highly necessary for proactive decision making [9].

A streaming visualization tool captures live data and performs the desired operations on it if needed, such as merging and splitting. The processed data can be stored over the cloud in JSON format and the desired chart can then be generated from it. The dashboard gives the business heads real-time graphs that capture data at regular intervals. High-frequency algorithmic trading, in which market and news data must be analyzed and trading decisions made, now occurs within a window of a few microseconds [10]. Real-time interactive analytics allows users to explore a large data set by issuing interactive queries [11].

Real-time analysis can be used by the company for tracking truck status for quarry management. A real-time data tool can benefit the company by tracking the status of product pick-up from the quarry and drop-off at the customer location. This could help increase efficiency in providing customer service and yield useful insight.
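As a rough sketch only of how truck-status events could be captured as JSON for a streaming dashboard to pick up, the event fields and file path below are assumptions, not the company's actual feed.

```python
# Hypothetical sketch: append truck-status events as JSON lines that a
# streaming dashboard could poll; field names are illustrative only.
import json
import time
from datetime import datetime, timezone

def record_event(path, truck_id, status, location):
    event = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "truck_id": truck_id,
        "status": status,        # e.g. "arrived_at_quarry", "loaded", "delivered"
        "location": location,
    }
    with open(path, "a") as f:
        f.write(json.dumps(event) + "\n")   # one JSON document per line

# Example: two status updates a few seconds apart.
record_event("truck_events.jsonl", "TRK-042", "arrived_at_quarry", "Hunua")
time.sleep(2)
record_event("truck_events.jsonl", "TRK-042", "loaded", "Hunua")
```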

Besides this, other tools are used for static data sets, i.e. data in Excel or CSV format. These tools visualize tabular data that is either obtained from open sources or is proprietary.

Since the data set obtained for the cost analysis is proprietary, a static tool can be used for visualization, for example Tableau, Infogram, Power BI or QlikView. Since the company owns a license for QlikView, static visualization was carried out in it to provide insight into the business.

2. Master Data Management

To understand the existing process and establish criteria for identifying discrepancies in the data, a mechanism needs to be put in place to determine where the loopholes are. Once that is done, a business process needs to be put in place for efficient management of customer data.

Project Management

Irrespective of the technology used, a business process needs to be designed and executed to identify data that is outside of expectation or norm. To execute the project successfully, a project management process needs to be selected. There are three broad categories of models that the business can apply to the project: the waterfall, agile and lean models.

Every model has its own pros and cons. Keeping that in mind, the advantages and best practices were taken from both agile and waterfall while designing the business process for master data. The adoption of waterfall and agile approaches is increasing rapidly in large-scale industrial projects [12].

On the other hand, the lean canvas was implemented in constructing the idea against business value for the Cloud-based System.

In the waterfall model, the approach follows a pipeline in which predefined steps are carried out in order: requirements, analysis, design, implementation, testing, execution, and maintenance. This makes the stages interdependent, meaning that if one stage is not completed, the project cannot move to the next phase. This can be a disadvantage of the waterfall model for smaller projects, because if requirements are not well understood or defined, or are likely to change in the course of the project [13], the waterfall model is considered unsuitable. One can often overcome this drawback by adopting the characteristics of an agile model, which allows one to revisit previous stages of project development and make changes as needed.

This is generally seen as an iterative process. The idea of revisiting phases over and over is called "incremental and iterative development"[14]

Agile works better than the waterfall model where the requirements laid out are not defined well enough to develop the right business process or product.

Due to the 10-week time constraint of the internship, and the exploratory nature of the projects, the agile model was best suited for our smaller projects. The reason is that the requirements laid out were not well defined, as some assumptions were made while defining them.

Having agility in the project helps lay out the requirements efficiently, as effective communication can take place between the user and the designer. The agile model has major advantages in overcoming the drawbacks of the waterfall model; however, the time and performance of project development become a bit more difficult to evaluate.

3. Cloud-Based Technology

AWS states "Cloud computing is the on-demand delivery of computing power, database storage, applications, and other IT resources through a cloud services platform via the Internet with pay-as-you-go pricing" [15]. Cloud Technology has gained exponentially due to the feature of providing services to the user on the devices they use such as laptops, mobile phones, tablets, etc[16]. Cloud Computing provides a highly scalable environment by providing three service models a) Infrastructure as a service (Iaas) b) platform as a service(Paas) c) software as a service(Saas) as shown in Figure 1

Figure 1 Cloud Service Models

According to AWS, "Infrastructure as a service contains the basic building blocks for cloud IT and provides access to networking features" [15]. Platform as a service gives the enterprise the flexibility of deploying its code without worrying about hardware patching and servicing, as that is taken care of by the cloud service providers.

Software as a service allows providers to host applications and make them available to users over the internet.

Cloud technology comes with strong features for auto-scaling, storage, accessibility, security, and usage.

Enterprise worries often revolve around security, failover, and accessibility, but these barriers can be overcome by studying case studies of cloud deployments across enterprises in varied sectors.

The layers of cloud computing are interdependent on each other as shown in figure 2.

Figure 2 Layers of Cloud Computing

This illustrates the concept of strong security being provided for user data at each layer. Besides that, if the enterprise wishes, it can encrypt the data before storing it in the cloud, where it will be encrypted again by the cloud provider. This provides defence in depth if the enterprise needs it.

The generalized idea behind cloud services says "Enterprise do what they are good at and let cloud providers worry about the rest."

Designing a cloud-based solution will help the company in the long run, as data from the various quarry sites operated around New Zealand can be accessed anywhere on their IoT devices. Large volumes of customer data can be stored in cloud object storage such as S3, and real-time analytics can be performed on them since the cloud supports JSON files.
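As a minimal sketch of how a customer or order record could be stored as JSON in S3 with encryption at rest, the bucket name, key and record fields below are assumptions, and AWS credentials and region must already be configured for boto3.

```python
# Hypothetical sketch: store a customer record as JSON in S3 with
# server-side encryption; bucket, key and fields are placeholders.
import json
import boto3

s3 = boto3.client("s3")

record = {"customer_id": "C-1001", "site": "Hunua", "product": "aggregate"}

s3.put_object(
    Bucket="example-concrete-master-data",   # assumed bucket name
    Key="customers/C-1001.json",
    Body=json.dumps(record).encode("utf-8"),
    ContentType="application/json",
    ServerSideEncryption="AES256",            # encrypted again at rest by the provider
)
# For defence in depth, the record could also be encrypted client-side before upload.
```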

Besides this, the company can focus on what it is good at while the cloud providers take care of hardware patching and maintenance. The company wishes to retire old legacy systems such as Citrix, which was built into the existing setup and now causes accessibility and auto-scaling issues.

Using Infrastructure as a Service will help the company in the long run and keep it in line with current technologies. Fewer company resources will be wasted maintaining the old legacy system, which is becoming obsolete over the years. In addition, cloud services remain highly accessible in the event of network failure. This is essential for the company when it comes to accessing data, as it wishes to expand into the Australian market in the future and spread its market accessibility around Asia.

A real-time solution also best suits the requirements of the customer ordering process, as the company wishes to display fast, real-time information on designated employees' IoT devices or dashboards in order to provide effective customer service to its clients.

III. PROJECT EXECUTION

In order to achieve an outcome, different strategies discussed in the literature review were implemented. Since multiple projects were executed, a different approach was followed for each of them.

1. Data Analysis

In order to perform some cost savings around printers and phone billings, a standard approach was followed for both.

a) Gather the data from the service provider for the last four months

b) Perform cleaning on the Data if needed

c) Perform supervised learning on the Data

d) Present the outcome to the business heads

e) Take the appropriate action

This approach was applied to printers and to phone billings, as discussed below.

Phone Billings

Spark is the telecommunications service provider for the three business units of Concrete. Almost NZ$45,000 is spent monthly on phone/landline billings in Firth, one of the three business units of the Concrete division. The internal restructuring of the company in 2018 left an impact on Firth's phone billing across New Zealand. The company wants to analyze phone usage and billing in order to save 2% of the expenditure.

Gathering of Data

To first understand how Spark bills its customers, I started by researching the services it provides to the company. Spark provides services such as call minder, broadband, data plans, call prompting, fax lines, and mailboxes. Besides that, Spark has provided telephone main lines with DDI extensions for Firth across New Zealand. Once I understood the services Spark provides and what they are used for, I started analyzing and collecting details of the Firth plant sites located across New Zealand. I collected four months of data to find the usage of landlines and mobiles, and reviewed the locations of the landlines and mobiles allocated across the masonry plants and quarries.

To carry out the work of the project, an initiation phase was carried out in which classification and data cleaning were done. This was needed to understand the services billed by Spark against each individual line number.

Cleaning of Data

To put in place a mechanism for data quality guidelines, I started by identifying the data categories and the descriptions provided for the services and the numbers they are used for. This helped me define the classes listed below; a simple keyword-based sketch of this classification follows the list.

1. Mobile Rentals – Rentals paid for the mobile numbers bought from spark

2. Landline Rentals – Rentals paid for the IP Telephone Main lines and their DDI extensions bought from spark

3. Landline Services Rental- Services bought on those landline phones such as broadband and data card

4. Mobile Activity – call usages such as national and international calls made from mobiles

5. Landline Activity - call usages such as national and international calls made from landlines

6. Mobile Purchases – purchase of mobile phones for the company's representatives

7. Other activity- purchases of fibers and accessories related to the phones and telephone lines

8. Other services – Services such as Call prompting, call minder mailbox, Call forwarding and Call restricting being used over landline/ mobile phones

9. Other Purchases- Miscellaneous purchases
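The mapping from billing-line descriptions to these nine classes can be expressed as a simple lookup, as in the sketch below; the descriptions, keywords and column names are illustrative assumptions, not Spark's actual billing schema.

```python
# Hypothetical sketch: classify billing lines into the categories above
# based on keywords in the service description (columns are illustrative).
import pandas as pd

CATEGORY_KEYWORDS = {
    "Mobile Rentals": ["mobile rental"],
    "Landline Rentals": ["line rental", "ddi"],
    "Landline Services Rental": ["broadband", "data card"],
    "Mobile Activity": ["mobile call"],
    "Landline Activity": ["national call", "international call"],
    "Mobile Purchases": ["handset"],
    "Other Services": ["call minder", "call prompting", "fax"],
}

def categorise(description: str) -> str:
    desc = description.lower()
    for category, keywords in CATEGORY_KEYWORDS.items():
        if any(k in desc for k in keywords):
            return category
    return "Other Purchases"

bills = pd.DataFrame({
    "line_number": ["09 555 0100", "021 555 200", "09 555 0100"],
    "description": ["Line rental - DDI", "Mobile call charges", "Fax service"],
    "amount": [55.0, 12.4, 20.0],
})
bills["category"] = bills["description"].apply(categorise)
print(bills)
```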

Supervised learning on Data

After the data cleaning, data analytics was performed using the QlikView tool to get a visual understanding of the data. The visualization showed the total expenses on mobiles and landlines across Firth.

With the visualization in place, as shown in Figure 3, it became easier to see that the company pays rental for landlines and mobiles that have had no usage for the past four months. After identifying the numbers with rental but no usage, they were clustered according to the geographical area where they are stationed. Once that was done, calls were made to the respective plant service managers to discuss whether they were aware that the services were not being used. A good insight from this process was that the company pays almost $2,000 each month for fax services, which can be reduced simply by moving from fax to email. A minimal sketch of the rental-with-no-usage check follows Figure 3.

Figure 3 Classification Analysis of Phone Billing
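A minimal sketch of the "rental but no usage" check, assuming a cleaned table using the categories above; the column names and values are illustrative, not the company's data.

```python
# Hypothetical sketch: flag numbers that were billed rental over the last
# four months but show zero call/data activity in the same period.
import pandas as pd

bills = pd.DataFrame({
    "line_number": ["09 555 0100", "09 555 0100", "021 555 200", "021 555 200"],
    "category":    ["Landline Rentals", "Landline Activity", "Mobile Rentals", "Mobile Activity"],
    "amount":      [220.0, 0.0, 180.0, 96.5],   # totals over four months
})

per_line = bills.pivot_table(index="line_number", columns="category",
                             values="amount", aggfunc="sum", fill_value=0)

rental = per_line.filter(like="Rentals").sum(axis=1)
activity = per_line.filter(like="Activity").sum(axis=1)

unused = per_line[(rental > 0) & (activity == 0)]
print("lines paying rental with no usage:")
print(unused)
```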

Presenting the Outcome to the Business Heads

Besides this, a meeting was conducted with the CFO and the Concrete IT Head to present the phone billing analysis report. This task may seem lengthy and minor to some business heads, but if it is not put in place the company will pay a high amount over the course of five years.

The transformation of services was suggested to the respective supervisors of the individual plants and queries were made as to how many customers still use fax. The number was under 25 in total, so the business heads agreed to remove the fax line services.

Apart from that, the company was able to make a cost saving of $3,313.27 by removing services such as broadband plans at locations where they are not used. Some of the lines were disconnected because those numbers had not been used in the past three months, and on some lines a block was placed to see whether those services could be removed in the future. A request was placed in the company's portal for the removal and disconnection of services, as shown in Figure 4 and Figure 5.

Figure 4 Disconnection of service

Figure 5 Disconnection of Spark Line in Company's Portal

Future Works

The company expects to be able to achieve cost reductions in energy and services usage purely by analyzing and understanding the data and chasing discrepancies. If this task leads to cost savings, the company can look at applying the same process to the other two business units, Golden Bay Cement and Winstone Aggregates.

Printers

Ricoh is the service provider of printers and their accessories to the Concrete division. Around 450 printers of different specifications are installed across the three business units of the division. The company spends around NZ$32,000 per month in total, including rental for the devices. The company wants to analyze the cost savings around the printers by making sure the right printer is used for the right volume.

Gathering of Data

Four months of printer usage by volume, for both black-and-white and colour printing, was gathered for the three business units of the Concrete division. To understand the data, research was done on the various printer models that Ricoh provides and on the specifications and functions of those models. I also researched the various sites where the printers are installed in order to understand the number of printers installed and the need for them.

To carry out the work of the project, an initiation phase was carried out in which classification and data cleaning were done. This was needed to understand printer usage by volume and the rental charged.

Cleaning of Data

To put in place a mechanism for data quality guidelines, I started by identifying the locations of the installed printers, categorizing them by model number, and separating the rental from the total charge into its own category. I also added a business unit category in order to better understand the data across the three business units.

Table 1 below shows how the same printer models are spread across the different business units and the rental charged for each model.

TABLE 1 Printer Model Rate Discrepancies (rental in NZ$ per device)

BU                  | No | MP301SPF | SP4310N | MPC3002
Firth               | 17 | 33.5     | 50      | 96.5
Golden Bay Cement   | 10 | 45.5     | 65.5    | 120
Winstone Aggregates | 9  | 56.3     | 95      | 135.6

Supervised learning on Data

After the data cleaning, data analytics was performed using the QlikView tool to get a visual understanding of the data.

While performing the data analysis, unexpected results were found for certain models.

For example, Figure 6 below shows one of the Ricoh models (MP301SPF) and the number of these printers installed across the three business units. This insight was helpful and revealed an unexpected result in the rental paid: the same model is charged a different rental in each of the three business units. Winstone Aggregates pays a rental of NZ$56.3 per device, irrespective of the user, while the others pay NZ$45.5 and NZ$33.5, as shown in the image below.

Figure 6 Qlik Reporting on printer model MP301SPF

 

Figure 7 Printer MP301SPF on Company's Portal

This large difference between the proposed rental and the rental actually charged for the same device led me to extend the analysis to the cost differences for the same models, in addition to usage. Below is another analysis, for model SP4310N. Again, the same model is charged a different rental in each of the three business units: Winstone Aggregates pays a rental of NZ$95 per device, irrespective of the user, while the others pay NZ$65.5 and NZ$50, as shown in Figure 8 below.

Figure 8 Qlik Reporting on printer model SP4310N

Besides that, the rental price offered by Ricoh for this same printer on the company's profile is NZ$9.07, as shown in Figure 9.
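A sketch of the kind of cross-check that surfaces these discrepancies, assuming a per-device table of model, business unit and rental; the values mirror Table 1 but the layout and column names are illustrative, not the reporting actually built in QlikView.

```python
# Hypothetical sketch: compare the monthly rental charged for the same printer
# model across business units to spot pricing discrepancies.
import pandas as pd

devices = pd.DataFrame({
    "business_unit":  ["Firth", "Golden Bay Cement", "Winstone Aggregates"] * 2,
    "model":          ["MP301SPF"] * 3 + ["SP4310N"] * 3,
    "monthly_rental": [33.5, 45.5, 56.3, 50.0, 65.5, 95.0],
})

rates = devices.pivot_table(index="model", columns="business_unit",
                            values="monthly_rental", aggfunc="mean")
rates["spread"] = rates.max(axis=1) - rates.min(axis=1)

# Any model with a non-zero spread is charged differently across BUs.
print(rates[rates["spread"] > 0])
```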

Presenting the outcome to the Business Heads

Before analyzing printer usage by volume, the difference in rental was presented to the business heads, as it demanded their immediate attention. Table 2 below shows the three models with the actual rental cost and the average cost paid by the business units (BUs). The cost-savings column shows the amount the company can save per month if the printer rental is charged at the right amount, a total saving of NZ$7,820. The business heads are in discussion with Ricoh to justify the price list and have decided to start a new licensing agreement for printers across the whole of Fletcher Building.

TABLE 2 Printer Model Rates (NZ$)

Model    | Actual cost | BU (Average) | Cost savings (Monthly)
MP301SPF | 20.20       | 46.7         | 3750
SP4310N  | 9.07        | 70.6         | 2570
MPC3002  | 45.05       | 57.5         | 1500

Future Works

The company expects to achieve cost reductions in rental and to replace old printers with new ones based on the usage analysis. To do so, however, it is still waiting for Ricoh's new price list for the year 2020.

2. Master Data Management

Firth manages a large number of customers on its online JDE portal. Every new customer who registers with the company is automatically assigned a sales representative to look after the project and the products they need. The company believes that the master data is not reviewed and is inconsistent in the system, which could cause problems when the software needs to be upgraded or changed.


To first understand how the system manages customer master data, I attended a couple of meetings on the current business process for handling Firth's customer data, and I was given access to Firth's company portal to understand the company's operational structure. To carry out the work of the project, an initiation phase was carried out in which two meetings were conducted in one week to understand the issue faced by Firth's Pricing Manager: once a sales representative leaves, there is no way to know what price the customer was quoted for the products. Therefore, a feasibility study was conducted to understand whether the problems faced by the different stakeholders of the project all lead to the same objectives being set up. The data was analyzed in terms of the sales representatives assigned to customers, with the result shown in Table 3.

TABLE 3 Sales Rep Status

Sales rep status | Sales rep count | Customer count
Active           | 150             | 1500
Non-active       | 90              | 1850

The table showed that 90 non-active sales reps were still assigned to 1,850 on-board customers. The loophole identified was that whenever a sales rep leaves, the sales manager does not pass the information on to the pricing team. We believed that once an employee leaves, the offboarding process should mark the sales rep's status as non-active, and the customers allocated to that rep should be sent to the area manager so that a new sales rep can be assigned. This led to the design of an in-house solution in which the system replaces a non-active sales rep with an active sales rep present in the system, based on the customer's location, and informs the area manager for approval.
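A simplified sketch of that reassignment rule, assuming customer and sales-rep tables keyed by region; the table and field names are assumptions, not the JDE or SAP schema.

```python
# Hypothetical sketch: when a sales rep is non-active, propose an active rep
# from the same region for each affected customer, pending area-manager approval.
import pandas as pd

reps = pd.DataFrame({
    "rep_id": ["R1", "R2", "R3"],
    "region": ["Auckland", "Auckland", "Waikato"],
    "status": ["non_active", "active", "active"],
})
customers = pd.DataFrame({
    "customer_id": ["C100", "C101", "C102"],
    "region": ["Auckland", "Auckland", "Waikato"],
    "rep_id": ["R1", "R2", "R1"],
})

non_active = set(reps.loc[reps["status"] == "non_active", "rep_id"])
active_by_region = (reps[reps["status"] == "active"]
                    .groupby("region")["rep_id"].first())

orphaned = customers[customers["rep_id"].isin(non_active)].copy()
orphaned["proposed_rep"] = orphaned["region"].map(active_by_region)

# These proposals would be routed to the area manager for approval.
print(orphaned[["customer_id", "rep_id", "proposed_rep"]])
```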

3. Cloud-based Technology

In parallel to handling the customer data, I was given a task to analyze the Quarry Point of Sale (QPOS) across Winstone Aggregates and the issues faced by the Operations Manager. Winstone Aggregates has almost 10 quarries around Auckland. Quarries are the land areas where products like cement, grey sand, dust, and aggregates are produced for construction work. Some of these quarries are manned, i.e. handled by an assigned quarry supervisor present on-site, while others are unmanned. The vision is to shift from the Quarry Point of Sale, where old technologies are used, to a Cloud Point of Sale (CPOS) to ease the process, both for providing high-quality customer service and for business value.

To first understand how the system operates, I was taken to two big Winstone Aggregates quarries, one in Hunua and the other in Whangarei. The site visits helped me understand from scratch how the materials and products distributed across the industry are made. Besides that, I was also able to understand the sustainability work undertaken by the company to protect the environment by managing emissions and resources. The site visits helped me clearly understand the real-time process that happens at the Quarry Point of Sale, from when the truck arrives at the quarry for the material pick-up to when it leaves the quarry. All this gave me the contextual side of the problem from both the business units' and the customers' perspectives.

Besides this, a meeting was conducted with the Area Sales Heads of the three business units, Winstone Aggregates, Firth and GBC, to review the problems, requirements, and actions. The initial meeting reviewed the problem of truck tracking across the sites, a solution to which was in high demand from the plant supervisors and the Orders/Dispatch team.

To understand the reasons for transforming the QPOS to a CPOS, the customers and users were identified: those who will get value from this process and those who would use the solution. This produced the classes of customers and users listed below.

10. Orders & Despatch/Plant Supervisors/

11. Plant Batchers –Schedule of Truck

12. Sales- Problem Resolution

13. Customers – Current location to jobs

14. Finance – Customer Administration,

15. Credits, Waiting time, Real-time customer questioning

16. Orders office- Route Mapping optimization

17. Customers – Real-Time location

18. Truck Drivers- Better visibility of jobs, time management

19. Pricing Team – Out of area awareness, holding turn around

Once the customers and users were identified, the crucial problems they were facing were outlined. The detailed problems were identified as:

1. The truck location shown on the docket – dockets not read, wrong location

2. No actual address, Landmark Only

3. Poor logbook management

4. Intermittent quality of statuses

5. Technology is mixed – not highly real-time, coverage issues

6. No real Dashboard

7. Unmanaged Information- too much data

8. The existing system is legacy

9. Upgrades and Support are manual

10. Single person support

11. No Helicopter view of Truck Plotting

To carry out the work of the project, an initiation phase was carried out in which the first meetings were conducted, in the third week of my internship, to understand the issues faced by the managers. After understanding the issues, I outlined the problems, the users, and the impact of designing a robust solution. The meetings helped me understand the different problems and their intensity, and therefore which problems needed to be addressed first and which would have a low impact if ignored for the time being. Based on the problems identified, I moved to the design phase and drew up a business plan on a lean canvas model, as shown in Figure 10. This helped me deconstruct the business heads' ideas around key assumptions and business value.

Figure 10: Lean Canvas for Fletcher

Designing the lean canvas helped me shape the ideas around business value, customer value and the solution. Once I was able to highlight the key points, I moved on to designing a Cloud-Based Point of Sale solution.

Below is the cloud-based order process that was designed. Information is fed into and maintained in the system every time a customer is set up on the portal. Whenever the customer places an order, a job is generated and the data is maintained in the system, visible to all the other sections of the business units for processing. This system puts the customer in charge of entering the correct details, including the truck registration number and the drop-off location, which solves one of the problems where the driver was responsible for entering the correct details. Besides that, phone orders and on-site pick-ups can be entered on the spot and kept in sync with the rest of the data, which avoids further discrepancies where the wrong customer is charged for the wrong product.

Once the order is fed into the system as shown in Figure 11, the date and time of the order pick-up become the triggering event, which is then reflected on the other portals through cloud-based technology. The applications loaded on the driver's phone, the truck and the quarry kiosk pick up the scheduled order for the day. A sketch of such a triggering order record follows Figure 11.

Figure 11: Order Process for Winstone aggregates
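As an illustration of the triggering record only, a cloud order entry might look like the sketch below; the field names and values are assumptions, and the actual schema would be defined during implementation.

```python
# Hypothetical sketch of the order record created when a customer places an
# order; the pickup datetime acts as the trigger that pushes the job to the
# driver app, the truck unit and the quarry kiosk.
import json
from datetime import datetime

order = {
    "order_id": "ORD-20200115-001",
    "customer_id": "C-1001",
    "product": "GAP40 aggregate",
    "quantity_tonnes": 12.5,
    "truck_registration": "ABC123",       # entered by the customer, not the driver
    "dropoff_location": "15 Example Rd, Auckland",
    "pickup_datetime": datetime(2020, 1, 15, 8, 30).isoformat(),  # triggering event
    "status": "scheduled",
}

# The JSON document would be written to the cloud store and picked up by the
# driver, truck and kiosk applications for that day's schedule.
print(json.dumps(order, indent=2))
```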

Once the truck arrives at the quarry, the Quarry process is followed as shown in Figure 12 below.


Figure 12: Quarry Process for Winstone Aggregates

The driver selects the registration number displayed on the kiosk, which removes the problem of a driver entering the wrong details. The loader, who loads the product by volume, is notified once the registration is selected, making the loader aware of what product needs to be loaded onto which truck. This takes the responsibility entirely out of the driver's hands and puts it under the system's control. Besides that, once the truck is loaded through the correct process, the loader marks the job complete; this is automatically updated in the system and the customer is notified about the status of the truck. This helps the Concrete division provide highly efficient customer service.
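A highly simplified sketch of this kiosk and loader flow, under the assumption that function names, the in-memory order store and the notification mechanism are placeholders rather than the designed system.

```python
# Hypothetical sketch of the quarry kiosk flow: the driver selects a
# registration, the loader is told what to load, and completion updates the
# order status and notifies the customer.
orders = {
    "ABC123": {"order_id": "ORD-001", "product": "GAP40 aggregate", "status": "scheduled"},
}

def notify_loader(rego: str, product: str) -> None:
    print(f"Loader: load {product} onto truck {rego}")

def notify_customer(order_id: str, message: str) -> None:
    print(f"Customer ({order_id}): {message}")

def driver_selects_registration(rego: str) -> dict:
    job = orders[rego]
    notify_loader(rego, job["product"])   # loader sees what goes on which truck
    return job

def loader_marks_complete(rego: str) -> None:
    orders[rego]["status"] = "loaded"
    notify_customer(orders[rego]["order_id"], "Your truck has been loaded")

# Example run of the flow.
driver_selects_registration("ABC123")
loader_marks_complete("ABC123")
```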

Apart from that, the data is easily accessible from anywhere by the company's quarry supervisors, which helps them provide the right estimation of tonnage by product and produce accordingly.

Designing the business process around the customer improves customer data management by making the customer responsible for creating and filling in their order and transport details, which avoids discrepancies in order management. This business process also helps estimate customer needs and eases data handling through cloud management.

The flow charts shown above in Figure 11 and Figure 12 were presented to the business heads, showcasing the benefits they will bring to the company, as listed in Table 4 below.

TABLE 4 Benefits of Cloud-based System

S.No | Benefits for the future
1    | Order volume for the day by product
2    | Customer transparency through the process
8    | Customer ownership of process + quality of data

Future Works

The business process has not been implemented yet, for the following reasons:

1. High operational expenditure

2. The transformation from QPOS to CPOS, letting go of the legacy systems and replacing them with new systems, seems hard.

On the other hand, there would be a high impact if no future plan exists for the transformation from QPOS to CPOS, as outlined below:

1. The older legacy system and its upgrades will soon become obsolete

2. It will get harder to manage the logbook efficiently over time

3. The ongoing maintenance currently done for the trucks will likely increase

4. The business process will fall behind the times compared to competitors.

IV. REFLECTION

A. Explicit & Tacit knowledge applied

During my 10 weeks of internship, the courses I had taken in my semesters laid a good foundation, giving me the confidence to take on the projects and a roadmap for how to start my research.

1) Project management

I came to understand that unless all the key members of the project and the key stakeholders are identified, one cannot move forward with the project case. Besides that, a clear understanding of what is required from both the business side and the customer side needs to be properly laid out. The lean canvas model proved very effective: while designing the cloud-based point of sale system, all the business heads, and the difficulties they faced, were identified. When one of the key stakeholders was away while a meeting was being scheduled, I made sure to reschedule it so that all business heads were present while the requirements were being laid out.

Besides that, I used to set up regular meetings with the head of the Concrete division to update her on progress, and also with the Group Technology team to understand the drawbacks of the existing technologies used at the quarries.

2) Data Mining and Machine Learning

Understanding the classes of data and how to perform data cleaning saved a lot of time and effort during my cost analysis project. With the help of the course, I was able to apply the right algorithm to generate the output. The knowledge gained in the course saved time, which helped me present my output to the business heads during the internship period. This expedited process helped the business heads take a course of action, removing unused telephone lines and disconnecting services that were no longer in use.

It also helped me understand the importance of shifting an old legacy system from on-site to cloud-based. The critical thinking skills gained from this course helped me understand how to design an architecture that would cater to both customer and business needs.

Besides that, the company has a license for the visualization tool QlikView, which turned out to be very easy to learn. Since I had worked with various data analytics tools during my university data mining course, learning QlikView was not time-consuming. This let me concentrate more on the projects, as the tools were familiar to me. Learning how to extract value from large chunks of data helped me perform the cost analysis effectively during my internship.

3) Adaptive Enterprise System

With the knowledge acquired from my other course, Adaptive Enterprise Systems, I was able to understand how data is stored in SAP and how master data is recorded. This course helped me a lot in understanding the business process and the importance of master data, and I was able to relate that contextual knowledge to practical work. Understanding how data is stored in SAP helped me understand the company's customer data storage. The company uses SAP ERP to maintain its customer database, which flows into other systems for processing.

While looking at the master data in the company's SAP ERP, I was able to identify some loopholes in the system, which showed that the data was not updated and not in sync with other systems. I questioned the company's Group Technology team and they agreed that they have multiple different records of the same customer.

This made me set up a meeting with the Head of IT to address the importance of having one version of the truth in the database. The Adaptive Enterprise Systems course really helped me understand the importance of maintaining master data.

4) Special Topic in Information System

For the task of getting rid of obsolete technology and bringing in a cloud-based point of sale, this subject helped me understand the architecture and features of cloud computing. When business heads raised queries regarding cloud-based deployment, such as security, network failure, and auto-scaling, I was able to address them with confidence. Besides that, the course prepared me for an industry-recognized Certified Cloud Practitioner certificate, which helped me understand and design a solution that could cater to the company's future expansion around the world.

B. Lessons learned from the project

The internship helped me grow as an individual. Besides the knowledge gained, I discovered my strengths and the flaws I can improve on.

1) Take time to invest in yourself

When I started my internship, I went in with a mindset of making an impact on the company by delivering results. This mindset distracted me from learning the new tools that were available in the repository. Since I had worked with Power BI at university, my focus on presenting the analysis report to the business heads as quickly as I could meant I never explored other tools.

But when my supervisor encouraged me to invest in myself by learning more tools, I found I gained confidence and was more efficient in performing the task. This changed my mindset and made me realize that learning is a lifelong process and one should always invest in it.

2) Improve as you go

After doing the cost analysis for the printers, I realized that there was no proper business process in place for requesting a printer. The specifications and costs listed for the various printers on the company's portal were outdated. This made me iterate on my steps and put in place a proper business process for renewing licensing agreements with the service providers. Therefore, when the cost analysis was done for phone billings, the first step I took was to check the licenses and specifications of the mobiles on the company's portal.

C. Project attributes developed

Professionalism is something that needs to be instilled in every employee in order to respect each other's working time. During my 10-week internship, I tried to adapt to the level of work and the environment that Fletcher Building provides.

The attributes mentioned below helped me gain the necessary skills to grow in the business world.

1. Productivity

While working in an organization, one's productivity increases if one keeps an open mind and believes learning is a lifelong process. The company provided me with a repository of tools to use and learn. I was assigned a mentor and a buddy with whom I had weekly catch-ups. They motivated me and helped me understand the working style of the company, which really helped me adjust in the first two weeks of my internship. Besides that, my industry supervisor set realistic goals for me, which I was able to achieve, and offered support and a pleasant working environment.

2. Professionalism

This is one of the attributes I developed during my internship. Fletcher Building, which has more than 2,000 employees, does not have a mandatory timesheet that employees must fill in to record their 40 hours per week. Yet every employee is on time and, if late, makes up for the lost time. This showed how professional everyone is when it comes to a strong work ethic. Besides that, in order to discuss any query or meeting, a calendar invite was sent. I learned that everyone respected each other's time.

3. Networking Skills

Fletcher runs multiple seminars and workshops for graduates and interns. There was a fortnightly session where the heads of different division units would share stories of their career growth. Besides, everyone is always happy to network and provide the right amount of support. I gained networking skills by going to such events and speaking to people one-on-one.

V. CONCLUSION

A. Project Evaluation

Since the 10-week internship was a combination of multiple projects, it is a bit difficult to evaluate the scope of success for the whole as one project. When it comes to cost analysis, NZ$3,317.27 per month is saved on phone billings. This cost saving was made in one business unit of the Concrete division, but if the same business process is applied to the other 35 business units, more than a 2% cost saving will be achieved.

Besides that, the goals set out before the start of the internship were achieved by documenting the new analytics function and process to be implemented in other business units. In terms of managing master data, loopholes were identified, the data was cleaned and a business process was put in place for handling future discrepancies.

A business process was designed for product delivery making use of existing technology, and the whole system is designed to deploy and run on cloud-based technology for effective customer service. The business heads have given a "go ahead" to the product delivery design phase, and the work now revolves around financial assessment once the company releases the financial budget for the year 2020. Keeping that in mind, the multiple projects carried out are considered a success, because the desired goals set in the initial phase were achieved by the end of the internship.

REFERENCES

[1] Y. Baştanlar and M. Özuysal, "Introduction to machine learning," Methods in Molecular Biology, vol. 1107, pp. 105-128, 2014. Available: https://www.ncbi.nlm.nih.gov/pubmed/24272434. DOI: 10.1007/978-1-62703-748-8_7.

[2] R. Das and D. J. Wales, "Machine learning prediction for classification of outcomes in local minimisation," Chemical Physics Letters, vol. 667, pp. 158-164, 2017. Available: https://www.sciencedirect.com/science/article/pii/S0009261416309204. DOI: 10.1016/j.cplett.2016.11.031.

[3] P. Tan, Introduction to Data Mining, 1st ed., 2005.

[4] "Chapter 1 - Introduction to data mining," in Data Analysis in the Cloud, 2016. DOI: 10.1016/B978-0-12-802881-0.00001-9.

[5] F. Hartwig and B. E. Dearing, Exploratory Data Analysis, 14th printing, 1979.

[6] L. Rokach and O. Maimon, Data Mining with Decision Trees. Available: http://www.worldscientific.com/worldscibooks/10.1142/9097#t=toc.

[7] B. Love, "Comparing supervised and unsupervised category learning," Psychonomic Bulletin & Review, vol. 9, no. 4, pp. 829-835, 2002. Available: https://www.ncbi.nlm.nih.gov/pubmed/12613690. DOI: 10.3758/BF03196342.

[8] J. D. Miller, Big Data Visualization. Available: http://proquest.tech.safaribooksonline.de/9781785281945.

[9] R. Krithivasan, V. S. Felix Enigo et al., "Data Visualization Tools – A Case Study," vol. 14.

[10] F. A. Rabhi et al., Real-Time Analytics, 2016.

[11] S. Perera and S. Suhothayan, "Solution patterns for realtime streaming analytics," Jun. 2015. Available: http://dl.acm.org/citation.cfm?id=2774214. DOI: 10.1145/2675743.2774214.

[12] R. M. Reddy A. and S. Bindu C., "A Systematic Survey on Waterfall vs. Agile vs. Lean Process Paradigms," i-Manager's Journal on Software Engineering, vol. 9, no. 3, pp. 34-59, 2015. Available: https://search.proquest.com/docview/1728276011. DOI: 10.26634/jse.9.3.3471.

[13] W. Van Casteren, "The Waterfall Model and the Agile Methodologies: A comparison by project characteristics," 2017. Available: https://search.datacite.org/works/10.13140/RG.2.2.10021.50403. DOI: 10.13140/RG.2.2.10021.50403.

[14] "Agile software development."

[15] Amazon Web Services, "Overview of Amazon Web Services."

[16] B. de Bruin and L. Floridi, "The Ethics of Cloud Computing," Science and Engineering Ethics, vol. 23, no. 1, pp. 21-39, 2017. Available: https://www.ncbi.nlm.nih.gov/pubmed/26886482. DOI: 10.1007/s11948-016-9759-0.

 
