Summary:
The space sector is undergoing a transformation driven by the rise of a stronger commercial sector. These changes notably affect the role of intergovernmental agencies, which must adapt to meet the needs of the new players. In other sectors, innovation labs have proven to be a useful and efficient tool for accommodating such transformations, and in that spirit the European Space Agency has created the Φ-lab, focused on the Earth Observation sector. This research compared the Φ-lab with the main innovation models inside and outside of ESA to understand what can be learnt and implemented.
The approach chosen to address this question was a hybrid methodology mixing systems thinking and design thinking. The first phase of the project consisted in understanding how the Φ-lab currently works and the reasons behind it, through data collection and visualisation techniques. The second phase looked at where and how value can be added to its innovation practices, with the help of an opportunity mind map and the design of a suggested concept.
Results:
The study outputs were threefold. Firstly, general knowledge was generated and organized on defining, modelling, and measuring innovation. A set of good practices for innovation organizations was assembled in the form of a tradespace exploration. Then, a more focused study of the Φ-lab's model and mechanisms was performed. Notably, the ESA innovation landscape, stakeholder management, and internal innovation processes were examined, demonstrating the high value provided to Φ-lab stakeholders, particularly outside ESA. By comparing the lab to stakeholders' expectations and to other innovation organizations, areas of opportunity were identified, the main ones being: strengthening the interactions with ESA mainstream activities, actively promoting its culture of innovation within ESA, and creating new flows between its internal offices to bridge the valley of death. Finally, with these opportunities in mind, suggestions were formulated in the form of a recommended concept and a list of selected methodologies that the Φ-lab may adopt in the future.
This research was conducted over a six-month stay in partnership with the Φ-lab management team and supervised by an innovation management professor. The outputs of the study were left in the hands of the management to act upon, and some tested solutions will be adopted by the team. Beyond its practical applications, this research also demonstrated the work and transformation still needed in large organizations to respond to the changing needs of the industry. Innovation labs may be one of the solutions.
Summary:
This thesis analyses Merck Group's project scheduling data. The initial aim of the study was to correlate project milestone delays with different factors. Identifying relations between milestone delays and other variables is valuable for project scheduling, as it improves resource allocation and helps ensure that deliverables are met within a stipulated timeframe. However, such relationships were difficult to observe due to poor data quality, and recommendations for improved data practices were proposed. To model projects, the thesis uses graph theory and highlights methods such as PERT and CPM. Monte Carlo simulation was also used to identify patterns in the critical path, offering insights for project decision-making.
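As an illustration of the Monte Carlo step, the sketch below samples activity durations from three-point estimates over a toy PERT-style network and estimates the distribution of project completion time. The network, the estimates, and the triangular sampling are illustrative assumptions, not the thesis's actual model or data.

```python
import random

# Toy activity network, declared in topological order:
# name -> (predecessors, (optimistic, most likely, pessimistic) durations).
ACTIVITIES = {
    "A": ([],         (2, 4, 6)),
    "B": (["A"],      (3, 5, 9)),
    "C": (["A"],      (4, 6, 10)),
    "D": (["B", "C"], (1, 2, 3)),
}

def simulate_once() -> float:
    """One Monte Carlo pass: sample durations, propagate finish times."""
    finish = {}
    for name, (preds, (o, m, p)) in ACTIVITIES.items():
        start = max((finish[q] for q in preds), default=0.0)
        finish[name] = start + random.triangular(o, p, m)  # (low, high, mode)
    return max(finish.values())  # project completion time

durations = sorted(simulate_once() for _ in range(10_000))
print(f"mean completion: {sum(durations) / len(durations):.2f}")
print(f"P90 completion:  {durations[int(0.9 * len(durations))]:.2f}")
```

Repeating such passes and recording which activities lie on the longest path also reveals how often each activity is critical, which is the kind of pattern the thesis examines.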
Summary:
Employee rostering is the process of assigning available employees to open shifts. Automating it has practical benefits across nearly all industries, such as reducing manual workload and producing flexible, high-quality schedules. In this project, a mixed-integer linear program (MILP) was developed to optimize employee rostering for Swissgrid, where it is still a largely manual process. To solve larger problems more tractably, a hybrid methodology was also developed which combined the MILP with scatter search, an evolutionary algorithm that has achieved success in many problem domains. The employee rostering model also aptly served as a simulator to evaluate the adequacy of any given number of employees so as to find, ultimately, the optimal staffing level in a company, an important aspect of strategic planning. To this end, the golden-section search technique was applied to find the optimal number of employees efficiently under a number of different parameter settings.
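As a rough illustration of the staffing search, the sketch below applies golden-section search to a stand-in cost function. In the thesis the evaluation is performed by the rostering MILP acting as a simulator; the convex payroll-versus-coverage trade-off here is an invented placeholder, and since headcount is an integer a practical version would round and compare neighbouring values.

```python
import math

PHI = (math.sqrt(5) - 1) / 2  # inverse golden ratio, ~0.618

def staffing_cost(n_employees: float) -> float:
    # Stand-in for the MILP simulator: an invented convex trade-off
    # between payroll and under-coverage penalties.
    payroll = 100.0 * n_employees
    coverage_penalty = 5_000.0 / n_employees
    return payroll + coverage_penalty

def golden_section_search(f, a: float, b: float, tol: float = 1e-3) -> float:
    # Shrink the bracket [a, b] around the minimiser of a unimodal f.
    c = b - PHI * (b - a)
    d = a + PHI * (b - a)
    while b - a > tol:
        if f(c) < f(d):
            b, d = d, c
            c = b - PHI * (b - a)
        else:
            a, c = c, d
            d = a + PHI * (b - a)
    return (a + b) / 2

optimum = golden_section_search(staffing_cost, 1.0, 50.0)
print(f"approximate optimal headcount: {optimum:.2f} -> {round(optimum)}")
```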
Results:
Relations between milestone delays and other variables proved difficult to observe due to poor data quality; accordingly, recommendations for improved data practices were proposed within the scope of this thesis.
Summary:
Deviation management has become a major pillar of the quality assurance and quality control strategies of pharmaceutical companies, as the industry is one of the most highly regulated sectors in the world. With a 40% increase in the number of deviations observed, and an average of 51.4% of deviations completed after their due date, this study set out to build a prediction tool to guide the root cause investigation process. The study explored a dataset of 6,542 deviation records and carried out an exploratory data analysis to observe the relationships between deviation categories, root causes, and lead times. A Multinomial Logistic Regression model was developed for the structured data. It modelled the relationship between the dependent variable, the root cause (seven categories), and three independent variables: the deviation main category (categorical, factorized), a binary product-related indicator, and the year of deviation detection, which spanned from 2017 to 2023.
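A minimal sketch of such a multinomial model is shown below, using synthetic data since the deviation records are not public; the column names, category labels, and one-hot factorization are assumptions for illustration only.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the deviation dataset; real columns and categories differ.
rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "main_category":   rng.choice(["process", "equipment", "documentation"], n),
    "product_related": rng.integers(0, 2, n),
    "detection_year":  rng.integers(2017, 2024, n),
    "root_cause":      rng.choice(["man", "machine", "method", "material",
                                   "measurement", "environment", "management"], n),
})

# Factorize the categorical predictor via one-hot encoding.
X = pd.get_dummies(df[["main_category"]]).join(df[["product_related", "detection_year"]])
y = df["root_cause"]  # seven classes

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
# LogisticRegression fits a multinomial (softmax) model for multi-class targets.
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"test accuracy: {accuracy_score(y_test, model.predict(X_test)):.3f}")
```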
In parallel, we deployed machine learning models, more specifically Natural Language Processing (NLP) techniques, to predict the root cause of deviations from their textual descriptions, which are unstructured data. We performed data cleaning, data preparation, and vectorization to produce the algorithm's input. The thesis thus leveraged the unstructured data embedded in the descriptions of deviations to predict the root causes. The study also revealed some limitations of these methodologies and identified possible areas for future research, such as expanding the machine learning model for improved accuracy and combining the regression and Natural Language Processing models for more comprehensive investigation guidance. These findings demonstrate the potential of Natural Language Processing and machine learning in aiding the root cause investigation process, offering helpful insights for managers to improve the overall efficiency of their deviation management process.
Results:
It was shown that the deviation main categories, statistically significant at the 95% confidence level, strongly impact the root cause. When predicting on the dataset with the multinomial logistic regression model, the accuracy was found to be 48.5%. The NLP results, in contrast, indicated that a Linear SVC model with a TF-IDF vectorizer and a balanced dataset achieved a higher accuracy of 88.9%, outperforming the other machine learning models and the regression model in predicting the root causes of deviations.
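For illustration, a minimal version of such a text-classification pipeline might look as follows; the toy descriptions and labels are invented, and class_weight='balanced' merely approximates the balanced-dataset setting reported above.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Invented deviation descriptions and root-cause labels for illustration.
texts = [
    "pump seal leak observed during batch transfer",
    "valve failed to close on tank 3",
    "operator skipped a step in the batch record",
    "wrong label applied by technician",
]
labels = ["equipment", "equipment", "human_error", "human_error"]

# TF-IDF vectorization feeding a linear support-vector classifier.
clf = make_pipeline(TfidfVectorizer(), LinearSVC(class_weight="balanced"))
clf.fit(texts, labels)
print(clf.predict(["gasket leak found during transfer"]))
```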
Summary:
The paper aims to support the Operations function of a multinational company in successfully overcoming the challenges faced during the implementation of a new three-year strategy by making the project portfolio delivery cycle more agile. The main objective of the thesis is to prevent the recurrence of the inefficiencies in the use of time and resources experienced in the previous year. To this end, optimization proposals are identified to establish connections and alignment between the three main processes analysed: the strategic cycle, which defines long-term strategic goals & targets; the SDDS cycle, which establishes a roadmap for the functions and sub-functions to ensure strategic alignment; and the financial cycle, which sets the budget allocation. Specifically, key challenges include ensuring alignment between the portfolio definition cycle and financial cycle, and between the SDDS cycles, setting a clear direction by the Operations Management Team, and defining functional objectives with associated deliverables and milestones.
Results:
The analysis and mapping approach undertaken in this study provides a high-level understanding of the key processes and connection points between them, allowing inefficiencies to be identified and providing valuable insights for optimization.
The main recommendations suggested to the company are:
– Develop a comprehensive plan that integrates the financial and strategic cycles: establish high-level financial boundaries from the outset and include both functional and cross-functional initiatives in the budget review.
– Properly handle the cross-functional SDDS cycle to define the targets and their ownership.
– Ensure time alignment between the assessed cycles by following the newly defined timeline.
– Obtain clear strategic goals & actions for the Operations sub-functions: implement the suggested modifications for the SDDS cycle and begin the detailed masterplanning only once the Operations strategic direction is established, to avoid unnecessary re-loops of the project portfolio definition cycle.
– Adopt the proposed standardized approach to translate goals into targets, initiatives, deliverables & milestones, to ensure consistency between the methods used by different Operations sub-functions and enhance the visibility of the goals set at all organizational levels.
By adopting the suggestions outlined in this study, the Operations function can update its processes and practices, enhancing planning and execution.
Summary:
The goal of the thesis can be divided into three parts. Firstly, it tackles the sustainability accounting burden that organizations carry and defines a framework for measuring environmental impact (footprint) in the specific company's value chain. Secondly, it dives deeper into the company's upstream sustainability strategy, defines the implementation challenges it is facing, and maps the value-chain carbon footprint methodology for upstream emissions in order to spot potential improvements and suggest possible solutions. More specifically, the focus is placed on forming the optimization model for volume allocation and on the roadmap for reducing the average emission factor of a certain supplier group. Thirdly, as an integral part of upstream emission reductions, supplier collaboration is thoroughly assessed, starting from the main challenges and best practices, through benchmarking from other industries, leading to the proposed plan for the future supplier engagement strategy.
Results:
Firstly, current sustainability activity is mapped onto the SDGs model to enable a full understanding of the company's current engagement. Then, the Earth Systems Boundaries (ESB) model is used to define a comprehensive framework that the company can use to map its environmental footprint. The framework has been applied to one value chain, outlining the parts of the process that have the greatest impact on the environmental footprint, taking into consideration air, water, land, and biodiversity.
Secondly, an analysis of the current upstream sustainability strategy is conducted, challenging the feasibility of the plan within the given time frame. Emissions of raw material suppliers have been analyzed, and a volume optimization model is formed highlighting the volume distribution that will allow the company not only to achieve next year's targets and the main 2030 scope 3 targets but also to minimize cost and maintain financial stability. A roadmap for certain supplier groups is provided, giving the distribution of the emission factors that have to be achieved per year. The methods used are piecewise negative exponentials, power-law distributions, and Zipf's law.
Lastly, challenges and best practices for dealing with upstream scope 3 emission reductions have been investigated and discussed. A supplier collaboration framework is introduced to help tackle supplier engagement in a more structured manner. Within the framework, it is suggested how to identify the parts of the supply chain that are emission hotspots, how to prioritize activities and allocate the needed resources so that the biggest improvement is made, and how to analyze suppliers and classify them into subgroups that can be approached similarly. Additionally, it shows how to engage and develop sustainability strategies with the different groups of suppliers, help them raise their level of sustainability maturity, and, ultimately, which KPIs are important to follow and how to continuously track performance and create preventive strategies leading to a low-carbon future.
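As a hedged illustration of what such a volume-allocation model might look like, the sketch below minimizes purchasing cost subject to a total-volume requirement and a cap on the volume-weighted average emission factor. All supplier figures are invented, and the thesis's actual formulation may differ.

```python
import numpy as np
from scipy.optimize import linprog

# Invented supplier data: unit cost, emission factor, and capacity.
cost   = np.array([520.0, 480.0, 610.0])   # cost per tonne, suppliers A, B, C
ef     = np.array([2.1, 3.4, 1.2])         # kgCO2e per tonne
cap    = np.array([400.0, 800.0, 300.0])   # capacity in tonnes
demand = 1000.0                            # total volume to allocate
ef_max = 2.4                               # target volume-weighted emission factor

# Variables: tonnes bought from each supplier. Minimize cost subject to
# meeting demand and keeping the average emission factor below target:
# sum((ef_i - ef_max) * v_i) <= 0  is equivalent, since sum(v_i) = demand.
res = linprog(
    c=cost,
    A_ub=[ef - ef_max],
    b_ub=[0.0],
    A_eq=[np.ones_like(cost)],
    b_eq=[demand],
    bounds=list(zip(np.zeros_like(cap), cap)),
)
print("allocation (t):", np.round(res.x, 1), "| total cost:", round(res.fun))
```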
Summary:
The birth of the internet, and the widespread access to information that followed, have completely disrupted the traditional equilibrium between consumers and vendors, pressuring marketers to rethink their entire approach. In today's digital world, the marketing end goal goes well beyond making people aware of the brand. Establishing an individual relationship between the consumer and the brand has become the key to conducting powerful marketing campaigns. This is known as customer-centric marketing, and digitalization, with its capability of targeting individuals with personalized content, is the playground where this approach thrives the most.
The marketing holy grail, known as the “omnichannel strategy”, is the ultimate approach for marketing teams to achieve this level of closeness with their customers. It aims to create a seamless and unified experience across the different channels a customer uses. The approach drives customer satisfaction, loyalty, and revenue growth. Efficient data management is, however, required to succeed in the omnichannel transition, data being the driver behind marketing operations whenever any level of personalization is involved.
This turns out to be very difficult to achieve in practice, especially for larger businesses. Indeed, technical complexity and inadequate organizational frameworks are barriers that hinder the execution of omnichannel strategies. In addition, new privacy regulations, whether issued by governmental entities or by tech giants, keep restricting and further complicating the use of data. In particular, cookie-based marketing, which was the industry standard for almost two decades, has been rendered obsolete, forcing businesses to transition toward server-side data collection. This involves building a complex data infrastructure. Fortunately, the emergence of third-party Software-as-a-Service (SaaS) solutions is starting to democratize data leveraging in the industry. The modern data stack landscape is full of specialized products working hand in hand, letting marketers fully unleash the power of their customers' data.
Summary:
The objective of implementing a Power BI dashboard dedicated to the performance follow-up of new products is to efficiently monitor key performance indicators related to jewellery novelties both before and after their launch. This will significantly enhance the demand planning team's ability to swiftly identify anomalies and reduce the time needed for forecast reviews. The tool will be helpful not only for the demand planning team but also for the supply planning team, the marketing team, and the management.
The dashboard will encompass various functionalities, such as tracking upcoming launches, monitoring adherence to production and demand plans, analyzing performance against forecast over several time periods, comparing different forecast snapshots, and tracking deliveries against the initial production plan.
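As an illustration of the snapshot-comparison logic behind such a view, the pandas sketch below contrasts an initial and a reviewed forecast against actual sales. The figures and column names are invented, and the production tool implements this in Power BI rather than Python.

```python
import pandas as pd

# Invented figures: two forecast snapshots and actual sales for one novelty.
df = pd.DataFrame({
    "month":             ["2024-01", "2024-02", "2024-03"],
    "forecast_initial":  [120, 150, 180],
    "forecast_reviewed": [110, 160, 170],
    "actual_sales":      [105, 158, 162],
})

# Gap between snapshots, and each snapshot's error against actual sales,
# mirroring the dashboard's snapshot-comparison view.
df["snapshot_gap"] = df["forecast_reviewed"] - df["forecast_initial"]
for col in ("forecast_initial", "forecast_reviewed"):
    df[f"{col}_error_pct"] = (df[col] - df["actual_sales"]) / df["actual_sales"] * 100

print(df.round(1))
```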
Until now, the existing method consisted of exporting data from Excel into a static PDF, which limits its dynamic capabilities.
The tool was designed to answer the main questions asked by demand planners, namely:
– What are the future launches? Which ones are the most important of the fiscal year?
– Is my launch ready to go live at the scheduled date? Does my launch need to be postponed? Is my stock sufficient for replenishment?
– Does the launch follow the initial forecast? How should the forecast be reviewed? What is the gap between the initial forecast and the reviewed forecast?
– Is the metric split of the initial forecast in line with actual sales?
– Are the deliveries from the manufacture aligned with my production plan?
– During the worldwide meeting, when deciding the final potential of novelties, which step of the process yields the potential closest to actual sales?
Results:
By using this Power BI tool, the demand planning team will gain reactivity and therefore experience significant time savings during every review cycle, as they can quickly assess product performance and make the necessary adjustments. This increased responsiveness allows for a more accurate sales forecast at the beginning of the novelty's life. The estimated time saved for the team is about sixty hours per month (ten hours per planner).
Furthermore, the implementation of this tool has brought attention to certain dysfunctions and inaccuracies in the other models from which the data is sourced. It has also highlighted the limitations of the existing tools in managing unique cases and evolving requirements. To achieve optimal data processing efficiency, ongoing efforts are required to enhance the quality of sourced data and reduce its complexity.
Moreover, additional data would allow the creation of new dashboards tracking the pieces ordered by the markets, the deliveries, and the market availability. In addition, a comparison with the product range could be added in order to quantify the cannibalization caused by the introduction of the novelty.
Summary:
The study examines the decision-making process for cloud adoption. Special attention is paid to connections with the Firm Boundaries question and the cost of building capabilities. Based on the Total Cost of Ownership (TCO), a model is developed to compare the cost-effectiveness of different hosting options: public cloud and on-premise. The model uses factors such as hardware costs, cloud costs, DevOps expenses, and the efficiency of IT capabilities.
On the one hand, the model recommends non-hierarchical governance mechanisms, such as using cloud computing, for teams with a volatile service load, small or medium scale, and a low level of IT capabilities. On the other hand, hierarchical governance, meaning an on-premise setup, can be beneficial for teams with high IT capabilities and medium or large scale, as well as for teams with specific needs such as GPU-intensive computations. The results of the model are then evaluated through interviews and industry case studies. Finally, instead of the "cloud-first" narrative, data-driven recommendations regarding cloud adoption are provided based on the model results, model limitations, and real-world examples.
Results:
This paper introduces a cost-evaluation model applying the TCO approach. The simple yet powerful model suggests the most cost-effective hosting option based on the scale of services and the IT capabilities of the team. Quantitative results based on hardware costs, cloud costs, DevOps expenses, and other factors reveal a significant shift towards on-premise hosting in the case of large services and IT-capable teams, as well as for specific needs such as GPU-intensive computing.
According to the model, the cost of building capabilities usually plays a crucial role in the decision-making process. Evaluating the limitations of the proposed model also makes it possible to confirm some well-known cloud best practices.
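A minimal sketch of such a TCO comparison is given below; all cost parameters are invented placeholders, and the thesis's actual model, factors, and figures are not reproduced here.

```python
def cloud_tco(vcpu_hours: float, devops_fte: float, years: int = 3,
              rate: float = 0.05, fte_cost: float = 150_000.0) -> float:
    # Pay-as-you-go compute plus the (smaller) DevOps effort cloud still needs.
    return years * (vcpu_hours * rate + devops_fte * fte_cost)

def onprem_tco(servers: int, devops_fte: float, years: int = 3,
               capex_per_server: float = 12_000.0, opex_per_server: float = 3_000.0,
               fte_cost: float = 150_000.0, capability_buildup: float = 100_000.0) -> float:
    # Hardware purchase, running costs, a larger ops team, and a one-off
    # cost of building in-house capabilities (the firm-boundaries angle).
    return (servers * capex_per_server
            + years * (servers * opex_per_server + devops_fte * fte_cost)
            + capability_buildup)

# Invented comparison for a mid-sized, steady workload.
print("cloud TCO:   ", cloud_tco(vcpu_hours=1_000_000, devops_fte=0.5))
print("on-prem TCO: ", onprem_tco(servers=12, devops_fte=1.5))
```

Sweeping the scale parameters in such a function is one way to locate the break-even point at which the hosting recommendation flips.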
Summary:
The increasing awareness of the environmental and social impacts of economic activity has made sustainability a crucial component of business strategies. Start-ups, as important actors in the entrepreneurship ecosystem, can drive sustainable growth. However, obtaining funding from traditional sources is challenging due to the high level of risk involved with early-stage companies. Venture capitalists (VCs) offer capital and expertise to start-ups in exchange for an ownership stake in the company. Incorporating sustainability strategies into their operations can help VCs mitigate risks, identify opportunities, attract sustainability-conscious investors, and enhance long-term value creation. To evaluate the sustainability performance of their fund, VCs must obtain comprehensive information from start-ups about sustainability. Nonetheless, start-ups face unique challenges in approaching sustainability due to their characteristics. Despite the existence of various sustainability assessment methods, selecting and implementing a suitable one can be a complicated task for start-ups given the wide variety of methods available. The thesis aims to develop recommendations for venture capital funds to incorporate sustainability into their operations while taking into account the capacity of start-ups to approach sustainability.
Results:
The study first proposes a framework for start-ups to approach sustainability by analyzing various sustainability assessment methods and categorizing them based on properties relevant to start-ups. Using this framework, three levels of approach to sustainability are identified: defining sustainability using the Sustainable Development Goals and stakeholder analysis; determining materiality and monitoring to enable early iteration; and choosing appropriate sustainability standards. Subsequently, based on the framework and interviews, recommendations are made for implementing sustainability in venture capital firms at different levels of their operations. Investment decisions can be supported using start-up sustainability definitions and materiality assessments, utilizing the EU Taxonomy for sustainability opportunities and the EU Principal Adverse Impacts for risk assessment. Portfolio sustainability value creation can be achieved through case-by-case support, assessment, and action plans. Finally, the thesis outlines methods for the fund to communicate its sustainability efforts to stakeholders, including investors and potential portfolio start-ups, and to exchange best practices for sustainable development.