
Incubator Support Initiative
Post-commencement Evaluation
February 2019

For further information on this research paper please contact:
Evaluation Unit
Department of Industry, Science, Energy and Resources
GPO Box 9839
Canberra ACT 2601
Phone: +61 2 6102 8901
Email: xxxxxxxxxx.xxxx@xxxxxxxx.xxx.xx
Creative Commons Licence
© Commonwealth of Australia 2020
Creative Commons Attribution 4.0 International Licence (CC BY 4.0)
All material in this publication is licensed under a Creative Commons Attribution 4.0 International
Licence, with the exception of:
• the Commonwealth Coat of Arms;
• content supplied by third parties;
• logos; and
• any material protected by trademark or otherwise noted in this publication.
Creative Commons Attribution 4.0 International Licence is a standard form licence agreement that
allows you to copy, distribute, transmit and adapt this publication provided you attribute the work. A
summary of the licence terms is available from https://creativecommons.org/licenses/by/4.0/
Wherever a third party holds copyright in material contained in this publication, the copyright remains
with that party. Their permission may be required to use the material. Please contact them directly.
Attribution
Content contained herein should be attributed as follows:
Department of Industry, Science, Energy and Resources, Commonwealth of Australia,
Incubator Support Initiative Post-commencement Evaluation. The Commonwealth of Australia does
not necessarily endorse the content of this publication.
Requests and inquiries concerning reproduction and rights should be addressed to
xxxxxxxxxxxxxx@xxxxxxxx.xxx.xx.
Disclaimer
The views expressed in this report are those of the author(s) and do not necessarily reflect those of
the Australian Government or the Department of Industry, Innovation and Science.
This publication is not legal or professional advice. The Commonwealth of Australia does not
guarantee the accuracy or reliability of the information and data in the publication. Third parties rely
upon this publication entirely at their own risk.
For more information on Office of the Chief Economist research papers please access the
Department’s website at:
www.industry.gov.au/OCE
Contents

Executive summary
Incubator Support
Objectives and outcomes
This evaluation
Methodology
1. Need
1.1 There is a justifiable role for the Australian Government to support exporting activity and innovation among Australian businesses
1.2 International evidence on the value-add of government support for incubators is inconclusive
2. Design
2.1 All outcomes need to be clear, consistent and integrated into the initiative’s design
2.2 Expected actual outcomes for the regional start-ups may not match the intended outcomes
2.3 Incubator Support fits reasonably well within EP
2.4 The department does not have much visibility of which start-ups access each of the incubators’ services
2.5 Future evaluations should consider the time required by new incubators to become self-sustaining
2.6 The current design may not be feasible if funding is reduced as is planned
2.7 Staff resourcing arrangements require clarification in order to assess risks to ongoing management
3. Implementation
3.1 Outputs are consistent with the initiative’s design and purpose
3.2 The initiative has mostly been implemented as planned, although some changes were made to improve NEI application quality and better adapt to regional circumstances
3.3 Incubators align moderately well with the intended primary target market
3.4 It is difficult to determine the extent to which the program is reaching the intended start-ups
3.5 Early outcomes for grant recipients are mostly positive
3.6 The EIR application, assessment and reporting processes are suitable for participants
3.7 NEI applicants require further guidance on the type of information and level of detail required in applications
3.8 Survey respondents are satisfied with the advice and support they have received
3.9 Governance processes are mostly effective, but efficiency could be improved by delegating funding decisions, increasing information sharing, and clarifying roles and responsibilities
4. Performance assessment
4.1 Data collection would be improved if grantees were fully aware of requirements and templates aligned with agreed data collection needs
4.2 Indicators need to be reviewed to ensure alignment with program objectives and outcomes
4.3 The incubator model creates some challenges for assessing performance
5. Conclusion
Appendix A Incubator Support initiative post-commencement evaluation Terms of Reference
Appendix B Methodology
Appendix C Response to Terms of Reference questions
Appendix D Analysis of applicant surveys
Abbreviations and acronyms

ANAO – Australian National Audit Office
APS – Australian Public Service
ASL – Average Staffing Level
BGH – Business Grants Hub
CSM – Customer Service Manager
DIIS – Department of Industry, Innovation and Science
EP – Entrepreneurs’ Programme
EP Committee – Entrepreneurs’ Programme Committee
ISA Board – Innovation and Science Australia Board
KPI – Key performance indicator
NISA – National Innovation and Science Agenda
OECD – Organisation for Economic Co-operation and Development
PM&C – Department of the Prime Minister and Cabinet
R&D – Research and Development
RIF – Regional Incubator Facilitator
Executive summary
Between May and November 2018, the Evaluation Unit in the Department of
Industry, Innovation and Science carried out a post-commencement evaluation
of the Australian Government’s Incubator Support initiative. Post-
commencement evaluations generally follow a program’s first year of operation
and examine program design and initial implementation. They identify early
issues, and recommend corrective action where needed.
This evaluation found that while initial implementation of the initiative has
progressed well, there is room for improvement. A summary of the evaluation
findings and recommendations is provided in table i below.
Incubator Support
The Incubator Support initiative (the initiative) is a measure under the National
Innovation and Science Agenda1 (NISA) and one of the four elements of the
Entrepreneurs’ Programme (EP). The initiative was announced in December
2015 as an $8 million initiative, had its funding increased to $23 million in May
2016, and was formally launched by the Minister in September 2016. The
design of the initiative was later changed to better reflect the Australian
Government’s focus on regional development. This included lower co-funding
requirements for regional activities and the establishment of Regional Incubator
Facilitators (RIFs). At the same time, a public data outcome was added through
the transfer of the DataStart initiative from the Department of the Prime Minister
and Cabinet (PM&C) to DIIS. The initiative was re-launched in December 2017
with new program guidelines that reflected changes to the program design.
Incubator Support provides grant funding to incubators — business support
organisations that help nurture innovative start-up firms by providing services
such as seed funding, co-location, mentoring and access to networks.
The stated objectives of the initiative are to assist Australian start-up firms to
develop the capabilities required to achieve commercial success; and develop
Australia’s innovation ecosystem, including in regional areas.
The initiative has two components:
Expert in Residence (EIR), which provides incubators with grants of up to
$100,000 for the secondment of national or international experts.
New and Existing Incubators (NEI), which provides grants of up to $500,000
to help develop new incubators, boost the effectiveness of high performing
incubators, or encourage incubators to work with data-driven start-ups.
Evaluation approach
The evaluation employed a mixed-methods approach, which included a desktop
literature review, stakeholder interviews (28 in total), and a survey of successful
and unsuccessful applicants (64 applicants were contacted, with 20 responding
to the survey).
1 DIIS (2015) National Innovation and Science Agenda Report, accessed online at:
https://www.industry.gov.au/data-and-publications/national-innovation-and-science-agenda-report
Evaluation findings
The evaluation found that there is a justifiable role for the Australian
Government to support exports and innovation among Australian businesses.
Nevertheless, we found that the value-add of direct government support for
incubators is inconclusive — international empirical research about the impact
of incubators on start-ups’ performance is mixed.
The evaluation found that Incubator Support fits reasonably well within the EP:
it complements other elements that support later-stage business development
and benefits from the efficiency of some shared governance arrangements.
The initiative has mostly been implemented as planned, although some
changes were made to improve NEI application quality and better meet
regional needs.
A desktop review of key documents found that in some cases stated outcomes
lack clarity and consistency, making it difficult to determine whether the
Incubator Support inputs and activities are appropriate. We also note that it
may be more difficult for regional start-ups to achieve the intended outcomes.
We suggest further research into the differences between incubators and start-
ups in metropolitan versus regional areas, to explore what might be more
realistic outcomes for regional start-ups.
Stakeholders surveyed are satisfied with the advice and support they have
received. The governance processes are found to be mostly effective, but
some changes could be made to increase efficiency. These could include:
seeking the Minister’s approval for NEI funding decisions to be transferred to
the program delegate, based on the recommendations of the EP Committee;
and sharing more information about the rationale, outputs, outcomes and
evidence for the initiative’s design with internal stakeholders.
Early outcomes for grant recipients are mostly positive. Stakeholder feedback
focused mostly on the intended positive outcomes: recipients said that the grant
enabled them to access national and international resources and connections,
provide better experts and mentors, develop regional relationships, and extend
services to regional organisations. The evaluation found that the EIR
application, assessment and reporting processes are suitable for participants.
However, we found that the NEI application, assessment and reporting process
could be improved. This could be done by providing further guidance for
applicants on the type of information and level of detail required in the
application.
Importantly, the evaluation also found that data collection would be more
reliable and efficient if grantees were fully aware of all reporting and associated
data collection requirements at the start of the project, and if reporting
templates aligned with agreed data collection needs — noting that the
incubator model creates some challenges for assessing performance. One of
the issues we identified is that the department does not have much visibility of
the kind of start-ups that access each of the incubators’ services. The current
design of Incubator Support limits the extent to which the department is able to
define or collect information on characteristics of start-ups accessing services
funded by Incubator Support.
To address these issues, we recommend that key performance indicators for
Incubator Support be reviewed and that reporting templates be revised to align
with agreed data collection needs. In addition, early awareness of reporting and
associated data requirements should be reinforced among grantees in the
interests of assuring the availability and quality of data submitted.
Concurrent with this evaluation, the Evaluation Unit, in consultation with
relevant policy and program areas, has reviewed the Incubator Support
program logic, data matrix and associated performance indicators.
Table i: Overview of evaluation findings and recommendations

1 Need
Finding 1.1: There is a justifiable role for the Australian Government to support exporting activity and innovation among Australian businesses.
Finding 1.2: International evidence on the value-add of government support for incubators is inconclusive.

2 Design
Finding 2.1: All outcomes need to be clear, consistent and integrated into the initiative’s design.
Finding 2.2: Expected actual outcomes for the regional start-ups may not match the intended outcomes.
Recommendation 1: Future research should investigate the extent of differences in outcomes and implementation between metropolitan and regional areas to inform future program design decisions.
Finding 2.3: Incubator Support fits reasonably well within EP.
Finding 2.4: The department does not have much visibility of which start-ups access each of the incubators’ services. (See Recommendations 2 and 3 below.)
Finding 2.5: Future evaluations should consider the time required by new incubators to become self-sustaining.
Finding 2.6: The current design may not be feasible if funding is reduced as is planned.
Finding 2.7: Staff resourcing arrangements require clarification in order to assess risks to ongoing management.

3 Implementation
Finding 3.1: Outputs are consistent with the initiative’s design and purpose.
Finding 3.2: The initiative has mostly been implemented as planned, although some changes were made to improve NEI application quality and better adapt to regional circumstances.
Finding 3.3: Incubators align moderately well with the intended primary target market.
Finding 3.4: It is difficult to determine the extent to which the program is reaching the intended start-ups.
Recommendation 2: Require applicants to specify how they will help start-ups reach international markets, how they are meeting a need in a particular region or sector, and how they intend to track the start-ups they support.
Finding 3.5: Early outcomes for grant recipients are mostly positive.
Finding 3.6: The EIR application, assessment and reporting processes are suitable for participants.
Finding 3.7: NEI applicants require further guidance on the type of information and level of detail required in applications.
Recommendation 3: Provide further guidance for applicants to help clarify the type of information and level of detail required in NEI applications. (See also Recommendations 7 and 8 below.)
Finding 3.8: Survey respondents are satisfied with the advice and support they have received.
Finding 3.9: Governance processes are mostly effective, but efficiency could be improved by delegating funding decisions, increasing information sharing, and clarifying roles and responsibilities.
Recommendation 4: Investigate the merit of seeking the Minister’s approval for NEI funding decisions to be transferred to the program delegate, based on the recommendations of the EP Committee.
Recommendation 5: Share more information about the rationale, outputs, outcomes and evidence for the initiative’s design with internal stakeholders.
Recommendation 6: Clarify and clearly communicate roles and responsibilities, including for overall coordination, when making changes to the guidelines, application templates or related documents.

4 Performance assessment
Finding 4.1: Data collection would be improved if grantees were fully aware of requirements and templates aligned with agreed data collection needs.
Recommendation 7: Customer Service Managers and Regional Incubator Facilitators should reinforce early awareness among grantees of their reporting and associated data collection requirements, in the interests of assuring the availability and quality of data submitted.
Recommendation 8: Revise reporting templates to align with agreed data collection needs, in accordance with the new program logic and data matrix for Incubator Support.*
Finding 4.2: Indicators need to be reviewed to ensure alignment with program objectives and outcomes.
Recommendation 9: Revise key performance indicators for Incubator Support, based on the new data matrix.*
Finding 4.3: The incubator model creates some challenges for assessing performance.
Recommendation 10: Clarify how information will be sourced from start-ups to assess whether the initiative is having its intended impact on the ultimate beneficiary.

*The program logic and data matrix, including KPIs, have been reviewed and updated concurrently with this evaluation.
Management response
Response to report as a whole
Overall, the evaluation provides an objective assessment of the initiative that will strengthen its design
and implementation.
Through its continuous improvement processes, the department is already implementing a range of
actions, described below, that respond to the report.
We note that several of the report’s findings and recommendations relate to data collection. We also note
that, concurrently with this evaluation, the department’s Evaluation Unit led the development of a new
program logic and data matrix for the Incubator Support Initiative in consultation with Program and Policy
teams. These products will guide the initiative’s future data collection strategy.
Recommendation 1: Future research should investigate the extent of differences in outcomes and implementation between metropolitan and regional areas to inform future program design decisions.
Response: Supported. The Regional Incubator Facilitators are likely to be leveraged to contribute to this body of research work, guided by the Policy area, as well as the Program Management team, once a comprehensive data set is available.

Recommendation 2: Require applicants to specify how they will help start-ups reach international markets, how they are meeting a need in a particular region or sector, and how they intend to track the start-ups they support.
Response: Supported. Application and reporting templates will be revised as required to align with agreed data collection needs, in accordance with the new program logic and data matrix for Incubator Support. This will include the introduction of a 12-month Post Project report to expand the collection of outcome data for target start-ups. This approach is consistent with Recommendation 8 below.

Recommendation 3: Provide further guidance for applicants to help clarify the type of information and level of detail required in NEI applications.
Response: Supported. The Business Grants Hub (BGH) drafts Grant Opportunity Guidelines based on the Department of Finance approved template. This feedback can be incorporated in the next iteration of the ISP grant opportunity guidelines. The Hub has engaged the Department of Human Services’ Experience Design Lab to run three projects over 2019 to test and receive feedback on BGH grant opportunity guidelines, grant agreements and reporting templates, with the objective of improving the user experience for our customers. Lessons learned will be incorporated in all programs.

Recommendation 4: Investigate the merit of seeking the Minister’s approval for NEI funding decisions to be transferred to the program delegate, based on the recommendations of the EP Committee.
Response: Supported. This will be investigated as part of the program’s continuous improvement processes.

Recommendation 5: Share more information about the rationale, outputs, outcomes and evidence for the initiative’s design with internal stakeholders.
Response: Supported. A strategy has been implemented to inform EP Committee members of grantee project outcomes, as well as providing similar information as part of the briefing process to the Minister as decision maker for the New and Existing Incubators program element. Further updates will be provided to relevant stakeholders as the program matures.

Recommendation 6: Clarify and clearly communicate roles and responsibilities, including for overall coordination, when making changes to the guidelines, application templates or related documents.
Response: Supported. The BGH has a project underway to better define roles and responsibilities for senior responsible officers and program managers at each stage of the program life cycle. This should improve staff understanding of roles and accountabilities.

Recommendation 7: Customer Service Managers and Regional Incubator Facilitators should reinforce early awareness among grantees of their reporting and associated data collection requirements, in the interests of assuring the availability and quality of data submitted.
Response: Supported. Operational guidance for Customer Service Managers and Regional Incubator Facilitators will be reviewed and updated as required, to reinforce grantees’ early awareness of their reporting requirements.

Recommendation 8: Revise reporting templates to align with agreed data collection needs, in accordance with the new program logic and data matrix for Incubator Support.
Response: Supported. Reporting templates will be revised as required, based on the new data matrix.

Recommendation 9: Revise performance indicators for Incubator Support, based on the new data matrix.
Response: Supported. Performance indicators will be revised as required, based on the new data matrix.

Recommendation 10: Clarify how information will be sourced from start-ups to assess whether the initiative is having its intended impact on the ultimate beneficiary.
Response: Supported. As identified above, the revised data matrix will set out how information will be sourced. The introduction of a 12-month Post Project report template for grantee incubators to complete will expand the collection of outcome data for target start-ups.
Introduction
The Incubator Support initiative (the initiative) is a measure under the National
Innovation and Science Agenda2 (NISA) and one of the four elements of the
Entrepreneurs’ Programme (EP). Incubator Support provides grants to
incubators, with the aim of improving the capabilities and networks of start-ups.
The initiative was announced in December 2015 as an $8 million initiative, had
its funding increased to $23 million in May 2016, and was formally launched by
the Minister in September 2016. The design of the initiative was later changed
to better reflect the Australian Government’s focus on regional development.
This included lower co-funding requirements for regional activities and the
establishment of Regional Incubator Facilitators (RIFs). At the same time, a
public data outcome was added through the transfer of the DataStart initiative
from the Department of the Prime Minister and Cabinet (PM&C) to DIIS. The
initiative was re-launched in December 2017 with new program guidelines that
reflected changes to the program design.
For the purposes of the initiative, an incubator is defined as a ‘business support
organisation that fosters innovative start-ups, focused on international trade,
through the provision of services such as seed funding, co-location, mentoring,
professional services and access to business networks’.3 Acting through
incubators, the initiative aims to improve the capabilities and networks of start-
ups. A start-up is defined for the purpose of the initiative as ‘an innovative,
adaptive early-stage and scalable company, with global potential’.4
Another term that is important to this initiative is ‘innovation ecosystem’. This
refers to an open network of organisations that interact with each other and
operate within framework conditions that regulate their activities and
interactions. There are three components of the innovation ecosystem:
Innovation activities — the discrete activities that lead to discoveries with
commercial potential including research & development (R&D),
entrepreneurial activity, innovation funding (e.g. venture capital), and the
generation of skills for innovation.
Networks — the formal and informal linkages between people and
organisations in the innovation system, including communities of practice
(such as medical professionals and software developers), joint research
2 DIIS (2015) National Innovation and Science Agenda Report, accessed online at https://www.industry.gov.au/data-and-publications/national-innovation-and-science-agenda-report
3 DIIS (2017) Incubator Support Program Guidelines Version – November 2017, accessed online at https://www.business.gov.au/assistance/entrepreneurs-programme/incubator-support-new-and-existing-incubators
4 DIIS (2017) Incubator Support FAQs, accessed online at https://www.business.gov.au/assistance/entrepreneurs-programme/incubator-support-new-and-existing-incubators
arrangements, industry-research collaboration and public procurement of
private sector research outputs.
Framework conditions — the institutional environment and general
conditions for innovation activities, networks and collaboration. These
components collectively function to produce and diffuse innovations that
have economic, social and/or environmental value.5
Objectives and outcomes
The objectives of Incubator Support are to:
assist Australian start-ups to develop the capabilities required to achieve
commercial success in international markets and realise their economic
potential faster than they otherwise would
develop Australia’s innovation ecosystem including in regional areas.6
The intended outcomes of Incubator Support are to:
support new Australian incubators targeting innovative start-ups to assist
them to trade internationally
expand the scale and operations of existing Australian incubators to
increase innovative start-ups’ chances of success in international markets
develop new innovative Australian start-ups with a focus on international
markets
create opportunities for Australian start-ups to develop sustainable
international businesses through access to open public data.7
The initiative seeks to achieve these objectives and outcomes by providing
grant funding to business incubators through two components:
Expert in Residence (EIR) provides business incubators with grants of up
to $100,000 to:
increase the capabilities of incubators and improve the chance of
commercial success for start-ups in international markets by organising
and providing access to top quality research, managerial and technical
talent through incoming and outgoing secondments of national or
international experts.
New and Existing Incubators (NEI) provides new and existing business
incubators with grants of up to $500,000 to:
5 DIIS (2017), Australian Innovation System Report 2017, https://publications.industry.gov.au/publications/australianinnovationsystemreport2017/documents/australian-innovation-system-report-2017.pdf, pp. 7-8.
6 DIIS (2017) Incubator Support Program Guidelines Version – November 2017, accessed online at https://www.business.gov.au/assistance/entrepreneurs-programme/incubator-support-new-and-existing-incubators
7 Ibid.
help develop new incubators in regional areas and/or sectors with high
potential for success in international trade
boost the effectiveness of high performing incubators, including
funding support to expand their services and/or develop the innovation
ecosystem
encourage incubators to work with more data-driven start-ups that use
public data as part of their business.
This evaluation
This report presents the findings and recommendations arising from a post-
commencement evaluation of Incubator Support undertaken by the
department’s Evaluation Unit. The evaluation was undertaken from May to
November 2018.
Authority for evaluation
EP is a Tier One evaluation priority in the department’s Evaluation Plan 2017-
2021. A post-commencement evaluation of EP was conducted in 2016, prior to
Incubator Support being established. The post-commencement evaluation of
Incubator Support was identified as a priority, with a view to gradually aligning
the evaluation stages of all four EP elements.
Evaluation oversight
Oversight of this evaluation was provided by a Reference Group (see table ii),
which endorsed the Terms of Reference (Appendix A), reviewed the findings
and recommendations, and provided feedback on the draft report.
Table ii: Evaluation Reference Group members

Chair: General Manager, Insights and Evaluation Branch, Economic and Analytical Services Division
Member: General Manager, Commercialisation Policy Branch, Science and Commercialisation Policy Division
Member: General Manager, Food, Chemicals and Business Facilitation Branch, Industry Growth Division
Member: General Manager, Entrepreneurs’ Programme – Partnerships and Reform, AusIndustry
Member: General Manager, Entrepreneurs’ Programme – Program Management and Delivery, AusIndustry
Member: General Manager, Grant Advisory and Enabling Services, AusIndustry

Source: Incubator Support post-commencement evaluation Terms of Reference
Evaluation scope and purpose
According to the department’s
Evaluation Strategy 2017–2021, post-
commencement evaluations typically follow a program’s first year of operation
and examine program design and initial implementation. This allows decision-
makers to identify early issues regarding administration and delivery and take
corrective action if needed.
The Terms of Reference outline the evaluation questions, which are grouped
under need, design and implementation. Evaluation questions from the 2016
EP post-commencement evaluation were included for continuity and
consistency.
This evaluation focuses on the time period September 2016 to June 2018. It
focuses on the following overarching questions:
What need is the Incubator Support initiative addressing?
To what extent is the design of Incubator Support evidence-based and
logically consistent?
Were the set up phase and grant delivery process appropriate?
Are governance arrangements effective?
Are mechanisms in place for robust performance assessment of Incubator Support?
Methodology
A mixed-methods approach was used to gather wide-ranging qualitative and
quantitative information about the need, design and implementation of
Incubator Support. Data collection methods and sources are described below
with further details in
Appendix B.
Desktop literature review
A desktop review was conducted of program documents and literature,
including:
background policy documents (such as the November 2015 National
Innovation and Science Agenda and February 2018 ‘EP policy rationale’8 )
program documentation (such as Program Guidelines and Standard
Operating Procedures)
project applications, assessment and reports
literature and reports on entrepreneurship
website information about other relevant national and international
programs
8 The ‘EP Policy Rationale’ is an internal departmental document (unpublished).
Stakeholder interviews
A total of 28 semi-structured interviews were conducted with a range of internal
and external stakeholders between July and August 2018. As post-
commencement evaluations focus on design and initial implementation, the
majority of consultations were with internal stakeholders. Stakeholders
interviewed were from the policy area, Business Grants Hub (BGH), program
management, RIFs, Customer Service Managers, (CSMs), the EP Committee,
and other relevant experts. Representatives of three incubators who responded
to the survey and whose applications had been funded were also interviewed.
While the number of stakeholders that hold a particular view has not been
specifically quantified, qualifiers such as ‘a few’, ‘many’ or ‘the majority’ are
used to indicate the strength of support for a finding.
Further details about the interviews are in
Appendix B. When quoting
interviewees, this evaluation identifies them by the following categories:
Internal stakeholder
External expert
Participant.
Survey
A survey was sent to the 64 Incubator Support applicants (successful and
unsuccessful) who had applied for funding as at 31 May 2018. Survey
questions covered the application and reporting processes, interaction with
Incubator Support officials, and early outcomes. Twenty applicants responded
to the survey, giving a response rate of 31 per cent. Respondents comprised
eight EIR applicants (all successful) and 13 NEI applicants (seven successful
and six unsuccessful). One respondent was both an EIR and NEI applicant.
The survey results and survey questions are in
Appendix D.
Limitations of this evaluation
The interviews and survey are a key part of the evidence base for the
evaluation findings and this should be taken into consideration in interpreting
the findings. The range of stakeholders involved in the consultation process
was arguably weighted towards those who are likely to have an interest in
Incubator Support continuing, so there may be some bias. Given the self-
selection bias inherent in voluntary survey methodology, and the relatively low
response rate, the survey results should be considered to be indicative rather
than statistically representative of the population of incubators that have
applied for grant funding.
Structure of this report
This report is structured around the evaluation findings. The report commences
with findings on need and progresses through to design, implementation and
performance assessment. Recommendations are presented directly after the
finding to which they relate.
1. Need
1.1 There is a justifiable role for the Australian Government to support exporting activity and innovation among Australian businesses
According to the
Australian Innovation System Report 2015, innovation ‘is the
core driver of business competitiveness and productivity’, and creates more
opportunities for new products, industries and markets.9
Exports are a significant contributor to Australia’s prosperity, accounting for
over a quarter of Australia’s increase in GDP over the last 25 years. Innovative
start-ups in particular make up a disproportionate contribution to Australia’s
growth in export sales.10
However, a relatively low proportion of Australia’s income is from innovative
goods and services compared with other OECD countries, and there is a low
level of new-to-market goods and services.11 Australian start-ups also have a
relatively low three-year survival rate and reach smaller sizes on average.12
Australia’s low innovation performance reflects a system failure. The innovation
ecosystem is weakly networked, with a low level of collaboration between
business and research sectors.13 This system failure is also driven by firms’
lack of access to the right skills14 and a risk-averse business culture.15 During
consultations, a few interviewees stated that Australia’s business culture limits
the potential for commercialising innovation:
Australia’s actually a remarkably conservative business culture and it’s highly
risk-averse…we’re very good in invention, very good in innovation, but lousy
at commercialisation, and that’s partly because of conservatism. – External
expert.
In other countries, government policy has been a catalyst for the development
of successful start-up hotspots. For example, US government innovation
programs such as Small Business Innovation Research played a critical role in
the growth of the Silicon Valley through the provision of grants and encouraging
the commercialisation of research.16 Government policies were also
instrumental in promoting a start-up culture in South East Asian countries, such
9 DIIS (2015)
Australian Innovation System Report 2015, p. 1
10 Swanepoel, Tuhin (2016) DIIS Research Paper 7/2016:
Export behaviour and Business
performance: Evidence from Australian Microdata, p.4
11 DIIS (2016)
Australian Innovation System Report 2014, p. 3
12 DIIS (2015)
Australian Innovation System Report 2015, p. 50
13 DIIS (2016)
Australian Innovation System Report 2016, p. 12, 61
14 DIIS (2015)
Australian Innovation System Report 2015, p. 84
15 DIIS (2015)
Australian Innovation System Report 2016, p. 40
16 Keller, M. R., Block, F. (2013) Explaining the transformation in the US innovation system: the
impact of a small government program,
Socio-Economic Review; p. 629–656
as by facilitating innovation networks.17 Note that these particular programs are
not incubator programs. A few interviewees said that Australia should seek to
emulate the success of these countries and all of those consulted agreed that
government should continue to support entrepreneurship through some
means.
Overall, the evidence suggests that there is a justifiable role for the government
to assist innovation and internationally oriented businesses, although it does
not specifically confirm that incubators are the most effective vehicle for that
assistance.
1.2 International evidence on the value-add of government support for incubators is inconclusive
International empirical research about the impact of incubators on start-ups’
performance is mixed. A review of business incubators in the European Union
concluded that incubators had a positive impact on start-up outcomes.18
Another study found that evidence of the impact of incubators was
inconclusive, although the authors indicated that this might have been due in
part to the heterogeneity of models and contexts which reduces the
comparability of evaluation findings.19 A recent study of Australian start-ups
also found that evidence of the impacts of incubators was inconclusive, with
independent start-ups achieving similar results to start-ups supported by an
incubator.20
The majority of interviewees stated that government support for incubators
should be continued but a few disagreed, noting that there are already many
incubators operating in Australia and there is no market gap.
I think there is no shortage of incubators, accelerators and co-working spaces
in Australia. I don’t think that there’s a problem. I don’t think that there’s a
need for any funding. – External expert
Relatedly, a few stakeholders stated that while there is a need for the initiative
in regional Australia, it is not as valuable in metropolitan areas where there is
already a high saturation of incubators.
I suppose where I see the need for the program is more in those regional
areas where maybe an incubator needed a little bit of help in getting started
or needed further guidance in bringing together different stakeholders within
a region. But for the more metro-based incubators and existing ones, I really
don’t think the situation would be any different for them. – Internal stakeholder
17 OECD (2013),
Public Policy for Innovation in Innovation in South East Asia, OECD Publishing
18 European Commission Centre for Strategy and Evaluation Services (2002)
Benchmarking of
Business Incubators, EU Publications
19 Rigby and Ramlogan (2016)
The impact and effectiveness of entrepreneurship policy,
Handbook of Innovation Policy Impact, Edward Elgar Publishing, pp 129-161
20 Bliemel, Flores, de Klerk, Miles, Costa, and Monteiro (2016)
The Role and Performance of
Accelerators in the Australian Startup Ecosystem
Many interviewees stated that the initiative has a valuable role in promoting a
more export-oriented focus among start-ups.
[Without Incubator Support] incubators and accelerators probably would
continue to exist but they wouldn’t necessarily address the specific access to
international market issue, which is especially important in the Australian
context. – Internal stakeholder
Overall, it is difficult to draw conclusions about the value-add of government
support for incubators in the Australian context at this stage. The operation of
the Incubator Support initiative should generate useful evidence in this respect.
2. Design
2.1 All outcomes need to be clear, consistent and integrated into the initiative’s design
The desktop review found a lack of consistency in stated outcomes between
key documents. For example, the stated outcomes for Incubator Support differ
between the November 2017 program guidelines and the 2018 EP policy
rationale.21 Although there are similarities between the sets of outcomes, the
program guidelines, which have been legislated22, have a stronger focus on
supporting innovative start-ups in international markets. The difference may
reflect an attempt to ‘operationalise’ the outcomes, but this is unclear.
Additionally, a few interviewees said that the initiative was developed quickly
following the 2015 NISA announcement, constraining the time available for
design and development of the initiative and possibly affecting the articulation
of outcomes. Another contributing factor may be the timing of various program
changes. For example, one of the stated outcomes in the guidelines is ‘to
create opportunities for Australian start-ups to develop sustainable
international businesses, through access to open public data’. This outcome is
elaborated in a factsheet but is not referred to in the EP policy rationale and
was not mentioned by any interviewees. This may be because the open public
data outcome was added to the Incubator Support Initiative in late 2017,
through the transfer of the DataStart initiative from PM&C to DIIS. In any case,
this evaluation found that the public data outcome is not well-integrated into the
initiative’s design.
Defining what success would look like is key to specifying intended outcomes.
When asked how they would determine the future success of the initiative,
responses from interviewees included: an increased number of incubators, an
increased incubator survival rate, improved start-up networks, increased start-
up survival rate, increased start-up productivity, and improved skills of
participants. Some interviewees said that ‘entrepreneurial education’ and
‘entrepreneurial careers’ were essential features of Incubator Support.
21 DIIS (2018) ‘Entrepreneurs’ Programme Policy Rationale’
22 DIIS (2017) Industry Research and Development (Incubator Support Program) Instrument 2017, accessed online at https://www.legislation.gov.au/Details/F2017L01576
I probably look at IS through the lens of a training ground almost, like a very
hands-on training program for entrepreneurs, because I think more often than
not, the reality is that the actual ventures that people take through these
accelerators or incubators probably aren't going to be successful, but having
gone through that process, those people will start another venture perhaps in
the same industry, perhaps [not] or something else that is a little bit adjacent
and that might be successful. So, it's about encouraging entrepreneurship but
getting people into that mindset and thinking that entrepreneurship is a career
option rather than just going and working at a bank. – Internal stakeholder.
Have we just created this initiative basically to test the waters with potential
start-ups? Or have we created this to really get start-ups to be thinking about
and behave like entrepreneurs in their actual profession, and doing that as a
thing; being entrepreneurial, start-ups, pivoting new business, new market,
ideas. – Internal stakeholder.
Overall, more clarity is needed about the outcomes for Incubator Support in
order to determine whether the initiative’s inputs and activities are appropriate.
The review of the program logic, conducted concurrently with this evaluation,
was one means of achieving this.
2.2 Expected actual outcomes for the regional start-ups may not match the intended outcomes
Changes to the initiative were introduced in late 2016, before its launch, to
better reflect the government’s focus on regional development. Overall,
interviewees held diverse views about the appropriateness of this regional
focus.
Some interviewees said the initiative provides opportunities for regional
Australia, including for diversifying rural economies, engaging and retaining
young people, and boosting employment. A range of interviewees, including
representatives of regional incubators, also said that the reduced co-funding
requirement for regional projects would make the grant more accessible for
regional incubators. However, a minority of internal and external interviewees
said the realities of regional contexts present a challenge for achieving
intended program outcomes. Reasons included that they lack the density of
networks, international connections and diversity of expertise needed to
support successful entrepreneurship.
I would see the main difference [between regional and metro incubators] as
access to quality people and quality deal flow. – External expert
[Regions] may have far lower levels of tertiary education and lower levels of
professional services, a higher percentage of government services
contributing to the economy. So there’s significant structural, demographic
and economic differences for regional centres, and…the chances are the
incubator initiatives are most likely to be driven by a community based
organisation, so not for profits, and may well depend on a particularly
enthusiastic individual volunteering or working part time. – Internal
stakeholder
In view of this, future research should investigate the extent of difference
between incubators and start-ups supported in metropolitan and regional
areas, to predict the extent to which intended outcomes are likely to be realised.
If there are substantial differences, options for future design decisions include
adjusting the initiative’s outcomes for regionally based incubators and start-ups,
or adjusting the way the initiative is implemented in regional areas.
Recommendation 1: Future research should investigate the extent of differences in
outcomes and implementation between metropolitan and regional areas to inform future
program design decisions.
2.3 Incubator Support fits reasonably well within EP
Incubator Support was established in late 2016 and incorporated as an EP
element two years after the EP was established. As described in the program
guidelines, the intended outcomes of EP are to improve business capability,
build effective business, research and commercialisation networks, improve
business and commercialisation performance, and deliver value to participants.
The outcomes of Incubator Support mostly align with these broader EP
outcomes, as currently stated.
The majority of interviewees said that Incubator Support fits well within EP,
while a small number suggested that it would fit better with other start-up
focused initiatives. Among the other elements of EP, interviewees said that
Incubator Support most closely aligns with Accelerating Commercialisation, as
the Business Management and Innovation Connections elements focus on
more mature firms. In relation to design, some interviewees noted that
Incubator Support is the only element to target ultimate beneficiaries (start-ups)
via an intermediary (incubators). Overall, this evaluation found that there is
reasonable logic and support for Incubator Support to remain an element of
EP.
2.4 The department does not have much visibility of which start-ups access each of the incubators’ services
Through incubators, the initiative seeks to reach ‘innovative, adaptive, early
stage and scalable companies, with global potential’.23 However, the
‘intermediary’ design limits the extent to which the department can understand
or influence the types of start-ups accessing funded services, especially where
incubators receive funding from other sources.
As discussed in section 3.4, applications and routine reporting requirements
(progress reports, final reports etc.) provide only limited information about the
specific start-ups that are benefiting from Incubator Support. Because it is
unclear whether the start-ups benefiting from Incubator Support are
appropriate targets for the initiative, it will be difficult to determine whether the
long-term outcomes of the initiative are achievable.
23 DIIS (2018)
Incubator Support Frequently Asked Questions, accessed at
https://www.business.gov.au/assistance/entrepreneurs-programme/incubator-support-new-and-
existing-incubators
The 2002 study of European Union incubators concluded that ‘it is essential
that there is a clearly defined target market and that this is reflected in the
admission criteria’.24 The design of Incubator Support limits the extent to which
the department is able to define and collect information on characteristics of
start-ups accessing the initiative. This was considered during the concurrent
review of the program logic and data matrix, which has produced documents
which should result in improved data collection on the characteristics of start-
ups.
2.5 Future evaluations should consider the time required by new incubators to become self-sustaining
Incubator Support allows new incubators to apply for funding for up to two
years. While incubators may subsequently reapply, future funding availability
and approval are competitive. A few interviewees expressed concern about the
sustainability of this short-term funding approach, while a survey respondent
described it as creating a ‘cliff situation’ for new incubators.
In the European context, the 2002 review
Benchmarking of Business
Incubators found that incubators are more likely to succeed when supported by
a partnership of public and private sponsors. The review recognised public
support as vital in the development phase, but stated that dependence on
public funding should be reduced over time. However, the authors noted that it
can often take several years before a business is able to attract sufficient
private sector funding and/or generate sufficient income to cover operating
costs.25
The new incubators funded under Incubator Support have not yet reached the
two year point. Future evaluations should consider the time required to reach
sustainability. This should inform the design of any future iterations of the
initiative.
2.6 The current design may not be feasible if funding is reduced as is planned
Funding for Incubator Support is currently $23 million over four years until
2019-20, approximately $5.75 million per year on average. Without other
action, based on the current funding profile, in 2020–21 this funding will be
reduced to $2 million per year.
Approximately 35 per cent of current funds was expended over the 21 months
from launch to 30 June 2018, with 65 per cent remaining for the next 24 months
to June 2020. While grant approval and expenditure were initially slow due to
a high proportion of low quality applications, the rate of approvals and
expenditure has increased. In addition, the reduced co-funding requirement for
regional projects is expected to make the grant more accessible for regional
incubators.
24 European Commission Centre for Strategy and Evaluation Services (2002)
Benchmarking of
Business Incubators, EU Publications
25 Ibid.
Interviewees considered current funding to be adequate but said the current
design would not be feasible if annual funding drops to $2 million per year. In
the lead up to this change, policy area interviewees said they are open to
exploring different future designs at different levels of funding.
[If funding drops] the program in its current format would have to be
completely redesigned…is that an opportunity…to take a different approach
to say “We’ve tried that, do we do something different now? Can we try
something different?” – Internal stakeholder
2.7 Staff resourcing arrangements require clarification in order to assess risks to ongoing management
Original program documentation foreshadowed staff resourcing at six Average
Staffing Level (ASL) in the year of establishment and four ASL for the next three
years.
A range of interviewee groups stated that those numbers did not allow
adequate resourcing for program management, particularly following the mid-
2018 AusIndustry restructure. There are currently two dedicated ASL in
Program Management and the EL2 is shared with Accelerating
Commercialisation. Because the current ASL is split across Accelerating
Commercialisation and Incubator Support, it is difficult to assess the adequacy
of resourcing for Incubator Support and any associated management risks.
The program management team was recognised by other interviewees as
playing a critical role in coordinating with stakeholders, liaising with policy,
supporting the EP Committee, supporting RIFs and CSMs and, at times,
engaging directly with participants. Interviewee feedback about the program
management team was very positive, but there is a key-person risk and the
team’s capacity to proactively manage the program is limited.
We’re reacting to things instead of having the time to sit down and actually
think through things and plan for the future. – Internal stakeholder
The staffing level for Incubator Support carries some risks. This particularly
affects the program management team, which is a lynchpin in coordinating the
implementation of the initiative.
3. Implementation
3.1 Outputs are consistent with the initiative’s design and purpose
This section draws on a review of program data and documents and advice
from program management to assess the outputs of the initiative. Although
there are no set targets that the outputs can be compared against, we find that
the number of grants and level of funding has been appropriate given the
funding available.
Number of grants
As at 30 June 2018, the program had received 131 (57 EIR and 74 NEI)
applications of which 82 (35 EIR and 47 NEI) were considered for approval and
53 (35 EIR and 18 NEI) were approved. One EIR project was subsequently
terminated and one NEI project withdrawn, resulting in a total of 51 (34 EIR and
17 NEI) projects being funded under the initiative up to the end of 2017-18 (see
table 3.1a).
All eligible EIR applications were approved, but a significant proportion
(26 per cent) of EIR applications were found to be ineligible. Reasons for
ineligibility varied, including: some applicants did not meet the definition of an
incubator; some proposed activities were not considered appropriate; proposed
experts were not considered appropriate for secondment; and/or it was not
clear that the proposed expert had the skills and abilities needed to deliver the
services outlined.
Of the 47 NEI applications considered for approval, 18 (38 per cent) were
approved, including three of seven regional applications. A significant
proportion of the NEI applications received (28 per cent) were withdrawn before
being considered for approval. The main reasons for withdrawal were an
inability to provide mandatory documentation (e.g. evidence of matched
funding) or a change in circumstances that meant the applicant no longer
wanted, or was able, to proceed with the project. Several withdrawn
applications were re-submitted at a later date, and one application was
withdrawn after approval.
Table 3.1a: Incubator Support applications and approvals

Applications                EIR      NEI      Total
Received                    57       74       131
Ineligible                  15       6        21
Withdrawn                   7        21       28
Considered for approval     35       47       82
Approved                    35       18       53
Not approved                0        29       29
Approval rate*              100%     38%      62%
Funded projects             34**     17***    51

*Approved applications as a proportion of applications considered for approval
**One EIR project terminated by mutual agreement
***One NEI funding offer withdrawn as applicant could not meet the funding conditions
Value of grants
Incubator Support grant funding was originally spread evenly across financial
years, but in mid-2017 $3 million was re-phased from 2016–17 to 2017–18 and
2018–19. As at 30 June 2018, a total of $8.2 million ($1m EIR and $7.2m NEI)
had been awarded to Incubator Support grantees. Thirteen grants with a total
of $2.1 million were approved in 2016-17, while 38 grants with a total of $5.6
million were approved in 2017–18 (Table 3.1b). The program data shows that
the value of grants approved almost tripled from 2016–17 to 2017–18 and the
initiative appears to be tracking in line with the revised grant funding profile.
Table 3.1b: Incubator Support funding (and number of projects) approved by financial year
Financial year    EIR                NEI                Total
2016–17           $161,844 (8)       $1,939,474 (5)     $2,101,318 (13)
2017–18           $852,864 (26)      $5,239,500 (12)    $6,092,364 (38)
Total             $1,014,708 (34)    $7,178,974 (17)    $8,193,682 (51)
Source: Incubator Support program data to 30 June 2018
As shown in table 3.1c, the majority of funding and projects have been in NSW
and Victoria, which have the largest innovation ecosystems in Australia.26 A
significant number of EIR projects have also been funded in Queensland, and
several NEI projects in South Australia.
26 Weisfeld, Z. (2017), ‘The rising success of startups down under: inside Australia’s entrepreneurial ecosystem’, Forbes, accessed online at https://www.forbes.com/sites/groupthink/2017/08/10/the-rising-success-of-startups-down-under-inside-australias-entrepreneurial-ecosystem/#61baf5411cda
Table 3.1c: Incubator Support funding (and number of projects) by component and jurisdiction
Jurisdiction    EIR                NEI                Total
ACT             $25,000 (1)        –                  $25,000 (1)
NSW             $308,493 (9)       $3,214,035 (8)     $3,115,528 (17)
NT              $100,000 (1)       $500,000 (1)       $600,000 (2)
QLD             $222,125 (8)       $268,674 (1)       $465,799 (9)
SA              $20,000 (1)        $1,196,265 (3)     $1,216,265 (4)
VIC             $237,618 (9)       $2,000,000 (4)     $2,211,285 (13)
WA              $101,472 (5)       –                  $101,472 (5)
Total           $1,014,708 (34)    $7,178,974 (17)    $8,193,682 (51)
Source: Incubator Support program data to 30 June 2018
Overall, the outputs delivered are consistent with the design and policy intent
of the initiative.
3.2 The initiative has mostly been implemented as planned, although some changes were made to improve NEI application quality and better adapt to regional circumstances
Applications received for NEI in the early stages of the program were of poor
quality, prompting some changes to the initiative to improve application quality.
This section assesses the impacts of those changes, based on interviews and
a review of program documents and data.
Many interviewees said that the early NEI applications did not match the
expected standards. A review of program documents confirms that there was
a low rate of EP Committee support for applications in 2016–17. Some
interviewees also noted the poorer quality of Incubator Support applications
compared with Accelerating Commercialisation applications, which are also
assessed by the EP Committee but which benefit from intensive support from
advisors. Many interviewees highlighted that the low number and poor quality
of regional applications were an early concern for the initiative. This was
confirmed by a review of program documents.
In response to these issues, the guidelines were revised, supplementary
guidance was developed, a formal feedback process for draft applications was
introduced, the RIF role was created, and the co-funding requirement was
reduced for regional applicants.
All of the interviewees consulted were positive about the changes made, and a
few, including EP Committee members, also said that the changes have led to
some improvements. A review of program data showed that the rate of EP
Committee support for applications overall increased from 20 per cent in 2016–
17 to 61 per cent in 2017–18.
In relation to regional application number and quality, the review showed little
change, with one of three regional applications supported in 2016–17 and two
of four supported in 2017–18. Program management and CSM interviewees
said that the RIFs were likely to contribute to improved regional application
quality. However, at the time of the evaluation consultations, the RIFs had only
recently started and the EP Committee was yet to receive any NEI applications
that had received RIF advice.
While the above changes may have contributed to improved application quality
overall, it was too early to assess the impact of changes on regional
applications.
Figure 3.2: Proportion of applications supported by the EP Committee
[Bar chart: number of considered applications, supported and not supported, at each EP Committee meeting from December 2016 to June 2018.]
3.3 Incubators align moderately well with the intended primary target market
According to program guidelines, Incubator Support targets incubators that
have the potential to foster high-growth start-ups with a focus on international
markets. Interviews also confirm that the incubators supported needed to be of
high quality. The analysis of alignment to target market in this and the following
sections is based on interviews, key program documents, and a review of nine
applications (three EIR and six NEI) and eight reports (four NEI progress
reports and four EIR final reports) selected at random.
An analysis of the program data indicates that the incubators funded represent
a diverse range of models. The majority are private incubators, while a smaller
number of university incubators, economic development incubators and basic
research incubators have also been funded (figure 3.3a).27 The majority
support start-ups of all stages, while a smaller proportion target scale-ups, early
27 Private investment incubators are aimed at developing business activities and attracting
additional financial resources. ‘University’ incubators are interested in development of intellectual
assets. ‘Basic research’ incubators use fundamental research to develop technologies that can be
commercialised later by patents and licensing. ‘Economic development’ incubators promote
entrepreneurship in the area with a focus on industry competitiveness, jobs, etc. [Barbero, J.L.,
Casillas, J.C., Ramos, A., Guitar, S. (2012), ‘Revisiting incubation performance: How incubator
typology affects results’,
Technological Forecasting and Social Change, 79(5) pp. 888-902]
stage ventures or nascent entrepreneurs (figure 3.3b). Almost half have a
generic focus rather than focusing on a particular industry or sector. Almost a
quarter of grant recipients have an information technology focus, and the
remainder represent diverse industries and sectors (figure 3.3c). The majority
of incubators funded provide generic growth services rather than specific
services (figure 3.3d).
Figure 3.3a: Funded incubators by incubator model
[Bar chart: number of incubators (NEI and EIR) by incubator model — Private, University, Basic Research, Economic Development.]
Source: Incubator Support program data
Figure 3.3b: Funded incubators by venture stage they support
[Bar chart: number of incubators (NEI and EIR) by venture stage supported — Start-ups of all stages, Scale-ups, Early stage ventures, Nascent entrepreneurs.]
Source: Incubator Support program data
Figure 3.3c: Funded incubators by industry or sector
[Bar chart: number of incubators (NEI and EIR) by industry or sector — Generic, IT (FinTech, cyber security, artificial…), MedTech and pharmaceuticals, Social impact, Energy and resources, Creative industries, Agribusiness and technology, Advanced and high-tech manufacturing.]
Source: Incubator Support program data
Figure 3.3d: Funded incubators by central service provided
[Bar chart: number of incubators (NEI and EIR) by central service theme — Generic growth services, Research and development, Market validation and development, Intellectual property, Industry specific, Export development.]
Source: Incubator Support program data
The international focus of the initiative is established in key program documents
and was emphasised during interviews. Of the applications reviewed, all
showed that their leadership or incoming experts had some international
experience and the majority made clear linkages between proposed activities
and international markets. A few applications, however, did not specify how
they planned to help start-ups reach international markets.
The review of applications demonstrated a focus on the quality of incubators,
evidenced by the expertise of the people involved. This aligns with
some comments by interviewees about the central importance of the
incubator’s quality. In their applications, incubators emphasised their credibility
through testimonials, detailed expert résumés and addressing of merit criteria.
Assessment commentary by CSMs generally focused on the quality of the
incubator and the credentials of people involved. EP Committee minutes
included some detail indicating a similar focus. While most interviewees who
commented were confident that the incubators supported matched the target
market, one interviewee was less confident about the quality of some
incubators supported.
It was difficult to determine the extent to which incubators were meeting a need
or addressing a gap within a particular region or sector. Many of the
applications reviewed did not provide this information, and this was noted in
some CSMs’ assessments.
Overall, incubators funded appear to align moderately well with the intended
primary target market. However, this could be improved by requiring applicants
to specify how they intend to help start-ups reach international markets, and to
demonstrate that they are meeting a need in a particular region or sector.
Recommendation 2: Require applicants to specify how they will help start-ups reach
international markets, how they are meeting a need in a particular region or sector, and
how they intend to track the start-ups they support.
3.4 It is difficult to determine the extent to which the program is reaching the intended start-ups
Information about start-ups reached through the initiative is provided by some
grantees in documentation submitted as part of formal reporting requirements.
However, there is not enough information available to assess whether the
initiative is reaching the intended start-ups. The analysis in this section is based
on a review of a sample of applications and on stakeholder interviews.
One merit criterion asks about the ‘expected impact and benefits of the project’
but only some of the EIR and NEI applications reviewed included details about
the start-ups intended to benefit. Those that did provide this information
outlined details for only a small sample of start-ups, and the type of information
provided varied.
Where applications and progress reports reviewed did provide information
about start-ups, those start-ups align with the intended beneficiaries of the
initiative. The start-ups profiled were developing innovative, new-to-market IT
and med-tech products that could be scalable to international markets.
It is difficult to determine the extent to which the initiative is reaching the
intended start-ups through the incubators supported. This could be improved
by requiring applicants to describe the type of start-ups they plan to target and
how they intend to target, screen and track them. See further discussion in
section 4.3 on performance assessment.
3.5 Early outcomes for grant recipients are mostly positive
This section is based on survey responses and draws on early stage outcomes
as reported by applicants. At this stage any outcomes reported are prospective
only.
In response to open-ended survey questions about outcomes and impacts,
respondents’ feedback focused mostly on positive intended outcomes. They
said that the grant enabled them to access national and international resources
and connections, provide better experts and mentors, develop regional
relationships, and extend services to regional organisations. One respondent
said that as a result of the expert’s advice on commercialisation pathways,
founders had accessed other funding and investment. Successful NEI
applicants stated that the grant enabled them to expand their support services
for start-ups, including into new regions. One interviewee said that EIR has
attracted significant expertise and will likely have a large impact and positive
spillover effects.
Most EIR survey respondents said there were no negative outcomes or impacts
associated with applying for an EIR grant. Most negative impacts identified by
NEI respondents related to the application process. However, one NEI
respondent said that receiving 50 per cent of their funding up-front half-way
through the financial year created an unnecessary tax burden, and one noted
the unexpected extensive travel required to support and nurture regional
communities.
Overall, a number of positive outcomes were identified during stakeholder
consultations while few negative outcomes or impacts were identified, other
than the workload associated with NEI applications which is discussed in
section 3.7.
3.6 The EIR application, assessment and reporting processes are suitable for participants
The EIR component is based on assessment of eligibility for requests up to
$50,000 inclusive, and eligibility and merit for requests from $50,000 up to
$100,000. The application, assessment and reporting processes are relatively
simple and straightforward, with funding decisions made by the departmental
program delegate. Survey respondents were generally satisfied with EIR and
no major concerns were raised about its implementation. This section is based
on applicant responses to the survey, stakeholder interviews, and the review
of three EIR applications and four EIR final reports selected at random.
EIR application
The eight EIR survey respondents rated the level of application effort as ‘low’
or ‘moderate’ (see figure 3.6a) and were ‘satisfied’ or ‘very satisfied’ with the
application process (see figure 3.6b). Respondents considered the level of
effort required to be appropriate, with one commenting that it achieved a good
balance between effort and accountability. One respondent noted that being
able to extract a copy of the application for their records was an improvement
on the previous year. Improvements suggested by respondents were to pre-
populate the form for applicants already in the system, improve the web
interface, and make the process scalable according to the amount of funding
requested.
Figure 3.6a: EIR applicants’ ratings of the level of effort required
[Bar chart: number of applicants by level of effort — Very low, Low, Moderate, High, Very high.]
Source: Incubator Support survey July 2018
Figure 3.6b: EIR applicants’ satisfaction with the application process
[Bar chart: number of applicants by overall satisfaction — Very dissatisfied, Dissatisfied, Neither satisfied nor dissatisfied, Satisfied, Very satisfied.]
Source: Incubator Support survey July 2018
A review of a sample of four EIR applications confirmed that the application
template appears relatively simple to complete. It requires only basic
information about the incubator, the project budget, key activities and
anticipated outcomes, and details of the proposed expert secondee. All four
applications reviewed were less than twenty pages in length. Many internal
interviewees stated that the guidelines are difficult for applicants to interpret
and need to be more clearly presented (discussed further in 4.7). Overall, the
application process for EIR is appropriate.
EIR assessment
The majority of stakeholders were satisfied with the assessment process for
EIR. One concern raised by a survey respondent was that approvals ‘seemed
to take a while’. One survey respondent said that timing delays in funding
approvals can impact on being able to confirm timing with the expert, resulting
in the loss of a window of opportunity.
EIR reporting
Six survey respondents said they had completed an EIR project report.
Respondents said that reports took a week or less to complete and they were
‘satisfied’ or ‘very satisfied’ with the reporting process. They described the
reporting process as ‘simple and straightforward’ and as achieving ‘a good
balance between information required for the funding requested’. Most said
they did not think any changes were required; however, one suggested using a
‘standard incubator reporting platform’. Of the four final EIR reports reviewed,
all of the applicants indicated that activities and expenditure were conducted in
accordance with their funding agreements.
Overall, the EIR application, assessment and reporting process appear to be
both suitable for participants and working well. The reporting templates may,
however, need to be reviewed following finalisation of the program logic and
data matrix to ensure that data collection adequately supports later evaluation.
See further discussion at section 4.3 below.
3.7 NEI applicants require further guidance on the type of information and level of detail required in applications
Most issues raised about implementation of the initiative related to the NEI
component. This is not surprising given the increased complexity and grant size
of NEI compared with EIR. The NEI application, assessment and reporting
requirements are correspondingly more demanding. This section is based on
applicant responses to the survey, stakeholder interviews, and a review of six
NEI applications and four NEI progress reports selected at random.
NEI application
Respondents were asked to indicate how long it took to develop and submit
their application, and were prompted to record time taken rather than duration
over which the application was developed. The median28 time spent by the 13
NEI respondents on developing their application was 25 days. Successful
applicants invested more time (median 30 days) developing their application
than did unsuccessful applicants (median 23 days). Most respondents,
including the majority of successful applicants and half of unsuccessful
applicants, rated application effort as ‘very high’ (figure 3.7a).
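Footnote 28 explains that the median was preferred because of outliers among the responses, which ranged from five to 390 days. A minimal sketch of why that matters, using hypothetical response values rather than the actual survey data:

```python
# Minimal sketch of why the median was used as the measure of central
# tendency (see footnote 28): a single large outlier pulls the mean up
# sharply but barely affects the median. The values below are hypothetical,
# not the actual survey responses.
from statistics import mean, median

days = [5, 10, 20, 23, 25, 25, 25, 28, 30, 30, 35, 40, 390]  # hypothetical

print(round(mean(days)))  # 53 -- inflated by the 390-day outlier
print(median(days))       # 25 -- a more robust summary of typical effort
```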
Respondents’ level of satisfaction with the application process was mixed.
Even among the seven respondents whose application had been successful,
only three were ‘satisfied’ (figure 3.7b). Respondents described the application
as ‘detailed’ and ‘difficult’, and the process as ‘drawn out’. Issues raised were
that application questions were not intuitive and a large amount of supporting
documentation was required. Respondents suggested that the application be
shortened and made more logical, and that further guidance be provided about
the detail and attachments required. One respondent suggested that applicants
be given the opportunity to present to the EP Committee.
28 The median was used as the measure of central tendency given the range of responses (five to
390 days) and presence of outliers.
Figure 3.7a: NEI applicants’ rating of the level of effort required
[Bar chart: number of applicants (successful and unsuccessful) by level of effort — Very low, Low, Moderate, High, Very high.]
Source: Incubator Support survey July 2018
Figure 3.7b: NEI applicants’ level of satisfaction with the application process
[Bar chart: number of applicants (successful and unsuccessful) by level of satisfaction — Very dissatisfied, Dissatisfied, Neither satisfied nor dissatisfied, Satisfied, Very satisfied.]
Source: Incubator Support survey July 2018
Six interviewees (mainly CSMs and RIFs) said that further changes to the
guidelines were needed to make them easier for users to navigate. CSMs said
that although improvements had been made recently, the application form
could be clearer, more specific about requirements, and include limits on
attachments. The majority of CSMs suggested including examples of eligible
activities to improve clarity, perhaps using fictional incubators.
The review of six successful NEI applications found that applications are
accompanied by a large volume of supporting documentation. Most applications
reviewed were over 100 pages long, with one close to 200 pages, and the
information presented in supporting documentation was not always consistent.
against the merit criteria categories, just links to attachments. While the
document review did not find the amount of text in the application form to be
an issue, this may be because it is resolved through CSM feedback to
applicants prior to final submission. CSMs said that the large volume of
information received as part of each application adds to the time required to
assess an application.
Changes are clearly needed to increase clarity of requirements for applicants
and make extracting relevant information easier for the CSMs and the EP
Committee. The process would benefit from additional guidance to applicants
about information required and how to present this clearly and succinctly.
Recommendation 3: Provide further guidance for applicants to help clarify the type of
information and level of detail required in NEI applications.
NEI assessment
CSMs assess NEI applications against eligibility and conduct preliminary merit
assessments. Program management reviews applications and preliminary
assessments and sends eligible applications to the EP Committee. The EP
Committee assesses applications against merit criteria and makes
recommendations to the Minister. The Minister makes the decision on funding.
Two policy interviewees commented on flexibility being a key feature of the
initiative’s design and assessment process. However, this flexibility appears to
contribute to a lack of clarity when it comes to the assessment of applications.
The majority of CSMs interviewed described the assessment criteria as
‘vague’, and some CSMs described them as less straightforward than those for
Accelerating Commercialisation applications. This adds to the time needed to
complete assessments.
All CSMs interviewed wanted more support and guidance on assessment,
particularly against merit criteria, with one suggesting the scoring system be
made less subjective. However, some other interviewees were concerned that
CSMs did not have the necessary technical knowledge to conduct merit
assessments. A review of program data found reasonable consistency
between CSM and EP Committee assessments where scores were rated low
or high, but less consistency for applications in the middle.
Rather than seeking to improve CSMs’ technical capability to assess merit,
effort would be better spent improving application guidance to reduce the
workload associated with assessment, keeping expectations of preliminary
assessments by CSMs realistic, and relying on the expertise of the EP
Committee for assessments against merit.
NEI reporting
The NEI reporting requirements are relatively simple and require incubators to
validate that funds were used appropriately. All four NEI reports reviewed
indicated that activities are progressing in accordance with expectations,
barring minor external delays, and that expenditure was appropriate. In some
cases, expenditure was lower than originally expected, but was expected to be
carried forward into the next cycle.
The reporting requirements for NEI are straightforward and do not require
extensive detail. The report is intended to check that the funded activities are
progressing according to expectations set out in the agreements. No survey
respondents or interviewees raised any major concerns about the reporting
process. Four of the seven NEI survey respondents had completed a report.
For those four, reporting took between ‘less than a day’ and ‘a week’, and none
reported that they were dissatisfied with the process.
As stated above, the reporting templates may need to be reviewed following
finalisation of the revised program logic and data matrix to ensure that data
collection adequately supports later evaluation.
3.8 Survey respondents are satisfied with the advice and support they have received
EIR and NEI survey and interview respondents were mostly very positive about
their engagement with CSMs and other AusIndustry officers. Some
respondents had also interacted with a RIF and the majority were positive about
their experiences.
Despite frustrations expressed about the NEI application process, and
irrespective of NEI application outcome, both EIR and NEI survey respondents
described the departmental officials they had had contact with as
‘accessible’, ‘helpful’, ‘engaged, informative, collaborative’, ‘professional’, and
‘knowledgeable’. They described the assistance received as ‘prompt’, ‘timely’,
‘useful’ and ‘efficient’. There was little negative feedback, although one
respondent reported a lack of consistency across officials, while another said
feedback was provided too late. The three EIR and NEI participants interviewed
were also very positive about their interaction with CSMs.
Something that was really, really good about it… they weren’t trying to trick
us, or trip us up. They were trying to help us. I think that working relationship
was super positive and actually made it ultimately an enjoyable experience.
– Participant
Only seven survey respondents had had contact with a RIF (the survey was
conducted just two months after the RIFs commenced), and most of them were
happy with the support they had received from RIFs. All five EIR survey
respondents who had contact with a RIF were ‘satisfied’ or ‘very satisfied’ with
the advice they received. Of the two NEI survey respondents who had contact
with a RIF, one was ‘very satisfied’ with advice received, while the other was
‘dissatisfied’.
Overall, in surveys and interviews, it was clear that the interactions between
the program and applicants were generally considered to be positive and
constructive.
3.9 Governance processes are mostly effective, but efficiency could be improved by delegating funding decisions, increasing information sharing, and clarifying roles and responsibilities
The ANAO’s former Better Practice Guide on Public Sector Governance
identified three key focus areas for achieving good governance: performance
orientation; transparency and integrity; and effective collaboration.29 The
analysis of governance in this section is based on internal program
documentation and stakeholder interviews.
The governance of NEI could be made more efficient
The Minister has overall authority for Incubator Support. In relation to the NEI
component, the Minister has directed the Innovation and Science Australia
(ISA) Board to provide merit assessments, and in turn the ISA Board has
delegated this power to the EP Committee, members of which are appointed
by the Minister.30 For EIR, the Minister has delegated decisions on funding to
the program delegate (the General Manager, EP Program Management and
Delivery).
The roles and responsibilities of the EP Committee are formalised in Terms of
Reference. The Committee is required to make recommendations to the
Minister on NEI application merit assessments and provide other advice on
non-financial administration matters relating to Incubator Support (as part of
the EP more broadly).31 Many interviewees described the EP Committee role
positively:
The EP Committee skills and background are probably one of the strengths
of the Incubator Support program…AusIndustry is not really in a position to
make the kind of assessments that the EPC make... coming from the start-up
venture capital, early stage commercialisation background, [EP Committee
members] have skills in that background. – Internal stakeholder
While some interviewees said ministerial approval of NEI grants was
appropriate during the early phase of implementation, the majority said that this
was no longer necessary and could be delegated to the department. While the
Minister may make a decision that is contrary to the EP Committee’s
recommendation, as at 30 June 2018 this had not occurred. Consistency with
Accelerating Commercialisation, for which the EP Committee conducts a merit
assessment and makes a recommendation to the program delegate who
makes the decision, would be desirable. The evidence suggests that NEI
funding decisions should be transferred to the program delegate, as this would
reduce decision times, which were raised as a concern by many interviewees
and a few survey respondents, and reduce workloads. However, given the
stage in the program cycle, the merit of such a change should first be assessed.
Recommendation 4: Investigate the merit of seeking the Minister’s approval for NEI
funding decisions to be transferred to the program delegate, based on the
recommendations of the EP Committee.
The ANAO emphasises the importance of programs being agile and responsive
to shifts in conditions and priorities. The design and delivery of Incubator
29 ANAO (2014) Public Sector Governance: Strengthening performance through good governance. This ANAO Better Practice Guide has since been withdrawn from the ANAO website.
30 IR&D Act 1986; Minister’s Entrepreneurs’ Programme Direction No. 1 of 2016
31 DIIS (2018) ‘Entrepreneurs’ Programme Committee Terms of Reference’
Support has been improved since it was launched, including through additional
support for regional applicants and a formal process for providing feedback to
unsuccessful applicants.
Clear and timely communication between stakeholders would improve the
performance orientation of the initiative by enabling more effective decision-
making. Almost all stakeholder groups interviewed wanted to receive more
information about one or more aspects of the program’s rationale, outputs and
outcomes. EP Committee members said that they would like feedback on
outcomes of funded incubators to help refine future recommendations. Some
CSMs, RIFs and BGH representatives said that they would like feedback about
program outputs and outcomes to help them assess how they were performing.
Some program management interviewees wanted to know more about the
research that informed the program design. A range of interviewees indicated
that they wanted more feedback on the program in general as well as on their
performance.
I haven’t had enough exposure to say which ones are working well or not…we
don’t get any feedback post our involvement at a committee level. – External
expert
As many interviewees indicated that they wanted more feedback about the
program, sharing more information about the design, delivery and outcomes of
the program would likely enhance the initiative’s performance orientation.
Recommendation 5: Share more information about the rationale, outputs, outcomes
and evidence for the initiative’s design with internal stakeholders.
Governance is open and transparent
The Incubator Support initiative ensures transparency and integrity of
processes by adhering to the ISA Board’s Declaration of Interest (DOI)
procedures. CSMs, the EP Committee and the Minister are required to follow
DOI procedures when assessing applications.32 Procedures for managing
conflicts of interest are also set out in the program guidelines, CSM procedures,
and the EP Committee’s Terms of Reference.
No issues about transparency or integrity were raised during interviews. One
interviewee said that processes put in place to manage potential conflicts of
interest in relation to the EP Committee members were working well. This was
supported by a review of EP Committee meeting minutes, which confirmed that
the established DOI processes are routinely followed.
The committee takes the [DOI] processes very seriously…they take a very
hard line on how they deal with conflicts, I think that’s really impressive and
it’s one of the best I’ve seen. – Internal stakeholder
Transparency is ensured through the establishment of robust measures,
including to manage potential conflicts of interest when assessing applications.
32 DIIS (2018) ‘EP Committee Terms of Reference and Assessment Guide’; DIIS (2017) CSM
Standard Operating Procedure
These measures appear to have been implemented effectively and are working
well.
Collaboration is mostly effective, but roles and responsibilities could be
clarified
Interviewees were mostly satisfied that relationships between areas were
collaborative, although a few stakeholders felt that responsibilities were unclear
when coordinating with Business Grants Hub (BGH).
Program management and policy area interviewees said that they
communicate well with each other.
I think at the officer level, there’s a good program and policy relationship. –
Internal stakeholder
Incubator Support was one of the earlier programs to go through BGH and was
described by one interviewee as a ‘test case’. Program management and policy
interviewees said that significant negotiations were needed to align
standardised grant practices with the design intent of Incubator Support, and
that this impacted the setup phase of the initiative. A BGH interviewee also
noted that it took time for some policy decisions on key features of the program
to be resolved.
Interviewees generally agreed that collaboration and coordination had
improved by the time changes were made to incorporate the regional
component. BGH interviewees said this was the result of improvements in
overall coordination, including clarifying roles and responsibilities, formalising
communication processes, and developing clarifying process documentation.
Most interviewees said roles were now more clearly delineated, but some said
this could be further improved.
Overall, collaboration has improved following early challenges and is expected
to continue to improve as BGH matures and processes are further refined.
However, roles and responsibilities, including for overall coordination, may
need to be further clarified and more clearly communicated when changes are
made to the guidelines and associated documents.
Recommendation 6: Clarify and clearly communicate roles and responsibilities,
including for overall coordination, when making changes to the guidelines, application
templates or related documents.
4. Performance assessment
4.1 Data collection would be improved if grantees were fully aware of requirements and templates aligned with agreed data collection needs
Stakeholders across all areas considered data capture and use to be a major
concern in the design and implementation of Incubator Support. Current data
collection is unlikely to be adequate for effective monitoring and evaluation.
A program logic and data matrix (showing indicators and data sources against
evaluation questions) were developed for EP as a whole and for Incubator
Support through the 2016 NISA 1.0 process. However, when the new regional
changes were introduced in December 2017, policy and program areas agreed
that they should be updated. The program logic and data matrix have been
revised in parallel with this evaluation. In the interim, however, the lack of an
updated agreed program logic and data matrix has contributed to the lack of
clarity on outcomes and target markets, and consequently on data collection
requirements.
There has to be a conscious decision, are we only seeking data from the
incubator, or are we seeking data from their participants? If it's the latter, do
we have mechanisms to realise that? – Internal stakeholder
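To make the structure referred to above concrete, a data matrix of this kind can be thought of as rows linking each evaluation question to its indicators and data sources. The sketch below is an illustration only; the entries are hypothetical examples, not the department’s actual matrix.

```python
# Illustrative sketch of a data matrix structure: each row links an
# evaluation question to an indicator and the data source expected to
# supply it. Entries are hypothetical examples only.
data_matrix = [
    {
        "evaluation_question": "Is the initiative reaching the intended start-ups?",
        "indicator": "Characteristics of start-ups supported by funded incubators",
        "data_source": "NEI progress reports / EIR final reports",
    },
    {
        "evaluation_question": "How satisfied are participants with the program?",
        "indicator": "Applicant satisfaction ratings",
        "data_source": "Applicant survey",
    },
]

for row in data_matrix:
    print(f"{row['evaluation_question']} -> {row['indicator']} ({row['data_source']})")
```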
The major sources of data are the application forms and project reports
submitted by applicants. Application forms collect information about the
incubator, planned project activities, budget (for NEI) and evidence of demand,
but not necessarily about start-ups.
Application templates could be refined to further standardise the data collected.
Progress reports for NEI and EIR collect limited information as they seek only
to understand how the funded activities are tracking and if the conditions of
funding have been met. The final report for NEI requires more extensive data,
but while the final report template is provided to grantees at the time of funding
negotiations, grantees may not be fully aware of their data collection
requirements and this will make later reporting difficult. It is important that
grantees are fully aware of all reporting and data collection requirements at the
start of the project.
The updated data matrix developed alongside this evaluation details data
collection requirements for evaluation. Based on the data matrix, there are
presently gaps in the data available about participant satisfaction with the
program, characteristics of start-ups and the outcomes for start-ups. Reporting
templates should be aligned with the agreed data collection needs in the future.
Recommendation 7: Customer Service Managers and Regional Incubator Facilitators should ensure grantees are aware, from the outset, of their reporting and associated data collection requirements, to assure the availability and quality of the data submitted.
Recommendation 8: Revise reporting templates to align with agreed data collection
needs in accordance with the new program logic and data matrix for Incubator Support.
4.2 Indicators need to be reviewed to ensure alignment with program objectives and outcomes
Indicators and measures need to be reviewed to ensure that they are not
potentially perverse or difficult to interpret. An example of a potentially perverse
indicator is the original key performance indicator (KPI) ‘number of incubators
in new regions or sectors’. While many interviewees stated that Incubator
Support should, and does, focus on the quality of incubators, this KPI creates
a potentially perverse incentive to prioritise the quantity of incubators
supported.
An example of an indicator that is difficult to interpret is ‘number of networks’,
a question included in the final report template. One final report indicated that
200 networks had been established, while another said that five new networks
had been established. Such quantitative information is, on its own, difficult to
interpret or compare.
This evaluation therefore recommends that new key performance indicators for
Incubator Support be agreed, based on the new data matrix.33
Recommendation 9: Revise key performance indicators for Incubator Support, based
on the revised data matrix.
4.3 The incubator model creates some challenges for assessing performance
The ultimate beneficiaries of Incubator Support are intended to be start-ups.
Section 3.4 noted that few incubators have provided information so far about
the start-ups accessing their services and benefiting from the initiative.
However, beyond this, it is difficult to source information about start-ups for a
number of reasons.
The ‘incubator as intermediary’ design makes it challenging to collect data on
start-ups supported and difficult to ascertain the validity of information collected
by incubators. Stakeholders noted that it is difficult to source information about
the progress of start-ups after they have ‘graduated’ from the incubator. To
address this issue, one option could be for the department to contact start-ups
directly to collect information about the services they have received.
Interviews with grantees indicated that some of the start-ups reached may not
lie within the target market of the initiative. One respondent mentioned start-
ups they were working with, some of which appear unlikely to have scalable
models or the potential to expand internationally.
Apart from this, interviewees also said that it is difficult to precisely define the
start-up of interest given the mutable nature of entrepreneurship. For example,
entrepreneurs may try establishing several businesses before they find an idea
that works or, alternatively, may overtly intend to establish a business, sell and
move on.
One of the things that I think is worth noting is that it's very difficult to define
a start-up. In some ways, you can use age as a descriptor but often a start-
up will go through a very long germination period where a small group of
founders are working on the idea, testing it, throwing things back and forth,
going down a particular track, realising it's not going to work, changing
33 The revision of the data matrix undertaken in parallel to this evaluation included the specification
of indicators of efficiency and effectiveness, which should inform the determination of key
performance indicators.
direction, going down another track and it may be the same organisational
structure through that. – Internal stakeholder
Overall, there is a need for greater clarity about how the initiative will collect
data from start-ups.
Recommendation 10: Clarify how information will be sourced from start-ups to assess
whether the initiative is having its intended impact on the ultimate beneficiary.
5. Conclusion
This post-commencement evaluation of Incubator Support has found that the
initiative has broadly been well-implemented.
However, a number of issues have been identified.
In particular, stakeholders across all areas identified data capture and
measuring performance to be major concerns in the design and implementation
of the initiative. Going forward, a key priority for Incubator Support will be
ensuring that data relevant to measuring the initiative’s performance is
appropriately captured.
To address this, the evaluation recommends that:
• New performance indicators for Incubator Support be developed.34
• Reporting templates be revised to align with agreed data collection needs in accordance with the revised program logic, data matrix and KPIs for Incubator Support.
• Grantees be made aware of all reporting requirements at the start of the project.
The evaluation also notes that intended outcomes need to be clear, consistent
across key policy and program documents and integrated into the program
design. Future research should examine whether the expected outcomes are
appropriate for regional areas given that start-ups in regional areas are likely
to find it more difficult to succeed.
34 The revision of key performance indicators has been undertaken in parallel with this evaluation.
Appendix A
Incubator Support initiative post-commencement evaluation Terms of Reference
The Department of Industry, Innovation and Science (DIIS) will undertake a
post-commencement evaluation of the Incubator Support (IS) element of the
Entrepreneurs’ Programme (EP). The evaluation will be overseen by the EP
Monitoring Evaluation Reference Group and conducted by the department’s
Evaluation Unit (EU) in the Economic and Analytical Services Division (EASD).
Background
The IS initiative is one of the four elements of EP and aims to improve the
prospect of Australian start-ups achieving commercial success in international
markets. It was announced in December 2015 as part of the National
Innovation and Science Agenda (NISA) and launched in September 2016. The
$23 million initiative supports incubators to deliver a range of services to
Australian start-ups such as seed funding, co-location, mentoring, professional
services and access to networks. It provides funding through two components,
both of which require matched funding from applicants:
1. The New and Existing Incubators component aims to develop new incubators in regions with high potential for success in international trade and boost the performance of existing successful incubators.
2. The Expert in Residence component aims to provide access to top quality research, managerial and technical talent through secondments of expert advisors with a background in successful commercial start-ups.
A post-commencement evaluation of EP was conducted in 2016. The IS
initiative was not included as it was not yet established. The evaluation of IS
has been identified by EP policy and program staff as a priority project to align
the evaluation stages of all four EP elements and prepare for the EP monitoring
evaluation.
Authority for evaluation
EP has been identified as a ‘Tier One’ evaluation priority of high strategic
importance. The department’s
Evaluation Strategy establishes a principle to
undertake a post-commencement evaluation following a program’s first year of
operation. This type of evaluation typically examines the design and initial
implementation of a program. It allows decision-makers to identify early issues
regarding program administration and delivery and take corrective action if
needed.
Evaluation scope and timing
The IS post-commencement evaluation is anticipated to begin in the first week
of May 2018 and to be completed within six months. Evaluation questions will
be structured around three areas of Peter Rossi’s evaluation hierarchy:35
1. The need for the initiative.
2. The initiative’s design and theory.
3. The initiative’s processes and implementation.
The evaluation will include questions from the 2016 EP post-commencement
TOR for continuity and consistency.
Evaluation questions
Need
1. What need is the IS initiative addressing?
1a. What was the need that led to the IS initiative?
1b. How strong is the evidence of the need for government intervention?
Design
2. To what extent is the design of IS evidence based and logically consistent?
2a. Are the eligibility criteria for IS appropriate? Is the target market
suitable?
2b. Is the initiative funded to the right level? Is the resourcing (ASL)
adequate?
2c. Are IS inputs, activities, outputs and outcomes consistent with
addressing the IS policy problem?
Implementation
3. Was the set up phase of IS effective and is the grant delivery process
appropriate?
3a. Are the IS outputs being delivered consistent with the design and
policy intent of the initiative?
3b. What aspects of IS were implemented as planned and what had to be
changed? Why?
3c. Is there evidence of any unintended outcomes, either positive or
negative, for either the program, its staff, or participants?
3d. What are the characteristics of participants and are they in line with
the targeted group? If not, why not?
3e. To what extent are the application and reporting requirements for
participants suitable?
35 Rossi, P., M. Lipsey & H. Freeman, 2004,
Evaluation: A Systematic Approach, SAGE
3f. How satisfied are program participants with their interaction with the
program?
4. Are IS governance arrangements effective?
4a. How well do the IS governance arrangements compare against the
ANAO’s good governance focus areas?36
4b. Are there areas for improvement?
5. Are mechanisms in place for robust performance assessment of IS?
5a. Is the data collected appropriate for the effective monitoring of inputs,
outputs and outcomes of the IS element?
5b. Is the right information available, at the right time and in the right format
to manage the program effectively?
Methodology
The evaluation methodology and the extent to which the above questions can
be explored will depend on the availability and accessibility of data at the time
of review. The evaluation methodology will include document review and
interviews with internal program staff and management. The evaluation may
consult external stakeholders including grant recipients and unsuccessful grant
applicants.
Evaluation resourcing
The EU will be responsible for conducting the evaluation. Time will also be
required from the policy and program areas to provide the relevant data for the
evaluation and take part in stakeholder interviews and other data collection
activities.
Governance
The evaluation’s governance will follow that outlined in the department’s
Evaluation Strategy. The evaluation’s reference group members are:
• General Manager, Insights and Evaluation Branch, Economic and Analytical Services Division (Chair)
• General Manager, Commercialisation Policy Branch, Science and Commercialisation Policy Division
• General Manager, Food, Chemicals & Business Facilitation Branch, Industry Growth Division
• General Manager, Entrepreneurs’ Programme – Partnerships and Reform, AusIndustry Support for Business Division
• General Manager, Entrepreneurs’ Programme – Program Management and Delivery, AusIndustry Support for Business Division
36 ANAO (2014) Public Sector Governance: Strengthening performance through good governance. This ANAO Better Practice Guide has since been withdrawn from the ANAO website.
• General Manager, Grant Advisory and Enabling Services, AusIndustry Support for Business Division
Membership is based on the role rather than the individual. If members are not
available to attend a meeting, they are welcome to send a proxy in their place.
The Reference Group is anticipated to meet for an update about preliminary
findings and to provide feedback about the final report.
Appendix B
Methodology
Approach
A mixed-methods approach incorporating quantitative and qualitative data was
used to inform the findings of this evaluation. Data was collected through
interviews with stakeholders, a survey of Incubator Support applicants, and a
review of documents, literature and program data. Where possible, data
sources were triangulated to establish the strength of evidence for a finding.
Limitations
The interviews and survey are a key component of the evidence base for the
evaluation findings. As the stakeholders consulted were arguably likely to have
an interest in Incubator Support continuing, this may have introduced a
positivity bias. Given the inherent selection bias with voluntary survey
methodology and the relatively low response rate, the survey results should be
considered as indicative rather than statistically representative of the views of
previous Incubator Support applicants.
Interviews
Semi-structured interviews were used to gather wide-ranging, qualitative
information about the
need,
design and
implementation of the initiative.
Twenty-eight semi-structured interviews were conducted either face-to-face or
by telephone with a range of internal and external stakeholders. As post-
commencement evaluations focus on evaluating the program’s design and
initial implementation,37 the majority of those interviewed were internal
stakeholders. See table B1 for a breakdown of interviewees by stakeholder
group.
Questions for each interview were adapted to be relevant to the interviewee’s
position and experience. As interview guides were not standardised, the
number of interviewees that held a particular view could not be quantified. As
such, the qualitative findings included in this post-commencement evaluation
should not be considered statistically representative. We have endeavoured to
ensure the validity and reliability of all information incorporated in this report by
coding and analysing interview responses using the qualitative coding software MAXQDA.
37 DIIS (2019) Evaluation Strategy 2017-2021
Table B1: Stakeholders consulted, by subgroup
Stakeholder type                        Number (and level, where relevant) of people interviewed
External experts
  EP Committee                          2
  Other relevant experts                1
Internal stakeholders
  Policy area                           5 (GM, EL and APS levels)
  Program Management                    8 (GM, EL and APS levels)
  Customer Service Managers             3
  Business Grants Hub                   2 (GM and APS levels)
  Regional Incubator Facilitators       4
Participants
  Participant incubators                3
TOTAL                                   28
Notes: Consultations included stakeholders who were currently or previously involved with Incubator Support
Survey
A structured survey of Incubator Support applicants was used to gather
feedback about the implementation and outcomes of the initiative.
The survey was sent to all of the applicants who had applied for an Incubator
Support grant up to 31 May 2018, including both successful and unsuccessful
applicants. The survey included questions about the application process,
reporting process, contact with department officials, and early outcomes. Out
of the 64 total applicants, 20 responded to the survey. The survey included
open-ended and closed-ended questions.
The summary of survey findings and survey questions is in Appendix D.
Desktop review
We reviewed internal documents detailing the need, design and
implementation of the Incubator Support initiative, as well as early stage
outcomes. We also conducted research to understand the context of the
Incubator Support initiative.
Table B2: Documents referred to, by focus area
Need
• Incubator Support initiative Cabinet documents
• National Innovation and Science Agenda Report
• ‘EP Policy Rationale’
• DIIS reports (e.g. OCE publications)
• External research
Design
• Incubator Support initiative Cabinet documents
• Incubator Support initiative Program Guidelines (original and current)
• ‘EP Policy Rationale’
• Legislative authority document
• External research
• Incubator Support initiative program logic and data matrix (original)
Implementation
• Incubator Support Program Guidelines (original and current)
• CSM Standard Operating Procedures
• EP Committee Terms of Reference
• EP Committee meeting minutes
• Applications
• Reporting templates, progress reports (for NEI) and final reports (for EIR)
• NEI and EIR merit assessments
• ANAO Guidance
• Program database
Performance Assessment
• Incubator Support initiative program logic and data matrix (original)
• Applications
• Progress reports (for NEI) and final reports (for EIR)
Appendix C
Response to Terms of Reference questions
Table C1: Evaluation questions and section of this report where they are addressed

What need is Incubator Support addressing?
• What was the need that led to the Incubator Support initiative? — 1.1, 1.2
• How strong is the evidence of the need for government intervention? — 1.2

To what extent is the design of Incubator Support evidence based and logically consistent?
• Are the eligibility criteria for Incubator Support appropriate? Is the target market suitable? — 2.2, 2.4
• Is the initiative funded to the right level? Is the resourcing (ASL) adequate? — 2.6, 2.7
• Are IS inputs, activities, outputs and outcomes consistent with addressing the IS policy problem? — 2.1, 2.2, 2.5, 2.6

Were the set up phase and grant delivery process appropriate?
• Are the IS outputs being delivered consistent with the design and policy intent of the initiative? — 3.1
• What aspects of IS were implemented as planned and what had to be changed? Why? — 3.2
• Is there evidence of any unintended outcomes, either positive or negative, for either the program, its staff, or participants? — 3.5
• What are the characteristics of participants and are they in line with the targeted group? If not, why not? — 3.3, 3.4
• To what extent are the application and reporting requirements for participants suitable? — 3.4, 3.5, 3.6, 3.7
• How satisfied are program participants with their interaction with the program? — 3.6, 3.7, 3.8

Are governance arrangements effective?
• How well do the Incubator Support governance arrangements compare against the ANAO’s good governance focus areas?38 — 3.9
• Are there areas for improvement? — 3.9

Are mechanisms in place for robust performance assessment of IS?
• Is the data collected appropriate for the effective monitoring of inputs, outputs and outcomes of the IS element? — 4.1, 4.2, 4.3
• Is the right information available, at the right time and in the right format to manage the program effectively? — 4.1, 4.2, 4.3
38
ANAO (2014
) Public Sector Governance: Strengthening performance through good governance.
Appendix D
Analysis of applicant surveys
The purpose of the Incubator Support survey was to seek feedback from
applicants about their interaction with the program. The survey questions are
included in Table D3.
All 64 incubators who had submitted an eligible application by the end of May
2018 were invited to participate. Twenty applicants responded, comprising
eight EIR applicants and 13 NEI applicants, with one applicant having applied
for both EIR and NEI. All eight EIR survey respondents had been successful,39 while seven of the 13 NEI respondents had been successful and six had been unsuccessful.
Results
Application process
Findings about the application process differed markedly between EIR and NEI: EIR respondents were more satisfied with their application process than NEI respondents were with theirs. This is likely because the NEI application process is more demanding.
Expert in Residence
Respondents indicated that they had no significant concerns with the
application process for EIR. They were mostly satisfied and the process did not
take them a long time to complete. The average number of days of effort to
complete the application was four (median six days, range two to 20 days).
Correspondingly, respondents rated the level of effort required as ‘low’ (two
responses) or ‘moderate’ (six). All of the applicants were ‘satisfied’ (five) or ‘very satisfied’ (three) with the application process.
Respondents said the level of effort was ‘appropriate’ and ‘pretty smooth
overall’. One respondent commented that the EIR application process struck
the right balance between effort of applying and accountability, and the ability
to keep the application form for their records was an improvement.
Respondents’ suggested improvements were to pre-populate the form for
incubators already in the system through a prior application, improve the web
interface, and make the process scalable depending on the level of funding
requested.
New and Existing Incubators
NEI applicants who responded to the survey had mixed views about the
application process, but were overall less satisfied than EIR applicants. This
was due to a variety of issues including the time taken to complete the
application, the supplementary documentation required and the time taken to
process the application.
39 EIR applicants who request up to $50,000 are funded, provided their application is eligible.
Incubator Support Initiative Post-commencement Evaluation
35
Page 51 of 59
Released under FOI Act
Document 3
Respondents were asked to indicate how long it took to develop and submit their application, and were prompted to record the time taken rather than the period over which the application was developed. Given the large range in responses (five to 390 days) and the presence of outliers, the median was used as the measure of central tendency. The median time spent by the 13 NEI respondents on developing their applications was 25 days; successful applicants invested more time (median 30 days) than unsuccessful applicants (median 23 days).
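To illustrate why the median rather than the mean was used as the measure of central tendency here, the short sketch below compares the two on a small set of hypothetical response times that includes a single extreme value. The numbers are purely illustrative and are not the survey responses.

# Illustrative sketch only: hypothetical response times in days, not the survey data.
# A single extreme value pulls the mean well above the typical response,
# while the median is largely unaffected.
from statistics import mean, median

hypothetical_days = [5, 10, 20, 25, 30, 40, 390]  # one extreme value (390 days)

print(f"mean:   {mean(hypothetical_days):.1f} days")  # about 74.3, dominated by the outlier
print(f"median: {median(hypothetical_days)} days")    # 25, closer to a typical response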
Respondents rated the level of effort required to complete the application as ‘moderate’ (two), ‘high’ (three) or ‘very high’ (eight), with successful applicants generally rating the effort higher than unsuccessful applicants (Figure D1). Reasons given for these ratings included the effort required to understand what information was needed and to obtain supporting documents.
The level of satisfaction with the application process was mixed, with
respondents ‘dissatisfied’ (three), ‘neither satisfied nor dissatisfied’ (five),
‘satisfied’ (four) or ‘very satisfied’ (one). Successful applicants were generally
more satisfied than unsuccessful applicants, but even among successful
applicants only three out of seven were satisfied.
Some respondents described the application as ‘detailed’, ‘difficult’ and ‘drawn
out’. Comments included that: ‘many incubators may not have the staffing
capacity to complete an application without assistance’; ‘crossing two financial
years’ was problematic; file size restrictions were unnecessary; the ‘application
questions are not user-intuitive’; and ‘engagement with DIIS was long and
drawn out’.
Recommended changes included making the application form shorter and more logical, and providing more advice to applicants, including how much detail to provide in the response space and attachments.
Figure D1: NEI applicants’ ratings of the level of effort required
[Chart: level of effort (very low to very high) versus number of applicants, shown separately for successful and unsuccessful applicants.]
Source: Incubator Support survey July 2018
Figure D2: NEI applicants’ rating of their satisfaction with the application process
[Chart: level of satisfaction (very dissatisfied to very satisfied) versus number of applicants, shown separately for successful and unsuccessful applicants.]
Source: Incubator Support survey July 2018
Reporting process
Most of the EIR applicants and some of the NEI applicants who responded to
the survey had completed a progress or final report. Overall, respondents were
satisfied with the reporting process and did not have any significant concerns.
They indicated that it did not take much time to complete the report.
Expert in Residence
Of the eight respondents, six had completed a report, one had not, and one did
not answer this question. Of the six respondents who had completed a report,
five said it took ‘two to four days’ and one said it took ‘less than a day’. Three
were ‘satisfied’ with the reporting process and three were ‘very satisfied’.
Respondents described the reporting process as simple and straightforward
with sufficient time allowed. The majority of respondents did not think any
changes were required, although one suggested that the department consider
using a standard incubator reporting platform.
New and Existing Incubators
Of the seven successful applicants who responded to the survey, four had
completed a report and three had not. Among the four who had completed a
report, one indicated the report took ’more than a week’ to complete, one took
‘a week’, one took ‘two to four’ days, and one ‘less than a day’. One respondent
was ‘neither satisfied nor dissatisfied’ with the reporting process while three
were ‘satisfied’.
One respondent recommended the process be streamlined in future, while another commented that the department should acknowledge that a proposed project will need adjustment between approval and implementation.
Contact with CSMs and RIFs
Most of the respondents who had contact with AusIndustry staff, such as Customer Service Managers (CSMs) and Regional Incubator Facilitators (RIFs), were happy with their interaction, noting that staff were generally helpful and effective.
Expert in Residence
Of the eight respondents, seven had had contact with a CSM or other
AusIndustry officer, and one did not answer this question. Three respondents
were ‘satisfied’ with the assistance provided, three were ‘very satisfied’, and
one was ‘neither satisfied nor dissatisfied’.
The type of assistance respondents said they required was generally
clarification about eligibility and guidance on application and reporting
requirements. Respondents were positive about the assistance received, which
was described as being ‘timely’, ‘useful’, of ‘quality’, and ‘efficient’. Officials
were reported as being ‘pleasant’, ‘helpful’, and ‘engaged, informative,
collaborative’.
Respondents were positive about their engagement with RIFs. One respondent
commented that the RIF showed high initiative and was very well connected.
Another commented that their face-to-face engagement with the RIF was
excellent.
New and Existing Incubators
Of the 13 respondents, nine reported having had contact with a CSM or other
AusIndustry officer, while four had not, and this breakdown was similar for both
successful and unsuccessful applicants.
Of the nine who had had contact with a CSM, one had contact ‘once’, four had
contact ‘two to five times’, two had contact ‘five to ten times’ and two had
contact ‘more than ten times’. One respondent was ‘dissatisfied’ with the
assistance received, one was ‘neither satisfied nor dissatisfied’, four were
‘satisfied’ and three were ‘very satisfied’.
Almost all respondents were very positive about the assistance received.
Officials were described as being ‘very forthcoming and helpful’, ‘accessible’,
‘professional’, ‘very knowledgeable’ and ‘prompt’ in responding. However, one respondent commented that clarity and consistency were sometimes an issue because they dealt with a number of different people. Another said that feedback was provided too late.
Of the 13 respondents, two reported having contact with a RIF, while eleven
had not. One respondent who had contact with a RIF was ‘dissatisfied’ with the
advice provided and one was ‘very satisfied’.
One respondent sought advice about the information the EP Committee
needed to make a decision and commented that they received good, thorough
advice. The other said the RIF sought advice from them and appeared more
focused on finding new applicants than assisting with implementation.
Impacts and other feedback
Overall, respondents were mostly positive about the impacts of applying for
Incubator Support. Respondents who applied for EIR were generally more
positive than applicants for NEI, who identified some negative aspects
associated with the time taken in the application process.
Expert in Residence
Positive impacts
Survey respondents identified the following positive impacts of applying: access to international and national resources and connections; delivery of an inspiring workshop; better servicing of scale-ups by providing experts/mentors; increased awareness of their new regional incubator through the funded work; and development of regional relationships and extension of services to regional organisations.
Negative impacts
Most survey respondents said there were no negative impacts of applying for
EIR. One respondent stated that delays in funding approvals affected their ability to confirm arrangements with the expert.
New and Existing Incubators
Positive impacts
Successful applicants identified that the grant enabled them: to continue
supporting Australia as a ‘global powerhouse’; to expand their support for start-
ups into new regions and develop a new regional model; to accelerate the
expansion of services; to support incubator establishment and engagement with
academia and industry; to increase recognition; and to run a second program.
Unsuccessful applicants indicated that they subsequently reallocated funds to
another project that would have greater impact, and learnt about the
information required and how to present it. One suggested that one-page feedback would be useful for future applications.
Negative impacts
Successful applicants identified a number of negative impacts. One stated that
the process was ‘long, drawn out, sometimes stressful’ and not developed with
regional incubators in mind. This was consistent with another who identified
that regional communities require more assistance and ‘nurturing’, which
resulted in extensive travel requirements. Another commented that receiving
50 per cent of their funding up-front half-way through the financial year created
an unnecessary tax burden. One stated that developing the proposal and
application created a large workload for staff.
Unsuccessful applicants commented that: ‘the application took too much time’;
they ‘wasted five days of effort’; ‘the process is too bureaucratic’; and they
‘could not proceed with the project without the funding’.
Survey of participants
Table D3: Incubator Support survey
YOUR APPLICATION

During which financial year did your organisation apply for IS?
Type of response: Multiple Choice – Radio buttons
Response options: 2016-17 Financial year; 2017-18 Financial year

Were you closely involved with the application process for your organisation?
Type of response: Multiple Choice – Radio buttons
Response options: Yes; No

Which type of grant did your organisation apply for? Please tick all that apply.
Type of response: Multiple Choice – Check Boxes
Response options: New and existing incubators; Experts-in-residence

Was your application for X successful? If you have applied more than once please record the results of your most recent application. [Conditional – appears based on 3]
Type of response: Multiple Choice – Radio buttons
Response options: Yes; No

APPLICATION PROCESS

Approximately how much time did it take your organisation to develop and submit the application? Please record total time, not the time period over which input was provided.
Type of response: Text box – Single line
Response options: Text box (day/s)

For the amount of funding your organisation requested, how would you describe the level of effort required to complete the application?
Type of response: Multiple Choice – Radio buttons, with single line text box for optional further comments
Response options: Very low; Low; Moderate; High; Very high; Text box

In a few words, what factors influenced your rating?
Type of response: Text box – Multiple lines
Response options: Text box

Overall, how satisfied or dissatisfied were you with the application process? Please let us know how you feel about the process, not the outcome of the process.
Type of response: Multiple Choice – Radio buttons, with single line text box for optional further comments
Response options: Very satisfied; Satisfied; Neither satisfied nor dissatisfied; Dissatisfied; Very dissatisfied; Text box

If you could make a change(s) to improve the application process, what would it be?
Type of response: Text box – Multiple lines
Response options: Text box

REPORTING

Has your organisation submitted any reports; e.g. progress or final reports? [Conditional – appears if ‘yes’ to 4]
Type of response: Options
Response options: Yes; No

Approximately how much time did it take you to complete the progress report? Please record total time input for your organisation, not the time period over which input was provided. If you have submitted more than one, please tell us the average, and provide details. [Conditional – appears if ‘yes’ to 6a]
Type of response: Multiple Choice – Radio buttons, with single line text box for optional further comments
Response options: Less than a day; Two to four days; A week; More than a week (open text for how long)

Overall, how satisfied or dissatisfied are you with the reporting process? [Conditional – appears if ‘yes’ to 6a]
Type of response: Multiple Choice – Radio buttons, with single line text box for optional further comments
Response options: Very satisfied; Satisfied; Neither satisfied nor dissatisfied; Dissatisfied; Very dissatisfied; Text box

In a few words, what factors influenced the rating you gave for 6c? [Conditional – appears if ‘yes’ to 6a]
Type of response: Text box – Multiple lines
Response options: Text box

If you could make a change to improve the reporting process, what would it be? [Conditional – appears if ‘yes’ to 6a]
Type of response: Text box – Multiple lines
Response options: Text box

CUSTOMER SERVICE MANAGERS/AUSINDUSTRY STAFF

Have you contacted, or been contacted by, a Customer Service Manager or other AusIndustry officer in relation to your application and/or reports?
Type of response: Multiple Choice – Radio buttons
Response options: Yes; No; Uncertain

Who did you contact/who were you contacted by?
Type of response: Multiple Choice – Radio buttons
Response options: Customer Service Manager; Other AusIndustry officer; Both; Uncertain

How many times has your organisation contacted, or been contacted by, a Customer Service Manager or AusIndustry officer in relation to your application and/or reports? [Conditional – appears if ‘yes’ to 7a]
Type of response: Multiple Choice – Radio buttons
Response options: Once; Two to five times; Five to ten times; More than ten times; Uncertain

In a sentence or two, what type(s) of assistance did you require? [Conditional – appears if ‘yes’ to 7a]
Type of response: Text box – Multiple lines
Response options: Open text

How satisfied or dissatisfied were you with the assistance provided? [Conditional – appears if ‘yes’ to 7a]
Type of response: Multiple Choice – Radio buttons, with single line text box for optional further comments
Response options: Very satisfied; Satisfied; Neither satisfied nor dissatisfied; Dissatisfied; Very dissatisfied; Text box

In a few words, what factors influenced the rating you gave for 7e? [Conditional – appears if ‘yes’ to 7a]
Type of response: Text box – Multiple lines
Response options: Text box

REGIONAL INCUBATOR FACILITATORS

In May 2018 the program implemented changes to IS guidelines with the introduction of four Regional Incubator Facilitators. Regional Incubator Facilitators are employed to:
• Provide advice and mentoring
• Provide support to develop local and international networks
• Promote joint applications and knowledge sharing between regional and metropolitan incubators
• Provide feedback on draft New and Existing Incubator applications (including metropolitan applications), and feedback to unsuccessful applicants.

Have you contacted, or been contacted by a Regional Incubator Facilitator?
Type of response: Multiple Choice – Radio buttons
Response options: Yes; No

In a sentence or two, what type(s) of advice did you require? [Conditional – appears if ‘yes’ to 8a]
Type of response: Text box – Multiple lines
Response options: Text box

How satisfied or dissatisfied were you with the advice provided by the Regional Incubator Facilitator? [Conditional – appears if ‘yes’ to 8a]
Type of response: Multiple Choice – Radio buttons, with single line text box for optional further comments
Response options: Very satisfied; Satisfied; Neither satisfied nor dissatisfied; Dissatisfied; Very dissatisfied; Text box

In a few words, what factors influenced the rating you gave for 8c? [Conditional – appears if ‘yes’ to 8a]
Type of response: Text box – Multiple lines
Response options: Text box

IMPACTS

Have there been any positive impacts for you/your organisation as a result of applying for an IS grant? Please include anticipated as well as any unanticipated impacts.
Type of response: Text box – Multiple lines
Response options: Text box

Have there been any negative impacts for you/your organisation as a result of applying for an IS grant? Please include anticipated as well as any unanticipated impacts.
Type of response: Text box – Multiple lines
Response options: Text box

FEEDBACK

Do you have any other feedback or ideas that you would like to share?
Type of response: Text box – Multiple lines
Response options: Text box

CONTACT

The evaluation team may contact a sample of applicants to follow up on responses provided to this survey.

Would you be happy for a member of the evaluation team to contact you for a short telephone interview?
Type of response: Multiple Choice – Radio buttons
Response options: Yes; No

Name [Conditional – appears if ‘yes’ to 5i]
Type of response: Text box – Single line
Response options: Text box

Organisation [Conditional – appears if ‘yes’ to 5i]
Type of response: Text box – Single line
Response options: Text box

Preferred contact number
Type of response: Text box – Single line
Response options: Text box