The Electronic Journal of Business Research Methods provides perspectives on topics relevant to research in the field of business and management

Journal Article

The Development of an Evaluation Framework for Partnership Working  pp1-10

Maurice Atkinson

© Sep 2005 Volume 3 Issue 1, Editor: Arthur Money, pp1 - 92


Abstract

This paper describes the outcomes of the "Action Planning" stage of an action research project utilising a search conference for the purposes of organisational development. The aim of the project was to design a methodology to facilitate the evaluation of the complexities of working in partnership and to assess the extent to which collaboration actually adds value in terms of both process and outcomes. The research centred on multi-agency partnership working within Children's Services Planning (CSP) in the Southern Health and Social Services Board's area in Northern Ireland. The resulting evaluation framework contains seven interconnected dimensions with associated sub-dimensions and assessment criteria. The framework is underpinned by the concept of a virtuous circle of evaluation, learning, improvement and measurement, leading back to evaluation.

 

Keywords: Evaluation, evaluation framework, partnership working, collaboration, action research, Children's Services Planning, Health and Personal Social Services

 


Journal Article

A Case Study on the Selection and Evaluation of Software for an Internet Organisation  pp57-66

Pieter van Staaden, Sam Lubbe

© Nov 2006 Volume 4 Issue 1, Editor: Arthur Money, pp1 - 66


Abstract

The authors conducted research to determine whether decision-makers (IT managers, IT auditors, users, management, etc.) use a defined evaluation and selection process to acquire software that meets business objectives and the requirements of users. The argument was that the more thorough the software evaluation and selection process, the more likely it is that the organisation will choose software that meets these targets. The main objective of the research was therefore to determine whether Media24 uses such evaluation methods and obtains the desired results. Media24, Africa's biggest publishing group, offers entertainment, information and education 24 hours a day and served as a good example of an Internet service organisation; it made its employees available for this research project. The results confirmed that, during most stages, Media24 correctly follows the software acquisition protocol suggested in the theory.

 

Keywords: Black Box testing, business process, commercial software system, document, evaluation, request for proposal, RFP, requirements, selection, software, vendors

 


Journal Article

A Framework for Mixed Stakeholders and Mixed Methods  pp21-28

Barbara Crump, Keri Logan

© Sep 2008 Volume 6 Issue 1, ECRM 2008, Editor: Ann Brown, pp1 - 94


Abstract

Balancing stakeholder expectations and requirements is frequently a challenge for the ethical researcher contracted to evaluate government-funded community projects. Invariably these projects involve people from diverse backgrounds, each with their own agenda and expectations for the project. This was the scenario for adopting a mixed-method evaluation of Wellington's Smart Newtown community computing project, where free Internet access as well as some computer skills training was made available at the newly-established computing centres. The four-year, multiple-stakeholder evaluation project involved qualitative and quantitative approaches, situated within a five-purpose conceptual framework of triangulation, complementarity, development, initiation and expansion. The framework provided a robust platform that ensured a systematic and thorough approach to both the collection and the analysis of data. In this paper we describe the application of each "purpose" of the framework to the different data sets, which resulted in an objective, impartial evaluation that was subsequently used for deciding the future directions of publicly-funded community computing centres.

 

Keywords: Mixed method, evaluation, community computing, triangulation

 


Journal Article

The Factors that Influence Adoption and Usage Decision in SMEs: Evaluating Interpretive Case Study Research in Information Systems  pp13-24

Japhet Lawrence

© Sep 2010 Volume 8 Issue 1, Editor: Ann Brown, pp1 - 62


Abstract

The conventions for evaluating information systems case studies conducted according to the natural science model of social science are now widely accepted, and such case study research is regarded as a valid research strategy within the Information Systems research community. While these criteria are useful in evaluating case study research conducted according to the natural science model, they are inappropriate for interpretive research. The nature and purpose of interpretive research differ from those of positivist research. Although there are no agreed criteria for evaluating research of this kind, there must nonetheless be some criteria by which the quality of interpretive research can be evaluated. This paper evaluates case study research conducted under the interpretive philosophy and discusses the criteria proposed by Myers (1997) for evaluating interpretive research in information systems.

 

Keywords: Interpretive research, case study, IS evaluation, internet, SME

 


Journal Article

Using Focus Groups in Studies of ISD Team Behaviour  pp119-131

Colm O’hEocha, Kieran Conboy, Xiaofeng Wang

© Dec 2010 Volume 8 Issue 2, ECRM Special Issue Part 1, Editor: Ann Brown, David Douglas, Marian Carcary and Jose Esteves, pp63 - 162


Abstract

This paper discusses an innovative focus group approach used to study an Information Systems Development (ISD) environment. The research had to cope with the application of a broad framework, untested in practice, while seeking to elicit potentially highly sensitive opinions and judgments in a pressurised, time-restricted environment. The researchers' design of the focus groups is discussed, along with an evaluation of the final approach used. The paper concludes with a set of issues for future researchers to consider when designing focus groups for their own studies, together with a set of lessons learned and recommendations arising from the research team's experience in this study.

 

Keywords: focus group, information systems development, evaluation criteria

 


Journal Article

Eating our own Cooking: Toward a More Rigorous Design Science of Research Methods  pp141-153

John Venable, Richard Baskerville

© Dec 2012 Volume 10 Issue 2, ECRM, Editor: Ann Brown, pp53 - 153


Abstract

This paper argues that Design Science is an appropriate paradigm for research into research methods. Research methods (along with their tools and techniques) are purposeful artefacts, designed and created by people to achieve a specific purpose, i.e. to create new, truthful knowledge. Like other artefacts, research methods vary in their fitness for purpose, i.e. in their utility, depending on their fit and appropriate application to the particular purpose, contexts and contingencies for which they were developed. Design Science Research (DSR) aims at developing new purposeful artefacts with evidence of their utility. Applying a DSR perspective to research methods should yield increased utility in the application of research methods, better guidance in applying them and greater confidence in achieving the desired outcomes of applying them. Based on these premises, this paper reviews the basic concerns and issues in Design Science Research (using the balanced scorecard as an example purposeful artefact), then analyses the logical consequences of taking a Design Science perspective on research methods (using the Partial Least Squares approach as an example research method purposeful artefact). First, it analyses the various purposes of research methods to clarify their alternative and competing design goals. Second, it analyses and characterises the types of purposeful (design) artefacts that comprise research methods. Third, it considers issues in the evaluation of research methods. Fourth and finally, it considers the development of design theories of research methods.

 

Keywords: research method, research design, design science research, evaluation, design theory, research rigour

 


Journal Article

Looking at the Past to Enrich the Future: A Reflection on Klein and Myers’ Quality Criteria for Interpretive Research  pp77-88

Ana Cardoso, Isabel Ramos

© Dec 2012 Volume 10 Issue 2, ECRM, Editor: Ann Brown, pp53 - 153


Abstract

In the last two decades, interpretive research has become more established and more popular in the information systems field. The work of Klein and Myers (1999) consists of a set of principles for conducting and evaluating interpretive research, which provide fair and appropriate criteria for assessing the validity and reliability of such studies and, given the number of citations, has had a significant impact on the interpretive research literature. Our article focuses on understanding how this set of principles has informed research articles published in two of the highest-ranked information systems journals; specifically, it asks whether these principles have been translated into common practices when conducting interpretive research in the field of information systems and whether authors incorporate them explicitly when they communicate the results of their research. We reviewed articles published in Management Information Systems Quarterly and Information Systems Research, collected any explicit or implicit evidence of the quality criteria that informed the research, and highlighted direct or indirect references to Klein and Myers' criteria. We summarise and compare our findings in a comprehensive table and note that the principle of the hermeneutic circle and the principle of suspicion appear to be the most explicitly discussed in this sample. Moreover, Klein and Myers' set of principles seems to have had a greater influence on the papers published in the period from 2002 to 2006. This study provides a reflection on methodological rigour in interpretive research that, to our knowledge, has not been undertaken before. The findings presented here may therefore be useful for junior researchers and doctoral students seeking to understand how validity and quality criteria are enacted in high-quality interpretive research and, we hope, may encourage them to build on the exemplary work of the authors we reviewed and thus contribute to enriching the literature on qualitative research methodology in the information systems field.

 

Keywords: interpretive research evaluation, quality and rigor criteria, information systems, Klein & Myers’ set of principles, hermeneutics, phenomenology.

 


Journal Article

Using Sequential Mixed Methods in Enterprise Policy Evaluation: A Pragmatic Design Choice?  pp16-26

Anthony Paul Buckley

© Dec 2015 Volume 13 Issue 1, Mixed Methods, Editor: Ros Cameron, pp1 - 61


Abstract

How might policy instruments contribute to indigenous firm growth, and how can the effects of these instruments be evaluated at both firm and policy level? This paper illustrates how a mixed methods research design and data analysis strategy can pragmatically address these research questions. The advantages and challenges of employing quantitative research methods (what happened?) followed by confirmatory qualitative research methods (how and why did it happen?) in a multiphase sequential explanatory design are explored. The data analysis strategy is firstly to analyse the data generated from a before-and-after quasi-experiment (with statistical controls); data from the confirmatory qualitative techniques (in-depth descriptive case studies) and a cross-case analysis are then added. The proposed research design and analysis approach is applicable to complex research settings where a study is unable, for a variety of reasons, to meet the exacting requirements of a true experimental design, e.g. random assignment, establishment of counterfactuals, valid control groups, etc. This sequential multiphase approach can deliver findings on the relative contribution of the myriad factors influencing a result, showing whether the policy intervention in this study made a contribution to an observed result and in what way. The findings from the Phase 1 quasi-experiment, Phase 2 case studies and Phase 3 cross-case analysis collectively demonstrate that the policy instrument evaluated in this study made, at best, a marginal contribution to individual firm performance. Overall, the state received a negative return on its investment (despite selecting the cohort of firms to invest in). The study concludes that, in the analysis period, the salient factors influencing value creation in the firms (and, conversely, the barriers to firm growth) were internal to the firm.
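
A minimal sketch of the kind of quantitative phase described in this abstract (a before-and-after comparison of firm performance with statistical controls) is given below. The dataset, variable names and model specification are illustrative assumptions only and are not taken from the study itself.

    # Hypothetical sketch: before/after comparison of firm growth with statistical
    # controls, estimated by OLS. All data and variable names are invented.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 200
    firms = pd.DataFrame({
        "treated": rng.integers(0, 2, n),           # 1 = firm received the policy instrument
        "firm_age": rng.integers(1, 30, n),         # control variable
        "turnover_before": rng.normal(500, 100, n), # pre-intervention turnover (control)
    })
    # Simulated post-intervention outcome: growth in turnover.
    firms["turnover_growth"] = (
        0.5 * firms["treated"] + 0.1 * firms["firm_age"] + rng.normal(0, 2, n)
    )

    # The coefficient on "treated" approximates the instrument's contribution,
    # conditional on the included controls (no random assignment is assumed).
    model = smf.ols("turnover_growth ~ treated + firm_age + turnover_before",
                    data=firms).fit()
    print(model.summary())

In a sequential explanatory design of the kind described above, an estimate like this would then be probed in the qualitative phases (case studies and cross-case analysis) to establish how and why any observed effect arose.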

 

Keywords: sequential mixed methods, evaluation, enterprise policy, firm growth

 
