A number of Learning Management Systems (LMSs) exist on the market today. One component of an LMS manages student assessment. For some forms of assessment, such as short-answer questions, the LMS cannot evaluate students' responses automatically, so human intervention is required. This study leverages recent research in Natural Language Processing to provide fair, timely, and accurate assessment of students' short answers based on their semantic meaning. A component-based system comprising a Text Pre-Processing phase and a Word/Synonym Matching phase has been developed to automate the marking process. An evaluation plan is also presented to verify whether such a computerized assessment system is viable in practice and to identify areas in which the system could later be improved.
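The two phases named above can be illustrated with a minimal sketch. The synonym table, stopword list, and scoring function here are illustrative assumptions for demonstration only, not the resources or scoring rules of the system described in this study; a real system would draw synonyms from a lexical resource such as WordNet.

```python
import string

# Hypothetical synonym table (assumption for illustration).
SYNONYMS = {
    "fast": {"quick", "rapid", "speedy"},
    "car": {"automobile", "vehicle"},
}

# Small illustrative stopword list (assumption).
STOPWORDS = {"a", "an", "the", "is", "are", "of", "to", "and"}

def preprocess(text):
    """Text Pre-Processing phase: lowercase, strip punctuation, drop stopwords."""
    text = text.lower().translate(str.maketrans("", "", string.punctuation))
    return [tok for tok in text.split() if tok not in STOPWORDS]

def matches(model_tok, student_tok):
    """Word/Synonym Matching phase: exact match or a hit in the synonym table."""
    return model_tok == student_tok or student_tok in SYNONYMS.get(model_tok, set())

def score(model_answer, student_answer):
    """Fraction of model-answer tokens covered by the student's answer."""
    model = preprocess(model_answer)
    student = preprocess(student_answer)
    if not model:
        return 0.0
    hits = sum(1 for m in model if any(matches(m, s) for s in student))
    return hits / len(model)

print(score("The car is fast", "a rapid automobile"))  # full semantic overlap -> 1.0
```

Here "rapid" and "automobile" match "fast" and "car" through the synonym table, so the student answer receives full marks even though no surface words coincide.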