      COMP3310 - Assignment 2: Indexing a Gopher.
      Background:
      • This assignment is worth 12.5% of the final mark.
      • It is due by 23:55 Friday 26 April AEST (end of Week 8)
      • Late submissions will not be accepted, except in special circumstances.
      o Extensions must be requested as early as possible before the due date, with suitable
      evidence or justification.
      • If you would like feedback on particular aspects of your submission, please note that in the
      README file within your submission.
      This is a coding assignment, to enhance and check your network programming skills. The main focus is on
      native socket programming, and your ability to understand and implement the key elements of an
      application protocol from its RFC specification.
Please note that this is an ongoing experiment for the course, trialling gopher for this assignment. We may
discover some additional challenges as we go that require some adjustments to the assignment activities, or
a swap of server. Any adjustments will be noted via a forum Announcement.
      Assignment 2 outline
An Internet Gopher server was one of the precursors to the web, combining a simple query/response
protocol with a reasonably flexible content server, and a basic model for referencing and describing
resources on different machines. The name comes from the (Americanised) idea to “go-for” some content…
and also the complexity of their interconnected burrows [1].
For this assignment, you need to write your own gopher client in C, Java or Python [2][3], without the use of any
external gopher-related libraries. The client will need to ‘spider’ or ‘crawl’ or ‘index’ a specified server, do
some simple analysis and reporting of what resources are there, as well as detect, report and deal with any
issues with the server or its content.
      Your code MUST open sockets in the standard socket() API way, as per the tutorial exercises. Your code
      MUST make appropriate and correctly-formed gopher requests on its own, and capture/interpret the results
      on its own. You will be handcrafting gopher protocol packets, so you’ll need to understand the structures of
      requests/responses as per the gopher RFC 1436.
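To make that concrete, a single gopher exchange has a very simple shape: the client sends the item's selector followed by CRLF, and the server returns the item's content and closes the connection. Below is a minimal sketch in Python (illustrative only, not a complete client; the function name and timeout value are our own choices):

import socket

def gopher_fetch(host, port, selector):
    """Send one gopher request and return the raw response bytes.

    Per RFC 1436, the request is just the selector followed by CRLF;
    the server sends the item's content and then closes the connection.
    """
    with socket.create_connection((host, port), timeout=10) as sock:
        sock.sendall(selector.encode("ascii") + b"\r\n")
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:                 # server closed the connection: end of item
                break
            chunks.append(data)
    return b"".join(chunks)

# For example, fetch the root menu (empty selector) from a local test server:
# print(gopher_fetch("localhost", 70, "").decode("ascii", errors="replace"))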
      We will provide a gopher server to run against, with a mix of content – text and binary files, across some
      folder structure, along with various pointers to resources.
      In the meantime, you SHOULD install a gopher server on your computer for local access, debugging and
wiresharking. There are a number available; pygopherd is perhaps more recently updated but more
complex, while Motsognir is a bit older but simpler. If you find another good one, please share on the
forum.
[1] https://en.wikipedia.org/wiki/Gopher
[2] As most high-performance networking servers, and kernel networking modules, are written in C with other languages
a distant second, it is worth learning it. But, time is short, and everyone has a different background.
[3] If you want to use another language (outside of C/Java/Python), discuss with your tutor – it has to have native socket
access, and somebody on the tutoring team has to be able to mark it.
      Wireshark will be very helpful for debugging purposes. A common trap is not getting your line-ending right on
      requests, and this is rather OS and language-specific. Remember to be conservative in what you send and
      reasonably liberal in what you accept.
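To illustrate the "liberal in what you accept" point on the response side: a menu (directory) reply is a sequence of lines, each starting with a one-character item type, with the display string, selector, host and port separated by tabs, and a lone "." line marking the end. A parsing sketch in Python follows (how you structure the fields is up to you; this is an assumption, not a required design):

def parse_menu(raw_bytes):
    """Parse a gopher menu into (item_type, display, selector, host, port) tuples.

    Tolerates bare-LF line endings and skips malformed lines rather than crashing.
    """
    items = []
    text = raw_bytes.decode("ascii", errors="replace")
    for line in text.replace("\r\n", "\n").split("\n"):
        if line == "." or line == "":
            continue                     # "." marks end-of-menu; skip it and any blank lines
        item_type = line[0]
        fields = line[1:].split("\t")
        if len(fields) < 4:
            continue                     # malformed line: skip it, but consider logging it
        display, selector, host, port = fields[0], fields[1], fields[2], fields[3]
        items.append((item_type, display, selector, host, port))
    return items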
      What your successful and highly-rated indexing client will need to do:
      1. Connect to the class gopher server, and get the initial response.
      a. Wireshark (just) this initial-response conversation in both directions, from the starting TCP
      connection to its closing, and include that wireshark summary in your README.
b. The class gopher site is not yet fully operational; an announcement will be made when it’s ready.
2. Starting with the initial response, automatically scan through the directories on the server, following links
to any other directories on the same server, and download any text and binary (non-text) files you find.
The downloading allows you to measure the file characteristics. Keep scanning until you run out of
references to visit. Note that there will be items linked more than once, so beware of getting stuck in a
loop (one possible crawl structure is sketched after this list).
3. While running, print to STDOUT:
      a. The timestamp (time of day) of each request, with
      b. The client-request you are sending. This is good for debugging and checking if something gets
      stuck somewhere, especially when dealing with a remote server.
      4. Count, possibly store, and (at the end of the run) print out:
      a. The number of Gopher directories on the server.
      b. The number, and a list of all simple text files (full path)
      c. The number, and a list of all binary (i.e. non-text) files (full path)
      d. The contents of the smallest text file.
      e. The size of the largest text file.
      f. The size of the smallest and the largest binary files.
      g. The number of unique invalid references (those with an “error” type)
      h. A list of external servers (those on a different host and/or port) that were referenced, and
      whether or not they were "up" (i.e. whether they accepted a connection on the specified port).
      i. You should only connect to each external server (host+port combination) once. Don't
      crawl their contents! We only need to know if they're "up" or not.
i. Any references that have “issues/errors” that your code needs to explicitly deal with.
      Requests that return errors, or that had to abort (e.g. due to a timeout, or for any other reason) do not count
      towards the number of (smallest/largest)(text/binary) files.
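One possible shape for the crawl itself, tying the points above together (a sketch only, reusing the hypothetical gopher_fetch and parse_menu helpers from the earlier sketches; the item-type handling, names and data structures are illustrative, not prescriptive, and it does not track every required count): keep a set of already-requested selectors to avoid loops, print a timestamped log line per request, and probe each external host+port at most once without crawling it.

import datetime
import socket

def check_up(host, port):
    """Return True if an external server accepts a TCP connection (never crawl it)."""
    try:
        with socket.create_connection((host, port), timeout=5):
            return True
    except OSError:
        return False

def crawl(start_host, start_port):
    """Breadth-first crawl of one gopher server, avoiding revisits and loops."""
    to_visit = [("1", "")]                 # (item type, selector); "" is the root menu
    visited = set()                        # selectors already requested on this server
    external = {}                          # (host, port) -> did it accept a connection?
    stats = {"dirs": 0, "text": [], "binary": []}

    while to_visit:
        item_type, selector = to_visit.pop(0)
        if selector in visited:
            continue                       # item linked more than once: don't refetch it
        visited.add(selector)

        print(f"{datetime.datetime.now().isoformat()} requesting {selector!r}")
        raw = gopher_fetch(start_host, start_port, selector)

        if item_type == "1":               # a menu/directory: parse it and queue its links
            stats["dirs"] += 1
            for t, _display, sel, host, port in parse_menu(raw):
                if (host, int(port)) != (start_host, start_port):
                    key = (host, int(port))
                    if key not in external:
                        external[key] = check_up(*key)   # external: probe once, never crawl
                elif t not in ("i", "3"):  # skip informational and error lines
                    to_visit.append((t, sel))
        elif item_type == "0":             # a text file: record its path and size
            stats["text"].append((selector, len(raw)))
        else:                              # everything else fetched is treated as binary
            stats["binary"].append((selector, len(raw)))

    return stats, external

# e.g. run it against your local test server:
# stats, external = crawl("localhost", 70)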
You will need to keep an eye on your client while it runs, as some items might be a little challenging if you’re
not careful… Not every server provides perfectly formed replies, delivers them in a timely fashion, or
properly terminates file transfers, for example. Identify any such situations you find on the gopher server in your
README or code comments, and explain how you dealt with each of them – being reasonably liberal in what you
accept and can interpret, or flagging what you cannot accept.
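As one way of dealing with slow, stalled, or never-terminating transfers (a sketch only, under the assumption that a per-request byte and time budget is an acceptable policy; all of the limits here are arbitrary):

import socket
import time

def fetch_with_limits(host, port, selector, timeout=5, max_bytes=1_000_000, max_seconds=30):
    """Fetch one item, but give up on slow, stalled, or runaway responses.

    Returns (data, ok); ok is False when the transfer was aborted, so such
    items can be excluded from the smallest/largest file statistics.
    """
    start = time.monotonic()
    chunks, size = [], 0
    try:
        with socket.create_connection((host, port), timeout=timeout) as sock:
            sock.sendall(selector.encode("ascii") + b"\r\n")
            while True:
                data = sock.recv(4096)
                if not data:
                    return b"".join(chunks), True    # clean end of transfer
                chunks.append(data)
                size += len(data)
                if size > max_bytes:
                    return b"".join(chunks), False   # suspiciously large: abort
                if time.monotonic() - start > max_seconds:
                    return b"".join(chunks), False   # taking too long: abort
    except OSError:                                  # includes connection errors and timeouts
        return b"".join(chunks), False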
      We will test your code against the specified gopher, and check its outputs. If you have any uncertainties
      about how to count some things, you can ask your tutor or in the forum. In general, if you explain in your
      README how you decide to count things and handle edge-cases, that will be fine.
      You can make your crawler's output pretty or add additional information if you'd like, but don't go
      overboard. We need to be able to easily see everything that's listed here.
      Submission and Assessment
      There are a number of existing gopher clients, servers and libraries out there, many of them with source.
      While perhaps educational for you, the assessors know they exist and they will be checking your code against
      them, and against other submissions from this class.
You need to submit your source code, and a README file (text/word/pdf). Please provide any instructions
to run the code, and any additional comments and insights, in the README. Your submission must be a
zip file, packaging everything as needed, and submitted through the appropriate link on wattle.
Your code will be assessed on the following (with available marks shown in [%]):
      1. Output correctness [45%]
      o Does the gopher server correctly respond to all of your queries?
      o Does your code report the right numbers? (within your interpretation, perhaps)
      o Does your code cope well with issues it encounters?
      o Does your code provide the running log of requests as above?
      2. Performance [10%]
      o A great indexer should run as fast as the server allows, and not consume vast amounts of
      memory, nor take a very long time. There won’t be too many resources on the server.
      3. Code “correctness, clarity, and style” [45%]
      o Use of native sockets, writing own gopher requests correctly.
      o Documentation, i.e. comments in the code and the README - how easily can somebody else
      pick this code up and, say, modify it.
      o How easy the code is to run, using a standard desktop environment.
o How neatly it handles edge-cases, where the server may not be responding perfectly.
      During marking your tutor may ask you to explain some particular coding decisions.
Reminder: Wireshark is very helpful to check the behaviour of your code by comparing it against existing gopher
clients (some are preinstalled in Linux distributions, or are easily added). There are also a number of YouTube
videos on gopher that show, for example, how the clients work. Your tutors can help you with advice (direct or
via the forum), as can fellow students. It’s fine to work in groups, but your submission has to be entirely your
own work.
