
Comprehensive Rust v2 (#1073)

I've taken some work by @fw-immunant and others on the new organization
of the course and condensed it into a form amenable to a text editor and
some computational analysis. You can see the inputs in `course.py` but
the interesting bits are the outputs: `outline.md` and `slides.md`.

The idea is to break the course into more, smaller segments, with an exercise
at the end of each segment and breaks in between. So `outline.md` lists the
segments and their durations, and sums those durations per day. It shows
we're about an hour too long right now! There are more details of the
segments in `slides.md`, or you can see mostly the same stuff in
`course.py`.
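
To make the per-day math concrete, here is a minimal sketch of the duration
bookkeeping; the `Segment` type and the numbers are hypothetical stand-ins for
the data that actually lives in `course.py`:

```rust
// Hypothetical segments; the real data lives in course.py.
struct Segment {
    name: &'static str,
    day: u32,
    minutes: u32,
}

fn main() {
    let segments = [
        Segment { name: "Types and Values", day: 1, minutes: 40 },
        Segment { name: "Control Flow", day: 1, minutes: 40 },
        Segment { name: "Pattern Matching", day: 2, minutes: 50 },
    ];

    // Sum segment durations per day, as outline.md does.
    for day in 1..=2 {
        let mut total = 0;
        for s in segments.iter().filter(|s| s.day == day) {
            println!("Day {day}: {} ({} min)", s.name, s.minutes);
            total += s.minutes;
        }
        println!("Day {day} total: {total} minutes");
    }
}
```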

This now contains all of the content from the v1 course, ensuring both
that we've covered everything and that we'll have somewhere to redirect
every page.

Fixes #1082.
Fixes #1465.

---------

Co-authored-by: Nicole LeGare <dlegare.1001@gmail.com>
Co-authored-by: Martin Geisler <mgeisler@google.com>
Commit 6d19292f16 (parent ea204774b6) by Dustin J. Mitchell, 2023-11-29 10:39:24 -05:00, committed by GitHub.
309 changed files with 6807 additions and 4281 deletions.


@@ -41,8 +41,8 @@ impl Philosopher {
     fn eat(&self) {
         // ANCHOR_END: Philosopher-eat
         println!("{} is trying to eat", &self.name);
-        let left = self.left_fork.lock().unwrap();
-        let right = self.right_fork.lock().unwrap();
+        let _left = self.left_fork.lock().unwrap();
+        let _right = self.right_fork.lock().unwrap();
         // ANCHOR: Philosopher-eat-end
         println!("{} is eating...", &self.name);


@@ -13,7 +13,7 @@
 // limitations under the License.
 // ANCHOR: solution
-use std::{sync::Arc, sync::Mutex, sync::mpsc, thread};
+use std::{sync::mpsc, sync::Arc, sync::Mutex, thread};
 // ANCHOR: setup
 use reqwest::{blocking::Client, Url};
@@ -138,7 +138,10 @@ fn control_crawl(
     result_receiver: mpsc::Receiver<CrawlResult>,
 ) -> Vec<Url> {
     let mut crawl_state = CrawlState::new(&start_url);
-    let start_command = CrawlCommand { url: start_url, extract_links: true };
+    let start_command = CrawlCommand {
+        url: start_url,
+        extract_links: true,
+    };
     command_sender.send(start_command).unwrap();
     let mut pending_urls = 1;
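
These hunks are from the link-checker exercise solution: the first reorders
the `use` items, and the second reflows a struct literal across lines. In the
surrounding code, `control_crawl` seeds one `CrawlCommand` into a channel and
then tracks outstanding work in `pending_urls`. A minimal sketch of that
command/result channel pattern, with plain `u32` payloads standing in for
`CrawlCommand` and `CrawlResult` (the worker and loop bodies here are
hypothetical; only the pending counter idea is taken from the diff):

```rust
use std::sync::mpsc;
use std::thread;

fn main() {
    let (command_sender, command_receiver) = mpsc::channel::<u32>();
    let (result_sender, result_receiver) = mpsc::channel::<u32>();

    // Worker: turn each command into one result.
    thread::spawn(move || {
        for command in command_receiver {
            result_sender.send(command * 2).unwrap();
        }
    });

    // Controller: seed one command, then count work in flight, like
    // `pending_urls` in the solution.
    command_sender.send(1).unwrap();
    let mut pending = 1;
    while pending > 0 {
        let result = result_receiver.recv().unwrap();
        pending -= 1;
        // A real crawler would send follow-up commands here and
        // increment `pending` for each one.
        println!("got {result}");
    }
}
```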